News

Lessons from newsrooms that have prepped, and are prepping, for ‘AI Elections’

THE Election Year. The Year of AI elections. With over 60 countries heading to the polls this year, the world may look a little – or a lot – different by the end of 2024. How are newsrooms managing coverage and the significant challenge of sorting fact from fiction?

AP Washington Bureau Chief Anna Johnson confers with members of the politics team on Super Tuesday, March 5, 2024, in Washington. (AP Photo/Jon Elswick)

by Lucinda Jordaan lucinda.jordaan@wan-ifra.org | April 5, 2024

‘The approach for the 2024 elections has been clear for us for the past year or more: to double down on our efforts to fight misinformation and disinformation, to promote media literacy, and to educate voters about the electoral processes so they can make informed choices.’
– Shelly Walia, The Quint

The choices voters make this year in the more than 64 countries holding elections will be defining – creating real pressure for journalists dealing with extreme polarisation, censorship and an unprecedented deluge of mis- and disinformation, much of it created and driven by AI.

To assess the scale and the challenges, we spoke to editors from newsrooms in key countries that have gone through, or are preparing for, elections.

Pakistan’s shutdown

On 8 February, a deeply divided Pakistani electorate went to the polls – delayed by a year due to two years of political strife – with the internet and cellular access shut down, ostensibly due to terror threats. This had a destabilising effect on reportage and an impact on voter turnout.

While Dawn, the oldest English newspaper in Pakistan, “had access to some verification tools, the lack of connectivity on election day meant that most of these tools were rendered ineffective,” said Editor in Chief Zaffar Abbas.

“Even ‘human intelligence’ was not available in real-time as the newsroom had little to no contact with the reporters and correspondents in the field, and we could only receive information once a journalist physically arrived at their offices and established contact, using old-fashioned landlines.” 

“The debate and controversies surrounding the alleged pre-poll manipulations, and rigging of final results, is ongoing, and may not go away for a long time,” Abbas said.

 

Mis- and disinformation was rife among almost all stakeholders, says Abbas, “including the major political parties, and various institutions of the government, intelligence services, or their proxies” – particularly on social media, he added.

“Our biggest concern was the misuse of X (formerly Twitter) to spread wrong and misleading information,” he says, pointing to the sale of ‘verified’ accounts that put account credibility into question.

“In this election – more so than previous ones – the X platform became the primary forum for messaging and engagement by political parties and individuals who would not be covered by the mainstream media.”

With limited resources, and recognising the need to deploy more fact-checkers, Dawn partnered with Project iVerify at the Karachi-based Centre for Excellence in Journalism (CEJ).

“This allowed the editors to send tips/leads to the iVerify team, who were trained in fact-checking tools and software. We were able to republish their output, modify it according to our style, and use the content on our social media accounts. Through iVerify, if needed, the editors could get in touch with social media giants Meta and TikTok,” adds Abbas, who believes that, “despite limited resources, we were able to provide our readers with pretty authentic and verified accounts of the election-related developments.”

Lessons learnt – the need for mechanisms and tools when the internet is down

Abbas reckons the biggest lesson learnt is “the need for developing mechanisms and tools to gather information when the internet is shut down in the country.”

And: “Now, we are also seriously considering having a dedicated team to check the flow of ‘mis/disinformation’ and ‘fake news,’ and hopefully will have a better system in place soon,” he adds.

SEE: Pakistan’s surprising and marred 2024 election, and what comes next

India: on high alert to #FightAgainstFake

Shelly Walia, Executive Editor of The Quint, has no illusions as to what the country is facing, with polls weeks away, and says The Quint has taken on an all-encompassing #FightAgainstFake campaign, with fact-checking as their “de facto mission.”

With an electorate of nearly a billion people, the world’s most populous democracy faces a multi-phase election that will last 44 days (from 19 April to 1 June, with results on 4 June).

“This is only the second Indian general election that The Quint newsroom will cover in its nine years of existence. But the approach for the 2024 elections has been clear for us for the past year or more:  to double down on our efforts to fight misinformation and disinformation, to promote media literacy, and to educate voters about the electoral processes so they can make informed choices,” confirms Walia.


Deepfake distortions

According to the World Economic Forum’s 2024 Global Risks Report, India tops the list of nations most vulnerable to disinformation. And Walia has a dedicated plan of action to counter this, spurred, in part, by events in Pakistan.

“Imran Khan’s voice was cloned during Pakistan’s elections, with his face superimposed onto an existing video. Such manipulated content, which can sway public opinion and influence voter sentiment, is bound to find its way into India, too,” reckons Walia – adding that AI deepfakes are becoming increasingly visible on social media feeds.

“What’s even more dangerous is the use of similar, synthetic content by political parties for campaigns. A lot of it might not be ‘malicious’ on the face of it – but they are being created to shape people’s perceptions.”

Watch: Artificial Intelligence and deepfakes take over Pakistan elections

Webqoof, The Quint’s fact-checking team, certified by the International Fact-Checking Network (IFCN), has been around since 2018 – and The Quint has been preparing for the 2024 elections since mid-2023.

They have a series of immersive, multimedia How-To-Debunk guides in the works, which will cover Gen AI imagery, deepfakes, deepfake audio, and Gen AI text, notes Walia.  

“We are engaging in new fact-checking technologies to counter deepfakes, as well as working on investigative stories to expose the modus operandi of bad actors. Our recent investigations also revealed a nexus among Instagram pages that are amplifying fake videos,” she adds. 

See also: Ghost in the machine: Deepfake tools warp India election

India has a volatile election history, notes Walia.

‘In India, misleading information has led to large-scale violence, lynchings, and murders – and a learning for us from the past years is that ground reporting is the antidote to lies peddled on social media and other online platforms.’

The Quint is not only scaling up election coverage with ground reports and analyses, and maximising comprehensive coverage of issues, it is also planning a blitz of investigative stories, “exposing the bad actors working behind the scenes,” adds Walia.

They’re also actively debunking disinformation on closed messaging apps (notably WhatsApp and Telegram), and social media platforms.

Significantly, they’re also bridging the language gap by translating fact-check stories into various local languages, alongside English and Hindi.

“We are already translating fact-checks using our in-house AI translation service called SAGE – and have begun testing the tool by publishing fact-checks in Odia and Marathi,” explains Walia, adding that they will be collaborating with language experts “to maintain linguistic accuracy and cultural relevance, as well as forging partnerships with local media outlets to amplify our reach.”

The US: back to basics

In the US, where disinformation, fake news and distrust in the media were features of the 2020 elections – and where disbelief in the results led to the violent insurrection of January 6 – the Associated Press (AP) is doubling down on explanatory reporting “across formats, including video” to counter disinformation, explains Anna Johnson, AP’s Washington bureau chief.

“Democracy is really complicated in the United States, which is partly why disinformation has been effective. The AP is diving deep into explaining how elections work across the country – and doing so across formats and platforms in a variety of ways, including text stories, immersive digital storytelling, vertical videos for social and more in an effort to reach as many people as possible with factual information about how elections work.”

Anna Johnson, photographed on Super Tuesday, 5 March 2024, in the Washington bureau of The Associated Press. (AP Photo/Jon Elswick)

They’re also expanding efforts around explaining how the AP declares winners in races and what goes into the race calls they make. “To combat the misinformation that flourishes around race calls, we must be transparent and explain how the AP has determined one candidate the winner of a race over another,” she says.

Since the 2020 election, the AP has beefed up explanatory reporting efforts, and launched other key initiatives, “including having a dedicated team that covers threats to democracy in the United States,” adds Johnson.

“We know we need to double down on our efforts to clearly explain how elections work and debunk misinformation around voting. That team is focused on several key coverage areas, including challenges to the process of elections; misinformation at the local, national and global level, and its impact; threats to voting rights; and the deepening political polarisation across the country,” she says.

Fact-checking is key to this, adds Johnson: “The AP is committed to fact checking and debunking misinformation at every level of our journalism. We fact check throughout our reporting and across formats, while also doing separate fact check stories that aim to reach audiences where they are, including on digital platforms. We also work to show the impact that misinformation and conspiracy theories have on people and their communities.”

See: How to Identify and Investigate AI Audio Deepfakes, a Major 2024 Election Threat

South Africa: on standby for sudden surprises

In South Africa, the South African National Editors’ Forum (SANEF) this week expressed outrage at “a lack of accountability and commitment by the interlocutors to serious electoral action to protect journalists online, limit hate speech, and promote authoritative information.”

This, after several attempts to engage with the top four Big Tech companies and Parliament’s Portfolio Committee on Communications “to discuss joint action to combat disinformation and hate speech during the upcoming elections.” SANEF says it is being ghosted by the top four social media companies – despite the very real threat posed to election integrity.

Adriaan Basson, Editor in Chief of News24 (see more in our latest EDITOR TO EDITOR interview), believes political parties are the ones to watch. “I think the more insidious threat, at this point, are these political parties, specifically some of the new political parties, who make false or potentially inflammatory statements. 

“The difficult question is always how to cover this, because you can’t ignore them, but you also don’t want to give them a platform to spread misinformation. We try not to give anyone an unfettered megaphone, but to call out misinformation or disinformation, fact-check it, or just not publish it if it can’t be verified.”

News24 has various election projects on the go: “One of the first tools we built was our Manifesto Meter, in which we basically distil the big issues in each party’s manifesto to give our readers the ability to look at specific issues,” explains Basson.

“We also have a team on a rural roadshow, specifically to towns that generally have little or no media coverage outside of election season. We are also working with our engineering team to build election maps, mostly linked to the results. And we have a whole lot of reporters in the field.”

News24 has also entered into a new collaboration with the Atlantic Council’s Digital Forensic Research Lab (DFRLab): “They are very good at actively or pre-emptively identifying misinformation agents, or bots, on social media. So we’ve asked them to identify for us some of these misinformation networks, and we will also then feed our findings to them,” explains Basson.

Taiwan: A case study in fact checking elections

The award-winning non-profit Taiwan Fact Check Centre (TFCC) is the first organisation of its kind in the Chinese-speaking world to become an IFCN signatory.

“With fewer than 10 journalists, we produce around 60 articles per month, including fact-check reports, dis/misinformation trend analyses, and interpretations. We also publish bi-weekly newsletters, one Chinese version and one English version, that wrap up our recent work,” says Eve Chiu, Editor-in-Chief and CEO.

TFCC collaborates with Google, Facebook, Line and Yahoo, and has won the Best Correction/Impact Award at Global Fact 7 and the Best Media Literacy prize of the Shi-Bai Teng Award 2021.

Here’s how they did it: with an early start

Spurred by an increase in sophisticated social media mis- and disinformation efforts during the last presidential election, TFCC got to work well ahead of the 13 January elections.

From November 2023, they ran a candidates’ public-policy fact-checking project, checking candidates’ statements throughout the campaign.

They followed this up with semi-real-time fact-checking of candidates during the TV debate on 30 December.

They also invited journalists on the frontline of daily news to attend training workshops on fact-checking know-how, says Chiu. 

“Since rumours jump from country to country and across languages, similar hoaxes spread globally – so cross-border collaborations make fact-checking more effective, and fact-checkers more confident, when an international event happens.”

Tools of the trade

“My team uses OSINT (open-source intelligence) techniques and tools, like Google search-related functions, to do fact-checking,” reveals Chiu.

They also get regular support from Taiwan’s computer science community, “who are outstanding in digital instrument innovation – including an AI-made content detector that can identify AI-generated videos or photos, to some degree.”

Traditional journalism, says Chiu, is still the most important tool, especially when making a frontline decision.

“For example, shortly before the election, we couldn’t verify a TikTok post which claimed that the United States supported the DPP’s candidates William Lai and Bi-Khim Hsiao, but our journalistic judgement alerted us that it couldn’t be true – mainstream media would have covered such a major geopolitical claim if it was real,” she says.

Connecting the dots – at source

TFCC also had to debunk widespread rumours of massive election fraud that spread on social media after the elections.

“We contacted voting station workers, local election officials and ballot monitors representing different parties, and found no evidence for those rumours,” notes Chiu.

TFCC’s public engagement efforts paid off: website traffic peaked on 14 January, the day after the election, and Chiu believes TFCC’s efforts “prevented a crisis of distrust in Taiwan’s democracy.” 

See also: Fact-checking’s impact on elections: A case study from Portugal
