
Generative AI set to make humanity in journalism ‘a premium feature’

2023-08-03. Dealing with new AI tools should start with assessing the human element in your journalism, says Ezra Eeman: “That’s the first question you should take back to your newsroom: What’s the human aspect of our work? How can we amplify that? How can we make that even more visible?”

(From L-R) Moderator Josh Quittner, CEO and co-founder of Decrypt Media Inc., Ezra Eeman, Director of Strategy & Innovation at NPO, Laura Ellis, Head of Technology Forecasting at BBC, Tav Klitgaard, CEO of Zetland, and Agnes Stenbom, Head of IN/LAB at Schibsted.

by Teemu Henriksson teemu.henriksson@wan-ifra.org | August 3, 2023

Amid all the doubts and concerns about generative AI, the key point to remember is that the technology “is not going to go away” and news publishers have to find ways to deal with it, said Laura Ellis, Head of Technology Forecasting at BBC.

“We are intelligent human beings who have solved things before, and we have to make sure that we regulate, that we gather together, that we have conversations … and that we make this work for us,” she said.

“It could be brilliant. It could be awful. It’s our responsibility to make sure it’s brilliant.”

Ellis was one of the panellists in a session called “Generative AI – Media saviour or existential threat?” at WAN-IFRA’s World News Media Congress 2023.

As many have noted, one area where AI tools are already proving beneficial is in reducing the grunt work that takes up a lot of journalists’ time and energy, such as transcribing.

“If we can make that go away, then we can focus our creativity on things that actually matter and that our journalists are trained to do,” said Tav Klitgaard, CEO of Zetland.

Beyond productivity gains, it’s also the creative implications of generative AI that are highly promising, said Ezra Eeman, Director of Strategy & Innovation at NPO. AI tools could help create innovative journalism formats (such as chatbot interactions) and other entirely new use cases that will emerge in the near future.

See also: Our interview with Ezra Eeman: ‘There’s not enough strategic thinking about the impact AI will have on business models’

In terms of formats, the technology could also be used to give users more control over how they prefer to interact with news content, said Agnes Stenbom, Head of IN/LAB at Schibsted. For example, long-form text could be automatically converted into summarised bullet points or audio, giving users the choice of how they want to consume journalism.

This type of flexibility will also help make news content more accessible to more users, especially at a time when there is a widening gap between those who consume high-quality news and those who don’t, she said.

See also: News as music: Schibsted explores AI-based innovation to attract GenZ

‘Human touch’ to become crucial

Given the ease with which AI tools can create text and images, many fear they will provoke an overwhelming influx of automated content that will flood the internet. A Europol report, for example, discussed the possibility that 90% of online content could be synthetically generated by 2026.

In a world of mass AI-generated content, the human aspect of content production could gain significant value.

“I think there’s a big opportunity in standing out from generated content with content made by humans,” said Klitgaard. “It’s going to be a premium feature. That’s the bet we’re making at Zetland.”

Publishers should focus on making sure there is a strong “human touch” to their journalism, he said: “What we bring to the table is fact-checking, basically the good old journalistic deeds, and then we combine that with personal storytelling.”

Still, there will be a place for automated journalism, as successful experiments using AI for sports, financial or weather news have shown.

“There will be news that is perfectly appropriate to be delivered” through automated content, Ellis said. But an AI scraping things off the internet will not be able to replicate human experiences.

“I want to know that if I’m hearing a story told that somebody has been there, that they’ve been hot and tired and scared … When it’s a story that has come from the blood, sweat and tears of a proper human being who sat and watched something in a terrifying, difficult set of situations at times. To me, that is infinitely valuable,” she said.

Eeman also encouraged publishers to approach AI by considering the human element in their journalism: “That’s the first question you should take back to your newsroom: What’s the human aspect of our work? How can we amplify that? How can we make that even more visible?”

Close relationship with readers key

If we are moving towards a situation where the web is flooded with automated content, what will be the impact on the business models built around human-produced journalism?

“I think we’ll see a future where there’s more of two types of content. There will be more automated content … But I also think there will be more human-made content, deeply empathic content that puts human lived experiences front and centre,” Stenbom said.

Some fear a future where cheaply produced AI-powered sites are freely accessible to all, while human-made journalism is locked behind paywalls: “we only give the privileged access to those stories,” Stenbom said.

“I worry what that type of future will lead to in terms of cementing social class and differences in society. I think it’s very important that we don’t leave AI-generated content for the masses with low willingness or ability to pay, but that we try to make sure that this human-made news is available broadly in society,” she said.

To differentiate themselves from the competition posed by automated content, Klitgaard stressed the need for publishers to build stronger relationships with readers by rethinking their product offering to include things like community building and brand trust: “If you want to be a destination platform in the future, you need to be more than the sum of your articles.”

“That means that even if you find a similar version of a Zetland article on some other website, you will want to get it from Zetland because there is more to that product than the sum of the articles,” he said.

Dealing with the dark side of AI

Finally, the panellists discussed the impact automated content will have on mis- and disinformation, given that it will only become easier to generate fake content that looks authentic.

Even people familiar with these new technologies are not immune, said Ellis: “I fell for the Pope in a puffer jacket. I work in this area, and I thought, he looks great!”

She emphasised the urgent need to teach media literacy and critical thinking in schools – “We’re going to have to get better at this as a society” – and said she was “obsessed” with media provenance, which refers to technical solutions that help authenticate online content by providing information about its source.

Media provenance is “not a silver bullet, but it does help if you’re the BBC and you have your content constantly spoofed, with people putting out stuff with your brand on and saying, this is a BBC piece. We have very little defence against that,” she said.

Transparency can also be “a real hedge against some of the disaster” that could come, she added.

“So if we can be transparent about how something was made, where it came from, who shot it … Younger audiences are obsessed with who created stuff, and that’s brilliant.”

Teemu Henriksson

Research Editor

teemu.henriksson@wan-ifra.org
