News

How Schibsted and Mediacorp are deploying AI in the newsroom

2024-02-22. As AI becomes mainstream in everyday work, there are many editorial opportunities for newsrooms to discover. However, incorporating AI into your workflow can seem daunting. How does a newsroom begin this integration?



By Gwenneth Teo and Neha Gupta

“‘Is AI a potential replacement for people in the newsroom? Are you thinking about it as a way to augment your existing work, or is it a little bit of both?’ This is a fundamental ethical question for all newsrooms to ask themselves,” said Fergus Bell, Founder and CEO, Fathm, during WAN-IFRA’s most recent Digital Media Asia conference.

Bell’s primary concern is that the design of an AI product must align with the company’s business strategy. He recommended that a company begin by identifying its pain points and then explore AI tools for potential solutions.

Optimising workflow at Mediacorp with AI

Wang Yin is the Assistant Lead of the AI Strategy and Solutions team of Mediacorp News, Singapore. His team hoped to use AI to reduce the time spent on mundane tasks. 

“We wanted AI to improve efficiency and efficacy, as well as build new capabilities, making it an assistant and not a replacement,” said Yin.

Consequently, Mediacorp implemented AI Weather – software that analyses data and produces forecast clips.

Previously, three production teams worked on each clip, producing 11 forecast clips daily across multiple languages. Now, with AI bots in place, the workflow has been streamlined significantly.

These bots efficiently manage tasks including data analysis, audio and video editing, and quality control. This shift has allowed Mediacorp journalists to transition into roles where they can concentrate on producing more creative content. 

“Short term, it would be simple to replace a pool of content with AI-generated content. However, long term, it may not deliver the same value as human-generated content,” said Bell.

In the Mediacorp newsroom, AI was applied across three categories – workflow automation, content creation, and verification. 

Yin discussed the three critical stages of AI life cycle management: 

  • Design: creating solutions that tackle real-world problems in alignment with business strategy; this is the stage at which to assess the readiness of the AI infrastructure. 
  • Develop: turning blueprints into prototypes. 
  • Deploy: putting solutions into production.

“Newsrooms must continue to fine-tune AI products even after deployment,” Bell said.

Creating an AI framework at Schibsted

Another newsroom that is taking AI seriously is Schibsted, the largest media group in the Nordics. 

In 2019, Schibsted employees collaborated across disciplines to develop the company’s AI strategy. Recognising the importance of upskilling and comprehending AI’s potential and risks, they established the Schibsted AI Academy, which has since seen more than 600 staff registrations.

The FAST (Fairness, Accountability, Sustainability and Transparency) framework was created to address Schibsted’s unique AI-related challenges. It guided the company’s AI development and held staff accountable for applying it.

Read also: The 5 factors driving Schibsted’s digital transformation

In an effort to strategically plan for the future, the company has extended its focus beyond newsrooms, actively involving audiences – both current and potential – in discussions about the integration of AI into the news they depend on.

While it was challenging, “it was rewarding to hear different perspectives of news media and adopt them into existing ideas to create new and refreshing ones,” said Agnes Stenbom, head of IN/LAB at Schibsted, Sweden. 

“Our use of AI will be the best it can be if we also excel at managing the downsides of the technologies. So, we make sure to position AI risk assessment and ethics as something very closely related to product development and success,” she said.

Stenbom recommended that staff be involved throughout the AI process, not just when an AI product is released. AI projects should also be interdisciplinary, to gather different perspectives.

“To build a sense of cultural ownership and community of AI in the company, a spectrum of stakeholders are engaged to ideate. Hackathons and internal AI events are held to build enthusiasm and knowledge, regardless of prior AI expertise. These events have become the source of successful AI product ideations,” she said.

Schibsted is currently using AI for short-form article summaries, synthetic voice clones (text-to-speech) and transcription (speech-to-text).

Read also: News as music: Schibsted explores AI-based innovation to attract GenZ

The language and ethics of AI

Bell stressed the importance of establishing a common AI vocabulary that can be understood across the industry. Failure to do so can spark confusion and jeopardise the relationship between the newsroom and its audience. 

“We cannot afford for AI to be the new way that we lose trust… If you don’t know how a language model works, how can you use it properly? And how can you explain how you’re using it to your audience? Training staff can prove to be a key tool here,” he said, adding, “Newsrooms should not be gatekeeping AI technology; transparency is of utmost importance.”

Incorporating AI into newsrooms also gives rise to ethical questions, as articulated by Bell: 

  • As journalists enter prompts into AI software, who should receive the credit for inventing the prompt? 
  • How can AI software be de-biased? 
  • If AI makes a mistake, like making a racist remark, who is responsible? 
  • Should newsrooms consider the environmental impact of AI?

Bell highlighted three key concerns companies should consider: 

  • Newsrooms need support as they build guidelines and ethical codes for AI usage.
  • A common language for AI needs to be established among journalists. Though time-consuming and labour-intensive, this will ensure access and equity for all. 
  • A structure needs to be created so that everyone – from AI developers to AI users – has a seat at the table.

About the author: Gwenneth Teo is a Communication Studies student at NTU’s Wee Kim Wee School of Information and Communication.
