AI promises to have a transformative effect on journalism: it may disrupt the media landscape while opening new avenues for experimentation in the newsroom.
It’s critical that journalists proactively identify how they can harness this technology to enhance their reporting and the media industry as a whole.
“In the last 25 years, many technological changes came to the journalism world [and] we missed the opportunity and revenue in the newsroom,” said Ali Tehrani, co-founder of NewsWyze, during an ICFJ Global Crisis Reporting Forum session. “[But] you cannot use off-the-shelf tools for quality journalism. You have to look at your needs and capacities, and create your own model based on what you are looking for.”
Tehrani was joined by Patrick Boehler, acting head of innovation and audience engagement at Radio Free Europe/Radio Liberty (RFE/RL), a U.S.-funded media outlet providing news coverage in 23 countries and in 27 languages across Europe and Asia. “There is a huge need for information as free press is under threat more and more across many of the countries we serve. [AI] gives us the opportunity to simplify our processes,” added Boehler.
Here’s what Tehrani and Boehler had to say about the promises and risks of generative AI in journalism:
A moment of opportunity
In 2018, NewsWyze and RFE/RL collaborated on an investigative project that involved millions of pages of documents in languages other than English that were difficult to analyze. In response, the two organizations developed an AI tool called GIST, which generates easy-to-read summaries of articles and audio content.
GIST was conceived in the pre-ChatGPT era, before AI had become the hot button topic it is today. “At the time, it was very difficult to get journalists to think about AI,” Tehrani recalled.
Falling under the umbrella of AI, natural language processing tools like GIST aim to enable computers to process and generate human language. They shouldn’t be considered a substitute for human journalists, Tehrani noted. “While AI-assisted reporting is doable, AI reporting is not,” he said. “Human supervision cannot be removed from the journalistic process. What we appreciated from [RFE/RL] is that journalistic values are embedded in working with them, so it is something we also realized is important.”
AI models are only as good as their training, Tehrani further stressed. Having human journalists feed GIST high-quality examples of the desired summaries improved the tool.
“It is important to ask how these AI models are generated, what data was put into them and what you want to get out of them. Whatever tools you create, you need designers, engineers and definitely journalists to make decisions,” said Tehrani.
Developing an AI tool takes time. GIST went through 13 iterations after being trained on 400 articles in English. The tool’s Russian and Spanish versions remain in development, and are being trained on more articles by the day. The GIST team also developed an audio-to-text summary function in English and Persian, but this is not yet public.
A moment of risk
For all its potential to drive positive innovation, AI also poses risks, cautioned Boehler. Misuse of generative AI, especially in newsroom processes, is a top-of-mind concern.
“A key question we have been asking ourselves is: is this [tool] an AI provider [or] is it a service using AI? And do we risk putting our trustworthiness at stake?” said Boehler.
RFE/RL has developed policies to address these concerns. For instance, the newsroom doesn’t allow AI-generated images to be used in its reporting. The outlet is also developing tools to better detect signs of authenticity in video content.
“Disinformation that is very articulate and sophisticated is rampant, and the cost of creating [it] has gone down significantly,” said Boehler. “Especially when you work in heavily polluted media ecosystems, you want to make sure you are a source of trusted news and information.”
Seizing the opportunity
Journalists must begin thinking of applications for AI in their newsroom now, Boehler urged. This includes taking into account the ethics behind the technology. “It’s really important that if you are in the newsroom, these [technological] developments are going to change the way you work and the way audiences consume content,” he said. “You have to have these conversations and reflect on ethical considerations, [too].”
Journalists should find ways to leverage AI to drive positive change, Tehrani added: “Journalists need to get individually and collectively involved in creating these AI tools. If we ignore these changes in technology, other people will do it. Newsrooms must train models in their own environment to benefit from them.”