Embracing AI: How to Stay Ahead of the Curve

By: Aurora Martinez | 03/18/2024

Generative artificial intelligence (GenAI) may fundamentally change how people work. 

In some ways, it already has – and the technology is rapidly evolving. Take, for instance, the emergence of multimodal AI models, which can combine images, text and speech to generate new content.

Notwithstanding concerns around GenAI – among them its potential to spread mis- and disinformation, and fears that it could replace journalism jobs – media leaders should consider how to use the technology to elevate their work. If used responsibly and ethically, AI can improve how news is produced and consumed.

“The future of AI — really we're just getting started on what is possible,” said Nikita Roy, an ICFJ Knight fellow and host of Newsroom Robots, a podcast featuring conversations with leaders in AI and journalism. “Things are getting exponentially better almost every week at this point in time.”

By approaching AI with an open mind and dedication to understanding its uses, journalists can better share valuable information with audiences and remain competitive in the ever-evolving media landscape, Roy said. 

The following are tips for journalists to stay ahead of the AI curve:
 

Harnessing the benefits of AI

AI is not here to replace the role journalists play in their communities, Roy said. Instead, reporters can leverage the technology to optimize productivity, and make their news coverage more profound, engaging and dynamic.

“How can [AI] support the journalists in your newsroom to produce more interesting work? How can you create new news experiences with AI? How can you reimagine news with AI?” These are all questions journalists should be asking themselves now, Roy advised.

AI can help journalists carry out projects that wouldn’t have been possible otherwise. This can be especially helpful in smaller newsrooms with less capacity. For example, journalists can use natural language prompts in ChatGPT (with GPT-4) to analyze large data sets, visualize information and identify trends, all of which could lead to story ideas.
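For newsrooms with a bit of technical capacity, the same idea can also be scripted. The snippet below is a minimal sketch, not a prescribed workflow: it assumes the OpenAI Python SDK, an API key set in the environment, a GPT-4-class model (here "gpt-4o") and a hypothetical "city_budget.csv" file with Year, Department and Spending columns. It condenses the data and asks the model to flag trends a reporter might look into.

```python
# Minimal sketch: asking a GPT-4-class model to flag trends in a data set.
# Assumptions: the `openai` and `pandas` packages are installed, an OpenAI API
# key is set in the environment, and "city_budget.csv" is a hypothetical file
# with Year, Department and Spending columns.
import pandas as pd
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Load and condense the data so the prompt stays small.
df = pd.read_csv("city_budget.csv")
summary = df.groupby(["Year", "Department"])["Spending"].sum().to_string()

prompt = (
    "You are helping a local news reporter. Here is total city spending "
    "by year and department:\n\n"
    f"{summary}\n\n"
    "List three notable trends or anomalies that could be worth a story, "
    "and suggest what follow-up records the reporter should request."
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name; use whichever GPT-4-class model is available
    messages=[{"role": "user", "content": prompt}],
)

# The output is a starting point for reporting, not a publishable finding.
print(response.choices[0].message.content)
```

Anything the model surfaces this way should be treated as a lead to verify through reporting, not as a finding ready to publish.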

Future innovations could spur even more benefits. In February, OpenAI revealed Sora, a new tool that promises to create video from text in a matter of seconds, though it is not yet available to the public. Sora could assist visual storytelling, but like other GenAI tools, it has sparked concerns about possible negative consequences, such as whether it will perpetuate biases.

Journalists shouldn’t immediately jump to the negative aspects of new AI tools, however, urged Emilse Garzón, a digital journalist and professor in Argentina specializing in AI and cybersecurity: “We have to talk about the consequences without being catastrophic when it doesn’t imply a real catastrophe, and not [catastrophizing] because of clickbait.”

Newsroom leaders should encourage discussions on GenAI across teams, Roy said. Organization-wide, team-specific training can help build institutional knowledge and improve workflows. 

Creating a Slack channel to share “AI wins” can help further encourage stronger AI practices, Roy suggested: “You’re creating a culture where people can be transparent and open about how they have been using generative AI if it falls within the guidelines.” 

Utilizing AI ethically

Transparent, responsible use of GenAI tools will promote their ethical use by others. “Technology doesn’t harm us,” Garzón said. “What harms us is how it’s used, who creates it and who endorses its use.” On an individual level, when Garzón uses GenAI tools like Midjourney or Canva’s AI-powered apps to generate images from text, she discloses this to her audience.

Newsrooms, Roy urged, should implement AI editorial guidelines to standardize best practices. Having rules in place can push back on taboos around using AI tools, make clear how people can use them, and address ethical concerns.

Guidelines should stress the importance of human involvement in GenAI-assisted content, including fact-checking before publication, and a zero-tolerance policy for plagiarism. AI should never be used to create stories that go straight to publication, Roy warned. “Everything has a right and a wrong use,” she said. “We have to understand how to use these [AI] tools in the right way.”

It’s up to individual newsrooms to define the scope of AI use among their staff: while the AP does not allow ChatGPT to be used in any published content, other newsrooms may bar GenAI only from authoring original reporting.

"Those guidelines don't have to be set in stone,” Roy said. They can be revised periodically based on how new technologies evolve and what each organization needs.

Fostering dialogue around AI

Journalists should be at the forefront of the many discussions around AI today, Garzón said.

Reporters can drive important conversations through their coverage: offering tips on how AI can be used to tackle disinformation, and unpacking why people should care about copyright, data protection, and who is behind AI-driven technologies.

Journalists should offer a variety of perspectives and nuance in their reporting on the topic, and not just focus on the positives or negatives, Garzón added. Their duty is to inform the public – not tell them what to think about it. “We ought to have among us those debates that sometimes escape and go beyond our idea of the editorial line,” she said. “What we have to ensure is that there is diversity of opinions, even if we don’t like those opinions.”

Through their exploration of AI technologies, journalists can also identify where GenAI is producing misleading information or biased outputs. This can promote a deeper understanding of how AI models work and how they could be improved, Roy said.

“We need to be informed citizens,” Roy said. “It is expected and important to do extensive research and contextualize stories about the development and evolution of today’s AI to show respect for the people who trust us.”
