Understanding Deepfakes and How to Counter Them

By: Mya Zepp | 03/20/2023

From Snapchat face swap filters to U.S. President Joe Biden singing Baby Shark, manipulated media has proliferated in recent years. Deepfakes and other forms of synthetic and AI-assisted media manipulation are on the rise, and journalists tasked with sorting fact from fiction are forced to keep pace.

In a recent ICFJ Empowering the Truth Global Summit session, shirin anlen, a media technologist at WITNESS, shared guidance on navigating this landscape. WITNESS, a nonprofit that helps people use video and technology in support of human rights, runs a project called “Prepare, Don’t Panic,” focused on countering the malicious use of deepfakes and synthetic media, among other initiatives.

anlen offered tips for how journalists can better understand the threats posed by manipulated media, and what can be done to counter them. Here are some key takeaways from the session.

Technologies and their threats

Rapidly evolving technologies are allowing users to edit objects and facial features, animate portraits, transfer motion, clone voices, and more. As part of this ecosystem, deepfakes are a type of audio-visual manipulation that allows users to create realistic simulations of people's faces, voices and actions.

Deepfakes produced today are used in alarming ways. They have promoted gender-based violence, for instance, through the nonconsensual posting of sexual images and videos using a person’s likeness. Falsified videos of public figures have also been disseminated, as have fabricated audio clips. Deepfakes also benefit from the “Liar’s Dividend,” which places extra burden on journalists and fact-checkers to prove a piece of media’s authenticity, or lack thereof.

They are the most widely discussed form of manipulated media, anlen noted: “Deepfake itself is part of a larger generating landscape that we keep seeing more and more in the news.”

Although deepfakes are becoming more prevalent, they aren’t as common as one might believe. They require a significant amount of skill and knowledge to execute properly, making them difficult for the average person to create. Much manipulated media, as a result, doesn’t rise to the level of a true deepfake.

Filters that change a person's hair, eye color or voice, for instance, are lesser manipulations called “shallow fakes” that people may come across on a daily basis, especially on social media. AI-generated voice clips of made-up quotes from public figures are another example of shallow fakes.

“It's not really been used on a large scale,” anlen said about deepfakes. “Most of what we are still seeing in the media misinformation and disinformation landscape [are] shallow fakes, which are mostly contextual recycled materials.”

Detection

Every new technology has an Achilles’ heel, and deepfakes are no exception. Users can detect errors in appearance, for instance: generated images may have static in the background or teeth out of alignment, and the AI-generated human in a video might speak without the words properly matching mouth movements.

The technology adapts rapidly, however. “Research came out and said, ‘deepfakes don't blink, so it will be really easy to detect because it just doesn't blink,’ and then two weeks later, deepfakes started to blink,” said anlen. 

In a virtual cat-and-mouse game with manipulated media that has become higher quality and easier to access, detection efforts struggle to keep up.

“The first generation of fake faces all had eyes in the center. They were always in the center, so that's what the detection was looking for,” said anlen. “But now we have so many different variations of people that are being generated with different lighting, different expressions – and the eyes are not in the center anymore.”

There are also gaps in who has access to quality detection tools, anlen explained. While there are websites that anyone can access, these tools tend to be less effective. The most accurate detection tools remain available only to a select few deepfake experts.

Solutions

To spot deepfakes, journalists can review video content for glitches and distortions, apply existing verification and forensic techniques, and use AI-based detection tools where available.

An increase in media literacy tools and more training on manipulated media for journalists are also essential. 

“We need to prepare for it and we need to see it,” said anlen. “We need to understand the landscape in order to really shape the technology, shape how it's supposed to be built, how it's supposed to be regulated and be part of it – and not just be affected by it.”

Disarming Disinformation is run by ICFJ with lead funding from the Scripps Howard Foundation, an affiliated organization with the Scripps Howard Fund, which supports The E.W. Scripps Company’s charitable efforts. The three-year project will empower journalists and journalism students to fight disinformation in the news media.
