Your Algorithm Hates You

By: David Lemayian | 05/31/2019
Algorithms have bias baked into them, but users can do things to reclaim their digital space, says ICFJ Knight Fellow Lemayian (far left).

David Lemayian has been an ICFJ Knight Fellow since 2016 and works as the Chief Technologist for Code for Africa, an ICFJ partner. 

Some of the decisions algorithms make about our lives are fairly benign, such as those irresistible “Suggestions for you” on Netflix. But it gets far murkier when artificial intelligence (AI) and machine learning are used by businesses and governments for decision-making that affects our lives without us ever knowing about it. And worse, without us being able to appeal against those decisions.

Those using these pieces of code treat them as almost infallible. Banks and other lenders determine your credit score, companies and recruiters decide whether to hire you, and your insurer sets your premiums based on decisions made by AI.

As one study of algorithmic decision-making notes, “when considering the role of algorithms in decision-making we need to think not only of cases where an algorithm is the complete and final arbiter of a decision process, but also the many cases where algorithms play a key role in shaping a decision process even when the final decision is made by humans.”

But many of these pieces of software have bias baked into them, what Joy Buolamwini, founder of the Algorithmic Justice League, calls “the coded gaze.” It’s a bias that perpetuates injustice, and it also prompts a sense of fatalism: the view that we are powerless to do anything other than what we actually do in this AI-powered world.

“It is apparent that the ever-increasing use of algorithms to support decision-making, while providing opportunities for efficiency in practice, carries a great deal of risk relating to unfair or discriminatory outcomes,” as one review of the field puts it. Gender, race, tribe, and even location can result in a whole community being denied benefits, keeping it at a disadvantage.

Putting it plainly, algorithms can be racist. As American Rep. Alexandria Ocasio-Cortez has said, algorithms “always have these racial inequities that get translated, because algorithms are still made by human beings, and those algorithms are still pegged to basic human assumptions. They’re just automated. And automated assumptions — if you don’t fix the bias, then you’re just automating the bias.”
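Ocasio-Cortez’s point — that automating decisions automates the assumptions behind them — can be made concrete with a deliberately naive sketch. The hiring data and group names below are invented for illustration: a “model” that does nothing but learn the most common historical outcome for each group faithfully reproduces whatever bias those historical decisions contained.

```python
# Hypothetical illustration (groups and outcomes invented): a trivial model
# that learns the majority outcome per group from biased historical data
# ends up encoding, and automating, that same bias.
from collections import Counter

# Biased historical hiring decisions for equally qualified candidates.
history = [
    ("group_a", "hired"), ("group_a", "hired"), ("group_a", "hired"),
    ("group_a", "rejected"),
    ("group_b", "rejected"), ("group_b", "rejected"), ("group_b", "rejected"),
    ("group_b", "hired"),
]

def train(records):
    """Learn the most common outcome per group -- nothing else."""
    by_group = {}
    for group, outcome in records:
        by_group.setdefault(group, Counter())[outcome] += 1
    return {g: counts.most_common(1)[0][0] for g, counts in by_group.items()}

model = train(history)
print(model)  # the learned rule simply replays the historical disparity
```

Real machine-learning models are far more sophisticated, but the failure mode is the same: if the training data reflects discriminatory decisions, the model learns the discrimination along with everything else.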

It’s like a vicious AI knee-on-neck situation with the weight of the knee getting heavier with every run of its algorithm. And algorithms run a lot faster and affect a lot more people a lot more often than the Jim Crow laws ever did.

And this algorithmic bias, as Joy Buolamwini found out when going through a demo of a Hong Kong startup’s “social robot” that couldn’t detect her face, can “travel as quickly as it takes to download some files off of the internet.”

The startup used the same generic facial recognition software that she had previously used for an undergraduate assignment at Georgia Tech, where she discovered that it didn’t work on her face, and she had to get her (white) roommate to stand in for her. At the time, she figured “someone else will fix it.” After encountering the same failure on the other side of the world in Hong Kong, she knew that someone was going to have to be her. (For an excellent summary of how algorithmic bias comes about, see Karen Hao’s article.)

What can we do about algorithmic bias? If you’re a software developer or data scientist, IBM Research has an open source toolkit that helps you check bias in your data models.
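IBM’s toolkit, AI Fairness 360, reports standard fairness metrics over a dataset. As a library-free sketch of the kind of check such toolkits automate, the snippet below computes per-group approval rates and the disparate-impact ratio, P(favourable | unprivileged) / P(favourable | privileged); the loan-decision data is invented for illustration.

```python
# A minimal, library-free sketch of one fairness metric that bias-checking
# toolkits report: the disparate-impact ratio between two groups.
# All data below is invented for illustration.

def approval_rates(records, favourable):
    """records: list of (group, outcome); returns each group's favourable rate."""
    rates = {}
    for group in {g for g, _ in records}:
        outcomes = [o for g, o in records if g == group]
        rates[group] = sum(o == favourable for o in outcomes) / len(outcomes)
    return rates

decisions = [
    ("privileged", "approved"), ("privileged", "approved"),
    ("privileged", "approved"), ("privileged", "denied"),
    ("unprivileged", "approved"), ("unprivileged", "denied"),
    ("unprivileged", "denied"), ("unprivileged", "denied"),
]

rates = approval_rates(decisions, "approved")
ratio = rates["unprivileged"] / rates["privileged"]
print(f"approval rates: {rates}, disparate impact: {ratio:.2f}")
# A ratio well below 1.0 (a common rule of thumb flags anything under 0.8)
# suggests the unprivileged group is being treated unfavourably.
```

A real audit would control for legitimate factors before drawing conclusions, but even a ratio this simple can flag a model that deserves closer scrutiny.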

But it’s not just technologists who can do something about algorithmic bias. You can start reclaiming digital space through your choices in technology services. For example, use search engines like DuckDuckGo, which, unlike the voracious data vampire that is Google, doesn’t store your personal information to use for targeted ads.

You can also petition and lobby your government to adopt a governance framework for algorithmic accountability and transparency, one in which algorithmic literacy is introduced into curricula, and standardised notifications (communicating the type and degree of algorithmic processing behind a decision) are made a requirement.

Ultimately, we need to ask more of ourselves and tech companies. It’s not enough to just employ critical thinking – we also need to employ civic thinking in how we build and use these technologies.

This article first appeared in The Daily Maverick. 
