Algorithms of Oppression


Algorithms of Oppression: How Search Engines Reinforce Racism is a 2018 book by Safiya Umoja Noble in the fields of information science, machine learning, and human-computer interaction.
Noble is a professor at the University of California, Los Angeles in the Department of Information Studies. She is a co-director of the UCLA Center for Critical Internet Inquiry and also works with African American Studies and Gender Studies. Her best-selling book, Algorithms of Oppression, has been featured in the Los Angeles Review of Books, the New York Public Library's 2018 Best Books for Adults, and Bustle magazine's "10 Books about Race to Read Instead of Asking a Person of Color to Explain Things to You." Her work examines the ways that digital media intersects with issues of race, gender, culture, and technology.

Overview

Algorithms of Oppression is based on over six years of academic research on Google search algorithms. Noble argues that search algorithms become racist because they reflect the biases and values of the people who create them. These algorithms can then carry negative biases against women of color and other marginalized populations, while also affecting Internet users in general by leading to "racial and gender profiling, misrepresentation, and even economic redlining." She also raises the issue of "technological redlining," in which this kind of profiling of users reinforces discrimination.

Chapter summaries

Chapter 1

In Chapter 1 of Algorithms of Oppression, Safiya Noble explores how Google Search's autosuggestion feature and results can be demoralizing. On September 18, 2011, a mother googled "black girls" attempting to find fun activities to show her stepdaughter and nieces. To her surprise, the results were dominated by pornographic websites and images. This result exemplifies the data failures specific to people of color and women, which Noble terms algorithmic oppression. Noble adds that, as a society, we must adopt a feminist lens with racial awareness to understand the "problematic positions about the benign instrumentality of technologies."
Noble also discusses how Google can remove human curation from the first page of results to eliminate any potential racial slurs or inappropriate imagery. Another example discussed in the text is the public dispute over the results returned when "jew" was searched on Google. The results included a number of antisemitic pages, and Google claimed little responsibility for the way it provided these results. Google instead encouraged people to search for "jews" or "Jewish people" and claimed that the actions of white supremacist groups were out of its control. Unless pages are unlawful, Google allows its algorithm to continue operating without removing them.
Noble reflects on AdWords, Google's advertising tool, and how it can add to the biases in Google's results. AdWords allows anyone to advertise on Google's search pages and is highly customizable. Google first ranks ads on relevance and then displays them on pages it believes are relevant to the search query taking place. An advertiser can also set a maximum amount of money to spend on advertising per day. The more an advertiser spends, the higher the probability that its ad appears near the top of the results. Therefore, an advertiser who is passionate about a controversial topic may, with enough spending, see it appear first in a Google search.
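
The ranking dynamic described above can be illustrated with a minimal sketch, assuming a hypothetical list of ads with made-up relevance scores and daily budgets (this is not Google's actual auction or the AdWords API): when ads are ordered by relevance weighted by spending, a well-funded advertiser can outrank a more relevant but lower-spending one.

    # Toy sketch (hypothetical data, not Google's actual AdWords auction):
    # rank ads so that higher relevance and higher spending rise to the top.
    ads = [
        {"advertiser": "A", "relevance": 0.9, "daily_budget": 10.0},
        {"advertiser": "B", "relevance": 0.6, "daily_budget": 50.0},
        {"advertiser": "C", "relevance": 0.7, "daily_budget": 25.0},
    ]

    def rank_ads(ads):
        """Order ads by relevance weighted by daily budget (a simplification)."""
        return sorted(ads, key=lambda ad: ad["relevance"] * ad["daily_budget"], reverse=True)

    for ad in rank_ads(ads):
        print(ad["advertiser"], round(ad["relevance"] * ad["daily_budget"], 1))
    # Output: B 30.0, C 17.5, A 9.0 -- the biggest spender ranks first despite
    # having the lowest relevance, mirroring the dynamic Noble critiques.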

Critical reception

Critical reception for Algorithms of Oppression has been largely positive. In the Los Angeles Review of Books, Emily Drabinski writes, "What emerges from these pages is the sense that Google’s algorithms of oppression comprise just one of the hidden infrastructures that govern our daily lives, and that the others are likely just as hard-coded with white supremacy and misogyny as the one that Noble explores." In PopMatters, Hans Rollman writes that Algorithms of Oppression "demonstrate[s] that search engines, and in particular Google, are not simply imperfect machines, but systems designed by humans in ways that replicate the power structures of the western countries where they are built, complete with all the sexism and racism that are built into those structures." In Booklist, reviewer Lesley Williams states, "Noble’s study should prompt some soul-searching about our reliance on commercial search engines and about digital social equity."
In early February 2018, Algorithms of Oppression received press attention when the official Twitter account for the Institute of Electrical and Electronics Engineers (IEEE) expressed criticism of the book, claiming that its thesis, as presented in the book's official blurb on commercial sites, could not be reproduced. IEEE's outreach historian, Alexander Magoun, later revealed that he had not read the book and issued an apology.