Algorithms of Oppression: How Search Engines Reinforce Racism
Run a Google search for "black girls" - what will you find? "Big Booty" and other sexually explicit terms are likely to come up as top search terms. But if you type in "white girls," the results are radically different. The suggested porn sites and un-moderated discussions about "why black women are so sassy" or "why black women are so angry" present a disturbing portrait of black womanhood in modern society.

In Algorithms of Oppression, Safiya Umoja Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. Data discrimination is a real social problem; Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of Internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color.

Through an analysis of textual and media searches as well as extensive research on paid online advertising, Noble exposes a culture of racism and sexism in the way discoverability is created online. As search engines and their related companies grow in importance - operating as a source for email, a major vehicle for primary and secondary school learning, and beyond - understanding and reversing these disquieting trends and discriminatory practices is of utmost importance.

An original, surprising and, at times, disturbing account of bias on the internet, Algorithms of Oppression contributes to our understanding of how racism is created, maintained, and disseminated in the 21st century.

248 pages, Paperback

First published February 20, 2018


About the author

Safiya Umoja Noble

2 books, 165 followers
In the Fall of 2017, Dr. Safiya Umoja Noble joined the faculty of the University of Southern California (USC) Annenberg School of Communication. Previously, she was an assistant professor in the Department of Information Studies in the Graduate School of Education and Information Studies at UCLA where she held appointments in the Departments of African American Studies, Gender Studies, and Education. She is a partner in Stratelligence, a firm that specializes in research on information and data science challenges, and is a co-founder of the Information Ethics & Equity Institute, which provides training for organizations committed to transforming their information management practices toward more just, ethical, and equitable outcomes. She is the recipient of a Hellman Fellowship and the UCLA Early Career Award.

Noble’s academic research focuses on the design of digital media platforms on the internet and their impact on society. Her work is both sociological and interdisciplinary, marking the ways that digital media impacts and intersects with issues of race, gender, culture, and technology design. She currently serves as an Associate Editor for the Journal of Critical Library and Information Studies. Safiya holds a PhD and M.S. in Library & Information Science from the University of Illinois at Urbana-Champaign, and a BA in Sociology from California State University, Fresno with an emphasis on African American/Ethnic Studies.

Research and Scholarly Interests:
- Search engine ethics
- Racial and gender bias in algorithms
- Technological redlining
- Socio-cultural, economic and ethical implications of information in society
- Race, gender and sexuality in information communication technologies
- Digital technology and Internet policy development
- Privacy and surveillance
- Information and/as control
- Critical information studies

Ratings & Reviews

Community Reviews

5 stars
1,056 (28%)
4 stars
1,535 (40%)
3 stars
883 (23%)
2 stars
215 (5%)
1 star
55 (1%)
Mario the lone bookwolf
805 reviews, 5,297 followers
January 18, 2020
The master algorithm seems to give its name an inglorious connotation.

WEIRD comes to mind: White, Educated, Industrialized, Rich, and Democratic. This problem from the humanities has also made it into the coding of algorithms. Historically, the problem was that much foundational research drew on too small and homogeneous a pool of people, and most of it was done by white, male, and wealthy researchers, so the results were not representative of the whole population.

This happened in two ways. On the one hand, prejudices, when they were not yet politically incorrect, flowed directly into pseudo-research. As emancipation and equality spread, only the indirect, unvoiced personal opinions remained. But the research leaders, professors, and chief executives still incorporated their conscious and unconscious worldviews into the design of questions, research subjects, experimental methods, and so on.

Secondly, this research was presented to an equally biased audience and carried out on equally biased test subjects. For a long time, much of the research has rested on these two foundations, and it still does indirectly today, because new research builds on older results. It is like trying to finally remove a bug from source code that evolved over the years with the bug as a central element of the whole system. The bug alone is not the only severe problem, but also the thousands of ramifications it left behind in all the later updates and versions. Like cancer, it has spread everywhere, which makes it expensive to impossible to remove all these mistakes again. A revision would be very laborious and would meet resistance from many academics who see dangers for their reputation or even their whole body of work. They would see their field of research and their specialties under attack, because nobody likes criticism, and that would be a hard one to swallow.

What does this have to do with the algorithms? The programming of software is in the same hands as in the example above. Certainly not in such dimensions, but inadvertent opinions may subconsciously flow into it, and the way search engines generate results is even more problematic. Especially in the last few years, deep learning, AI, big data, and GANs (Generative Adversarial Networks) have been integrated much more deeply into development, so that the old prejudices could begin evolving in the machines themselves without extra human influence.

This means that, in principle, no one can say decidedly anymore how the AIs come to their conclusions. The complexity is so high that even groups of specialists can only attempt timid approaches to reverse engineering. How precisely the AI became racist, sexist, or homophobic cannot be said anymore and, worse, it cannot be quickly repaired in hindsight, because the reaction patterns to a search input cannot be selected in advance.

There is a sad explanation: unfortunately, people are often driven by low instincts and false, destructive mentalities. When millions of people have focused their internet activity on aggressive hostility for decades, the algorithm learns to recognize their desires. There is a lot of money to be earned, and the AI is supposed to provide the users with what they want. Market forces determine the actions of the internet giants, and these give the public what it craves. Ethics and individualized advertising can hardly pursue the same goals. This is even more the case for news, media, and publishers, who suffer from the same problems as the algorithms - with the difference that they play irrelevant, rhetorical games to distract from the system-inherent dysfunctions.

The same problem exists with the automatic proposal function of online trade, which can inadvertently promote dangerous or aggressive behavior. Or with the spread of more extremist videos by showing new and similar ones, that are automatically proposed, allowing people to radicalize faster. The AI does its job, no matter what is searched for.

On a small scale, the dilemma has already been seen with language assistants and artificial intelligence degenerating in free interaction with humans. For example, Microsoft's intelligent, self-learning chatbot Tay was transformed into a hate-filled misanthrope by trolls within days. It is not difficult to imagine the dimension of the problem once the new technologies are widespread.

One of the ways to repair these malfunctions is time - people could reset the AIs by doing neutral, regular searches. That is too optimistic, so it is more likely we will have to find a technical solution before people become more rational. Either way, the search queries and the results would have to change significantly before any positive development could begin.

The academic search results should not be underestimated: the often false, non-scientific foundations on which many of the established sciences stand. Even if the users became reasonable, there would still be millions of nonsensical results and publications. These fake, antiquated edifices of thought, on which many foundations of modern society rest, must be the primary objective. The results are the symptoms, but that dangerous and wrong thinking is the disease. The long-unresolved history behind it, with all its injustices, has to be reappraised, because it is the reason for widespread poverty and ignorance rooted in wrong social models, so that innocent AIs get deluded by search requests.

And the effects of search input and search results are mutually reinforcing. The mirror they hold up to society reveals its hidden prejudices. Those prejudices feel safer in their secret corners because they are supposedly unrecognized, and populists subtly stoke them further for their own benefit. When wrong thinking has buried itself so deeply into a society, it also becomes part of all the products of that culture.

Kara Babcock
2,069 reviews, 1,540 followers
February 27, 2018
So you read So You Want to Talk About Race and now you have more questions. Specifically, you’re wondering how privilege affects your life online. Surely the Internet is the libertarian cyber-utopia we were all promised, right? It’s totally free of bias and discrimina—sorry, I can’t even write that with a straight face.

Of course the Internet is a flaming cesspool of racism and misogyny. We can’t have good things.

What Safiya Umoja Noble sets out to do in Algorithms of Oppression: How Search Engines Reinforce Racism is explore exactly what it is that Google and related companies are doing that does or does not reinforce discriminatory attitudes and perspectives in our society. Thanks to NetGalley and New York UP for the eARC (although the formatting was a bit messed up, argh). Noble eloquently lays out the argument for why technology, and in this case, the algorithms that determine what websites show up in your search results, is not a neutral force.

This is a topic that has interested me for quite some time. I took a Philosophy of the Internet course in university even—because I liked philosophy and I liked the Internet, so it seemed like a no-brainer. We are encouraged, especially those of us with white and/or male privilege, to view the Internet as this neutral, free, public space. But it’s not, really. It’s carved up by corporations. Think about how often you’re accessing the Internet mediated through a company: you read your email courtesy of Microsoft or Google or maybe Apple, and ditto for your device; your connection is controlled by an ISP, which is not a neutral player; the website you visit is perhaps owned by a corporation or serves ads from corporations trying to make money… this is a dirty, mucky pond we are playing around in, folks. The least we can do as a start is to recognize this.

Noble points out that the truly insidious perspective, however, is how we’ve normalized Google as this public search tool. It is a generic search term—just google it—and, yes, Google is my default search engine. I use it in Firefox, in Chrome, on my Android phone… I am really hooked into Google’s ecosystem—or should I say, it’s hooked into me. But Google’s search algorithms did not spring forth fully coded from the head of Zeus. They were designed (mostly by men), moderated (again by men), tweaked, on occasion, for the interests of the companies and shareholders who pay Google’s way. They can have biases. And that is the problem.

Noble, as a Black feminist and scholar, writes with a particular interest in how this affects Black women and girls. Her paradigm case is the search results she turned up, in 2010 and 2011, for “black girls”—mostly pornography or other sex-related hits, on the first page, for what should have been an innocuous term. Noble’s point is that the algorithms were influenced by society’s perceptions of black girls, but that in turn, our perceptions will be influenced by the results we see in search engines. It is a vicious cycle of racism, and it is no one person’s fault—there is no Chief Racist Officer at Google, cackling with glee as they rig the search results (James Damore got fired, remember). It’s a systemic problem and must therefore be addressed systemically, first by acknowledging it (see above) and now by acting on it.

It’s this last part that really makes Algorithms of Oppression a good read. I found parts of this book dry and somewhat repetitive. For example, Noble keeps returning to the “black girls” search example—returning to it is not a problem, mind you, but she keeps re-explaining it, as if we hadn’t already read the first chapter of the book. Aside from these stylistic quibbles, though, I love the message that she lays out here. She is not just trying to educate us about the perils of algorithms of oppression: she is advocating that we actively design algorithms with restorative and social justice frameworks in mind.

Let me say it louder for those in the back: there is no such thing as a neutral algorithm. If you read this book and walk away from it persuaded that we need to do better at designing so-called “objective” search algorithms, then you’ve read it wrong. Algorithms are products of human engineering, as much as science or medicine, and therefore they will always be biased. Hence, the question is not if the algorithm will be biased, but how can we bias it for the better? How can we put pressure on companies like Google to take responsibility for what their algorithms produce and ensure that they reflect the society we want, not the society we currently have? That’s what I took away from this book.

I’m having trouble critiquing or discussing more specific, salient parts of this book, simply because a lot of what Noble says is stuff I’ve already read, in slightly different ways, elsewhere—just because I’ve been reading and learning about this for a while. For a newcomer to this topic, I think this book is going to be an eye-opening boon. In particular, Noble just writes about it so well, and so clearly, and she has grounded her work in research and work of other feminists (and in particular, Black feminists). This book is so clearly a labour of academic love and research, built upon the work of other Black women, and that is something worth pointing out and celebrating. We shouldn’t point to books by Black women as if they are these rare unicorns, because Black women have always been here, writing science fiction and non-fiction, science and culture and prose and poetry, and it’s worthwhile considering why we aren’t constantly aware of this fact.

Algorithms of Oppression is a smart book about how colonialism and racism are not over. They aren’t even sleeping. They’ve just transformed, rebranded for the 21st century. They are no longer monsters under the bed or slave-owners on the plantation or schoolteachers; they are the assumptions we build into the algorithms and services and products that power every part of our digital lives. Just as we have for centuries before this, we continue to encode racism into the very structures of our society. Online is no different from offline in this respect. Noble demonstrates this emphatically, beyond the shadow of a doubt, and I encourage you to check out her work to understand how deep this goes and what we need to do to change it.

604 reviews, 169 followers
July 23, 2020
Noble provides plenty of instances where search engine algorithms (Google’s especially) have produced racially unfortunate and offensive results, has outrage aplenty about the structural racism and exclusions that exist in society at large, and doesn’t think solutions can come through capitalist structures. All true, albeit banal, with nothing new added here.

What the book doesn’t do is explain WHY these search engines produce such malign results: is it a bias in the algorithm? In the training set? In the personalization engine? Are these algorithms a camera of society’s pre-existing biases, or are they an engine that (re)produces such biases? We never get any insight as to causality, only the profound sense of outrage that these problems exist. (There’s also an unfortunate reifying rhetoric: what exactly does it mean to say that “The neoliberal political and economic environment has profited tremendously from misinformation and mischaracterization of communities”?)

Unsurprisingly, given that there is no causal account here, the prescriptive end of things ends up equally vague: there is some general demand for more “inquiry” and “attention” into issues of algorithmic oppression (based on a strawman argument that hardly anyone is paying enough attention to these issues), and some vague demands that tech companies hire more people with a degree in ethnic studies, but how exactly that will change things is unclear, given that there is no account of what is causing these search results in the first place.

Even her chapter-length “critique” of library and information science, for the way they encode categories from “dominant” and “privileged” social groups, ends with the confusing note that the one thing worse than these hand-coded categories is the effort to avoid categories altogether. Well, OK, but then what categories should we use? The closest Noble comes to suggesting an alternative is when she suggests, briefly, “In my own imagination and in a project I am attempting to build, access to information on the web could be designed akin to a color picker tool or some other highly transparent interface, so that users could find nuanced shades of information and easily identify the borderlands between news and entertainment, entertainment and pornographers, or journalism and academic scholarship.” But isn’t the whole problem that such classifications are unclear and unstable, quite aside from whoever is doing the coding?

In sum this is like many other books that strike a “critical X studies� posture: long on their account of the oppressive power structures, but short on any sense of what a viable alternative would be like, other than the implication that maybe the world would be a better place if people like the author were put in charge of things.
Jenny (Reading Envy)
3,876 reviews, 3,639 followers
November 19, 2020
I read this for a work book club, and I'm glad I did, but some of it I find problematic. It falls prey to the usual non-fiction risks: the argument stretched out to book length makes less sense than a shorter version in a blog post or article would have; at multiple points she adds content that is tangentially related but dilutes the primary objective; and because she had written on this topic well before the book was published, some of the issues she raised had already been sorted out. And the way the book ends - with omg Trump is president, fake news, everything will change... well, we already need an updated edition!

If it's true that how information is organized and accessed is beholden to who organized it and wrote the algorithm code for it, which I think she argues pretty strongly, then it is important to understand the underlying white supremacy in those structures and understand the problems they cause. I think she argues this point best, but the solutions are harder to find. She also doesn't close the loop about what it is that Google actually *is* doing - we couldn't figure out in my book club if this is because she simply doesn't know or she doesn't want to be liable for lawsuits of some kind. It does take some of the punch of the argument away.

The longest chapter, the one I suspect was formerly an article, is about how a search in Google for "black girls" brought back primarily sexualized content. She also shows that the very fact that this is no longer true, largely due to the issue she raised in social media, shows that Google has control over what is displayed and how results come back. Many people think Google shows the best answers but it's a lot more complicated than that - they are trying to use what they know about you and your location to provide the answer most people similar to you are looking for (try looking for the "first woman astronaut" if you don't believe me, and let me know if you find Sally Ride or not... she isn't, by the way.)

She discusses the racism and sexism of the Library of Congress classification system as well as Dewey Decimal, but some of her examples are misleading, presented as more universal than they actually were. No need to exaggerate; the racism and sexism are clear enough. She also doesn't explain, other than white supremacy, why the system is the way it is in the first place. I'm not sure if I think the strategy is to tackle subject headings individually; perhaps a new system would be better.... I wonder whose job it is to keep track of subject headings that change, and if anything about the historical access to information changes if we cover it up with new lingo. (Cookery is now cooking; can I still find the books that were catalogued under cookery? I'm not quite clear... not a cataloger.)

Where she really lost me is in her library database examples. She used ARTstor to show how searches prioritize white artists but in reality her searches just display a lack of understanding of how the database searches. You can't search for the words [black history] without telling the database what to do; it isn't like Google where it makes guesses and assumptions - unless you use a phrase search ["black history"] it adds an OR in between the words, which was obvious from the millions of results she got back. All three ARTstor examples were bad searches, not necessarily skewed results. (She has a PhD in information science so is this an intentional oversight so as to have clear examples? Confirmation bias? Can she really be in this field and not know about such things as phrase searches and boolean operators?) I would think these clunky databases are the anti-Google - they value all information equally and go looking for it based on the information you give them - and you have to be specific! (A better examination might have been in which contents were selected in the first place, since the study of art in the USA is probably largely Euro-centric, but she doesn't really go there.) Our database world is clunky, but maybe it's more about the content than the algorithm?
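The implicit-OR behavior the reviewer describes can be sketched with a toy boolean retrieval model. The documents and the query-parsing rules below are invented for illustration; this is not ARTstor's actual engine, just the general convention that unquoted terms are ORed together while a quoted query requires the exact phrase:

```python
# Toy illustration: many library databases treat [black history] as
# (black OR history), while ["black history"] is a phrase match.
# Documents and parsing rules here are invented for the example.

docs = [
    "a survey of black history in american art",
    "a history of dutch landscape painting",
    "black and white photography of the 1960s",
]

def matches(doc, query):
    words = doc.split()
    if query.startswith('"') and query.endswith('"'):
        # Phrase search: the exact word sequence must appear.
        phrase = query.strip('"').split()
        return any(words[i:i + len(phrase)] == phrase
                   for i in range(len(words)))
    # Unquoted terms: implicit OR, so any single term matching suffices.
    return any(term in words for term in query.split())

or_hits = [d for d in docs if matches(d, "black history")]
phrase_hits = [d for d in docs if matches(d, '"black history"')]

print(len(or_hits))      # all three documents match black OR history
print(len(phrase_hits))  # only one contains the exact phrase
```

Under these rules the unquoted query matches every document (each contains "black" or "history"), while the phrase query matches only one, which is exactly the explosion of results the reviewer says Noble's searches produced.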

I did get a lot out of the discussion, and most of it related to the original Google content and my brilliant co-workers. I know librarians have a reputation of being anti-Google whether or not we are, but it is also true that most people using Google don't really know what it's doing to provide the results you seek. I'm not sure you'll have any better idea after finishing this book, which is why I can't give it more than three stars, unfortunately. I'm going to seek out more on these topics for my own understanding - one librarian had read and I know there are a few others that look at similar topics from a feminist lens as well. We discussed other ways we've seen Google taught and what might help students grow in their understanding (after we've also grown in our own.)
Ico Maly
Author, 13 books, 82 followers
April 19, 2018
I must admit, I was very eager to read this book. A very necessary topic that really deserves to be tackled in a very thorough manner. Racism, sexism and discrimination in general are structural parts of societies in the West and we know that the net is no exception here.

I was really interested in finding out exactly 'how Search Engines reinforce Racism'. That they do is by now well established. We know, for instance, how the alt-right uses the internet and the algorithms of commercial platforms to push their racism into the world.

Safiya Umoja Noble is an assistant professor in Information Studies at the University of California, Los Angeles. Considering this title, I was expecting to find a methodologically well-established analysis of the algorithms of Google and how they contribute to the spread and rise of racism and sexism.

My enthusiasm was severely tempered when I started reading and found that the methodology used was shaky at best. Her method consists of 'Googling' words like "black girls" and analyzing the results of that Google session within a fixed framework of Black Feminism. Even though I'm greatly sympathetic to feminism and antiracism as scientific fields, I'm missing a close analysis of the data.

Yes, "black girls" gave pornified results, reflecting the huge commodification of women in the porn industry. But similar results appeared if you searched for "latina girls" and even "white girls". Maybe more relevant, they no longer appear (at least not in Belgium). The point is that, methodologically speaking, this is shaky. Noble does not compare results with similar terms. She does not even try to Google in different settings (different browsers, different privacy settings, different computers, different countries, ...). In short, you do not really know why you see what you see if you do not control your search (and even then, searching as a method is shaky).

The author clearly looks at Google as all-powerful (and thus the force to blame); she dismisses (probably correctly) the idea that Google Search merely reflects societal biases and racism. Google itself is to blame because 'they are the owners of their own algorithms'. True of course, but she nowhere makes it stick that Google's algorithms are built as algorithms of oppression (as if they were racist themselves because of the lack of diversity on the floor).

What does stick is that Google ranks things high up the list if (1) they are popular (get lots of links), (2) there are actors building sites with SEO in the back of their minds, and (3) Google tries to intervene as little as possible. I think her argument that Google could and should edit the search results is powerful. Yes, Google can do that, and yes, they should. Racism is not an opinion. And Google's algorithms are thus not neutral: they not only show what is popular, they make things popular. They should therefore take up 'editorial responsibility'.
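The link-popularity signal the review points to is, at its core, the PageRank idea: a page ranks higher when well-linked pages link to it. A minimal sketch of the textbook algorithm follows, with an invented four-page link graph and the conventional 0.85 damping factor; Google's production ranking uses far more signals than this:

```python
# Minimal PageRank power iteration over a toy link graph.
# Pages that receive many links from well-linked pages rank higher,
# which is the "popularity" signal the review describes.

links = {            # page -> pages it links to (invented graph)
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def pagerank(links, damping=0.85, iters=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Every page keeps a small baseline, plus shares of the rank
        # of each page that links to it.
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            share = rank[p] / len(outs)
            for q in outs:
                new[q] += damping * share
        rank = new
    return rank

ranks = pagerank(links)
# "c" collects links from a, b, and d, so it ends up ranked highest.
print(max(ranks, key=ranks.get))
```

This also makes the review's "they make things popular" point concrete: a page ranked high gets seen and linked more, which feeds back into its rank on the next crawl.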

The problem with this book is that there is more 'moral claiming', than analysis of data. I'm highly sympathetic to the cause. I think that the concept of Algorithms of oppression has enormous potential, but it deserve a much deeper analysis in how exactly 'commercial algorithms' can be turned into Algorithms of oppression in the interplay between activists, Google and the broad audience.
Trish
1,413 reviews, 2,683 followers
February 11, 2018
Noble began collecting information in 2010 after noticing the way Google Search and other internet sites collect and display information about non-white communities. Her results dovetail with other studies positing that algorithms are flawed by their very nature: choosing & weighting only some variables to define or capture a phenomenon will deliver a flawed result. Noble wrote this book to explore reasons why Google couldn’t, or wouldn’t, address concerns over search results that channel, shape, and distort the search itself, i.e., the search “black girls” yielded only pornographic results, beginning a cascade of increasingly disturbing and irrelevant options for further search.

In her conclusion Noble tells us that she wrote an article about these observations in 2012 for a national women’s magazine, Bitch, and within six weeks the Google Search for “black girls” turned up an entire page of results like “Black Girls Code,” “Black Girls Rock,” and “7-Year-Old Writes Book to Show Black Girls They Are Princesses.” While Noble declines to take credit for these changes, she continued her research into the way non-white communities are sidelined in the digital universe.

We must keep several things in mind at once if the digital environment is to work for all of us. We must recognize the way the digital universe reflects and perpetuates the white male patriarchy from which it was developed. In order for the internet to live up to the promise of allowing unheard and disenfranchised populations some voice and access to information they can use to enhance their world, we must monitor the creation and use of the algorithms that control the processes by which we add to and search the internet. This is one reason it is so critical to have diversity in tech. Below find just a few of Noble's more salient points:

We are the product that Google sells to advertisers.

The digital interface is a material reality structuring a discourse, embedded with historical relations...Search does not merely present pages but structures knowledge...

Google & other search engines have been enlisted to make decisions about the proper balance between personal privacy and access to information. The vast majority of these decisions face no public scrutiny, though they shape public discourse.

Those who have the power to design systems--classification or technical [like library, museum, & information professionals]--hold the ability to prioritize hierarchical schemes that privilege certain types of information over others.

The search arena is consolidated under the control of only a few companies.

Algorithms that rank & prioritize for profits compromise our ability to engage with complicated ideas. There is no counterposition, nor is there a disclaimer or framework for contextualizing what we get.

Access to high quality information, from journalism to research, is essential to a healthy and viable democracy...In some cases, journalists are facing screens that deliver real-time analytics about the virality of their stories. Under these circumstances, journalists are encouraged to modify headlines and keywords within a news story to promote greater traction and sharing among readers.

An early e-version of this manuscript obtained through Netgalley had formatting and linking issues that were a hindrance to understanding. Noble writes here for an academic audience, I presume, and as such her jargon and complicated sentences are appropriate for communicating the most precise information in the least space. However, for a general audience this book would be a slog, something not true if one listens to Noble (as in the attached TED talk linked below). Surely one of the best things this book offers is a collection of references to others who are working on these problems around the country.

The other best thing about this book is an affecting story Noble includes in the final pages of her Epilogue about Kandis, a long-established black hairdresser in a college town trying to keep her business going by registering online with the ratings site, Yelp. Noble writes in the woman’s voice, simply and forthrightly, without jargon, and the clarity and moral force of the story is so hard-hitting, it is worth picking up the book for this story. At the very least I would recommend a TED talk on this story, and suggest placing the story closer to the front of this book in subsequent editions. For those familiar with Harvard Business Review case studies, this is a perfect one illustrating issues of race.

Basically, the story is as follows: Kandis's shop became an established business in the 1980s, before the fall-off in black scholars attending the university "when the campus stopped admitting so many Blacks." To keep those fewer students aware that her business provided an exclusive and necessary service in the town, she spent many hours finding a way to have her business come up when “black hair” was typed in as a search term within a specified radius of the school. The difficulties she experienced illustrate the algorithm problems clearly.
“To be a Black woman and to need hair care can be an isolating experience. The quality of service I provide touches more than just the external part of someone. It’s not just about their hair.”
I do not want to get off the subject Noble has concentrated on with such eloquence in her treatise, but I can’t resist noting that we are talking about black women’s hair again…Readers of my reviews will know I am concerned that black women have experienced violence in their attitudes about their hair. If I am misinterpreting what I perceive to be hatred of something so integral to their beings, I would be happy to know it. If black hair were perceived instead as an extension of one’s personality and sexuality without the almost universal animus for it when undressed, I would not worry about this obsession as much. But I think we need also to work on making black women recognize their hair is beautiful. Period.

By the time we get to Noble’s Epilogue, she has raised a huge number of discussion points and questions which grew from her legitimate concerns that Google Search seemed to perpetuate the status quo or service a select group rather than break new ground for enabling the previously disenfranchised. This is critically important, urgent, and complicated work and Noble has the energy and intellectual fortitude needed to work with others to address these issues. This book would be especially useful for those looking for an area in the digital arena to piggyback her work to try and make a difference.

Richard Derus
3,605 reviews · 2,180 followers
April 24, 2022
Real Rating: 4.5* of five, rounded up because dry reading is still important reading

I RECEIVED A DRC FROM THE PUBLISHER VIA EDELWEISS+. THANK YOU.

My Review
: The world, as the Internet has shaped it, took a promise of information access and educational opportunity unparalleled in human history and screwed it up to the point it reinforces the evils and stupidities it could so easily have alleviated.

The problem, it transpires, is both blindness..."*I* am no racist, or a sexist! Why, some of my best friends..." is not new, nor is it uncommon in any society...and neither is hubristic malevolence (Cambridge Analytica, for example). We're two decades into a giant, uncontrolled social experiment. Voices like Author Noble's are still notable for how rarely they gain prominence in the rarefied world of Congressional hearings and the European Union's creation of the GDPR.

The issues that Author Noble raises in this book need your attention. You, the searcher, are the product that Google and the other search engines are selling to earn their absurd, unconscionable, inadequately taxed profits. Every time you log on to the internet, Google knows...use other search engines, never click on any links, and Google still knows you're there. That's the Orwellian nightmare of it...like East Germany's Stasi, they're everywhere, in every website you visit. Unlike the Stasi, they are possessed of the capacity to quantify and analyze all the information you generate, and sell it to anyone who can use it. For you or against you, as long as the check clears, Google and its brethren couldn't care less.

(There are links to information sources in the blogged version of this review at .)
Mariella
466 reviews · 6 followers
July 5, 2018
Interesting, and I'm so appreciative of Safiya, who got the conversation about biased algorithms started a few years ago. But this book would have worked better as an article because it was extremely repetitive.
Mehrsa
2,245 reviews · 3,601 followers
June 28, 2018
I would give the book 5 stars for how important this research is. But the book is not easy to read. It feels like it's written just for academics. Besides the bulky language, there are a lot of references to other work and literature. It's also a bit unfocused and disorganized. However, she is absolutely right that this is a huge problem that people are not paying attention to. It's a really hairy problem that needs more books like this to dissect it and to make tech aware of its power in reinforcing racism.
Bethany (Beautifully Bookish Bethany)
2,626 reviews · 4,526 followers
January 20, 2022
Algorithms of Oppression raises some important questions about Google search and how it reinforces and is formed by racist, sexist, and homophobic beliefs. Note that this book is primarily a theoretical text, seeking to open lines for future research in related fields. It is also VERY academic and assumes a lot of background knowledge from the reader in terms of communication theory, feminist theory, and more. I have enough of an academic background in those things to understand it, but the typical reader might find this both dense and frustratingly surface level because it is so theoretical in scope rather than practical.

We can talk about this book as it relates to the average reader, and as it relates to academia. Thinking about the average reader picking it up because it's an intriguing title, it's possible this could be a useful primer on the existence of structural inequalities on the internet. I can't say I found much in this book to be particularly surprising, but I think for readers who are unfamiliar with how to properly conduct online research, with how Google actually works from a business perspective, or with ideas about how human biases impact AI and machine learning algorithms, this could be a useful introduction. The author uses specific cases and some textual analysis to illustrate the problems, for instance the fact that not too long ago a search for "Black girls" on Google mostly yielded pornographic results.

As an academic text, I think there are valuable questions raised here that scholars in related fields should pursue in more practical ways. Specific research studies into some of the phenomena mentioned, plans for policy changes, experimental studies... this book offers a wealth of possibilities for really important things we should be considering and touches on the real world implications of doing nothing. I do think some of the chapters make assumptions that go too far.

In discussing pornography, the author is primarily opposed to it on the grounds that it objectifies (specifically) Black and brown women's bodies. She touches briefly on the idea that women have some agency in their own sexualization, but is fairly dismissive of it. I think the discussion needs added nuance to differentiate between harmful, predatory, misogynist instances of it, versus the validity of women choosing sex work in different forms for a variety of reasons.

What's interesting is she recognizes how wrong it is for women to lose their jobs because images of them when they were younger are found online (this is definitely wrong), but doesn't seem to link that to the idea that normalizing and creating safety protocols around sex work might actually be a better fix. This is a bigger conversation, and as the author says herself in the introduction, this book is partly a product of the time it was researched and written, about the internet which moves very quickly.

Since that time Black feminist writers like Mikki Kendall and Sesali Bowen (among others) have written compellingly about the need to stop viewing sex work as shameful. Since that time we have had a global pandemic and seen a major rise in women engaging in virtual sex work by choice. Things are shifting and while that's not the primary thrust of this book, it is a part of it that feels out of sync.

I think there is absolutely value to this book and it's worth reading. Just recognize that it should be viewed as a primer and a theoretical text- as a starting point, not an end point.
مجید اسطیری
Author · 8 books · 527 followers
December 17, 2022
Probably many of us assume that the internet is a democratic space, or at least more democratic than other media, but Ms. Safiya Noble challenges this mistaken assumption in her book and presents many examples of oppressive, stereotyped, and dictated behavior by internet algorithms.
Although the book's main focus is on search engines, and Google in particular, the general remarks the author makes in the introduction and conclusion of each chapter apply just as well to all online algorithms, and you can easily read Twitter, Instagram, Facebook, and so on in place of Google.
One forgivable shortcoming of the book is its heavy focus on women and people of color, who, as the statistics attest, suffer far more from oppressive algorithms. Here too, the examples the author gives of these two groups being ignored or commodified are highly generalizable. In effect, women of color (the author herself being one) serve as a lens for examining what search engines are and how they operate. The book does not take a very philosophical view of the matter; it is mostly concerned with reclaiming trampled rights and freedoms.
In any case, reading this book is essential at a time when we are, in the precise sense of the word, under bombardment by oppressive algorithms.
I was thinking about the philosophical "brain in a vat" experiment associated with Descartes. I realized we can no longer even say: I think, therefore I am! It really is no longer we who do the thinking.

A few excerpts from the book:
"Contrary to our widespread faith in the internet as a democratic space where people have equal power to participate, the internet is in fact organized to the benefit of powerful elites, including corporations that want to increase their sales or make people's searches end up at their websites."

"The political nature of internet search proves that an algorithm is nothing but a fundamental intervention by computer scientists in reality."

"Some internet users have more agency and rule over the internet, quite contrary to the utopian, optimistic view of the internet as a socially equalizing, democratic force."

"People undoubtedly lack algorithmic literacy and need to know more about it. But all of the digital platforms I examine in this book are private; consequently, even if we had algorithmic literacy, we still could not act on these private, corporate platforms."

"What carries out Google searches is not an all-capable data-processing machine. Indeed, Google Search is an advertising company, not a credible information company. At the very least we should ask ourselves: since when did we decide that Google surfaces the best information? Best for whom? Who is the target audience for the content on offer? Are we in a 'filter bubble'?"
Krystal
387 reviews · 24 followers
October 24, 2017
Dr. Safiya Umoja Noble deconstructs the myth of online democracy with this brilliant illustration of how white male supremacy has adapted its course to the ever expanding reaches of the internet!
Adam Schweigert
60 reviews · 15 followers
February 18, 2019
Raises some good questions but is woefully light on proposed solutions. Mostly it seems like the author is not happy about how advertising supported online services work (fair) but doesn't really get very far in proposing anything resembling a commercially viable solution. The book also tends to wander a lot, repeat itself and use a lot of academese to obscure how half-formed some of these ideas are. Wish I could recommend it. It's an important topic but this just didn't deliver.
1 review
November 23, 2018
Unresearched and uncultivated writing in many ways. I found this book to be of little value both from an SEO/algorithmic-validity point of view, as well as (most alarming given the author's background) from a sociological point of view, its analyses being far too simplistic. Nor does it extend well to offline writings. Full of claims that cannot be proved and logical fallacies, this is a book that I could not honestly recommend.
Gabriella
446 reviews · 310 followers
January 20, 2025
Actual Rating: 2.75 stars

Dated, but still relevant! January 2025 was a very interesting time to read Safiya Noble’s Algorithms of Oppression: How Search Engines Reinforce Racism. In some ways, this book felt very relevant to our current dissatisfaction with search engines, and in other ways, I felt like our current conversations about AI have rapidly eclipsed the consequences discussed in this book (which is only 7 years old!)

It’s funny to read this in the middle of the TikTok conspiracy, because right before Trump’s stunts to “save” the app took form, I saw many (young) people bemoaning the loss of our preferred search engine. For the exact reasons that Noble mentions in her book (and particularly the story in the conclusion), many of us rely on TikTok or Reddit as more reliable search pathways than Google. Whether this is truly the case or not, the former platforms are perceived to have search algorithms that are less influenced by the desires of advertisers, and more directed by the opinions of our peers.

I think this shows just how the years have supported Noble’s central arguments about the limits of commercial search engines, and the broader doom that uncritical embrace of technology can cause. A growing number of people I know now firmly believe that tech monopolies are threats to democracy, that AI is a major human rights issue, and that commercial algorithms create many ethical and logistical challenges for their users. In some ways, this meant that reading this book felt like Noble was preaching to the choir. It’s likely that I am just not the intended audience for Algorithms of Oppression—it feels written to warn tech optimists, librarians, and other information workers about the long-term consequences these tools can have if they are uncritically embraced. However, in 2025, this is an argument I didn’t need to be convinced of—it’s evident in nearly every facet of our lives!!!

In Noble’s defense, this book is still a helpful source of specific details on Google/Alphabet’s monopoly on search, even for people who are already prone to question the company. I will try to go chapter-by-chapter to describe some of the main things I learned.

Chapter 1: A Society, Searching
In Chapter 1, Noble sets up the core argument of the rest of her book: Google creates advertising algorithms, not information algorithms. Because of this, the search engine isn’t just reflecting back people’s bigotry, it’s helping influence it—because doing so is profitable! This was a really helpful concept for me to understand, as I am often puzzled by how many people I know have been sent down incel and hypergamy rabbit holes (otherwise known as alt-right pipelines) with more and more content that leaves them sounding incredibly deranged. Years before all of this really kicked off, Noble provides a helpful answer. Because search engines (particularly Google) have now become our primary information source and portal to the internet, their algorithms and the places they guide us to take on outsized power in our information gathering processes. Noble shows us that this wasn’t always the case, something my dad and aunt later confirmed in a conversation we had about this book.

In their youth (70s, 80s, and 90s), my relatives shared with me that they used to ask teachers, librarians, and other knowledge keepers before consulting the encyclopedia for answers. Even after consulting the encyclopedia, they might have to search several related resources before finding the exact answer they sought—in the process being connected to many different viewpoints. By contrast, the information gathering process is deceptively “easy” in my lifetime. For each assignment I’ve done, I’ve used a commercial search engine during at least one part of the process. I’m not saying that the sources of old were perfect, but my modern over-dependence on a single information source (Google) is certainly troubling. Noble shares this concern, noting that part of the issue is that people don’t even realize commercial search engines aren’t objective portals to the internet. In one study, Noble shares that many people couldn’t even tell the difference between paid advertising and “genuine” results. And of course, even the “genuine” results are influenced by SEO metrics to also participate in the advertising algorithmic game. Spooky stuff, all around!!!

Chapter 2: Searching for Black Girls
Not going to lie, this chapter was disappointing. Noble focuses on how many Google results for “black girls” led to porn sites, how there’s a lack of social sciences training amongst Google’s workforce, and how Black Girls Code programs aren’t going to fix any of this. I did enjoy her exploration of the question “if Google isn’t responsible for its algorithm, who is?” Unfortunately, her answers are just too convoluted—her writing style is prohibitively circuitous, even for an academic. If there was a more serious inquiry in this chapter than “we need more humanities education for STEM majors” and “we need better algorithmic representation of Black women”, I couldn’t find it.

In Algorithms of Oppression, Noble constantly pontificates about what she plans to reveal to us, only to waste countless paragraphs on much smaller revelations. One example is the following quote on page 108: “Whether or not one cares about the specific misrepresentations of women and girls of color or finds the conceptual representations of teenagers, professors, nurses, or doctors problematic, there is certain evidence that the way that digital media platforms and algorithms control the narrative about people can have dire consequences when taken to the extreme.”

From this quote, I was hoping that the rest of the book would discuss these dire consequences, which connects to the issues she raised in the introduction—the human rights concerns of AI, and the truly nefarious deeds of the commercial search companies. Unfortunately, Noble never makes good on these promises by truly expanding her research to fully address these issues.

Chapter 3: Searching for People and Communities and Chapter 4: Searching for Protections from Search Engines
I lumped these two together as they’re much shorter chapters than 1 and 2, and mostly relate to each other in scope. They both provide “real life examples” of the dangers of commercial search, but still mostly sensationalized examples that don’t really speak to Big Tech’s role in labor exploitation and war profiteering.

Chapter 3’s example discusses how Dylann Roof was radicalized through alt-right websites and information, a path that started with a Google Search and ended with the murder of 9 worshippers at Mother Emanuel AME Church in Charleston. While this is a tangible example of how the representation issues discussed in Chapter 2 can lead to very dire consequences, I just didn’t buy Noble’s argument that search really LED to this outcome. It might have made it easier, but bigots have existed and performed hate crimes for long before the internet existed. I just didn’t believe that this mass murderer really wouldn’t have been able to carry out his terror without Google.

Chapter 4 also includes more real-life dire consequences: people being fired or harassed at work due to previously doing porn (including revenge porn), and having no way to wipe said porn from the internet. This sort of algorithmic oppression, where Google is incentivized to keep nonconsensual porn results in order to boost its web traffic, prevents the victims from having “the right to be forgotten.” I did think this was a very interesting topic, especially locally (NC is now one of the states that age restricts porn sites, leading to a whole other dumpster fire of debate.) This right to be forgotten is even more complicated by many people’s growing desires for accountability processes that require someone’s history to be available for access. It’s hard to say which side is right—I do get concerned about abusers and cult leaders rebranding and people forgetting about it, for instance. But, I also truly want revenge porn victims to have the ability to recreate themselves, and don’t see a way to get one without the other. This is an area where I think Noble was right to leave the questions up to the reader.

My favorite example of search ethics actually comes closer to the end of Chapter 4, where Noble introduces the debate about digitizing previously niche records. She argues that maybe everything shouldn’t be digitized, because some people consented to their inclusion in publications with the understanding there would be a very niche audience of viewers, mostly within their IRL communities. The internet makes that niche audience far less certain! The case used to explain all this is the digitization of On Our Backs, a lesbian erotica magazine that was also mentioned in another book I read this month, Krista Burton’s Moby Dyke: An Obsessive Quest to Track Down the Last Remaining Lesbian Bars in America. For this argument, Noble references librarian Tara Robertson’s criticism of the OOB digital archive, entitled . I’d really recommend reading Robertson’s blog post in addition to this chapter of Algorithms of Oppression, because it provides more context for how communal zine creators, lesbian pornographers, and other politically-minded artists are trying to protect their work in the digital age.

Chapter 5: The Future of Knowledge in the Public
This chapter is all about library and information sciences, which makes sense given Noble’s academic background and intended audience for this book. I was fortunate enough to have a buddy reader who is a current library worker and MLIS student. Reading this with them helped Noble’s points to take on a bit more meaning and relevance to my friend’s work experiences. However, outside of this sort of context, I’m not sure that I would recommend this chapter to general library patrons like myself.

Prior to reading this, I’d say I understood Noble’s argument that classification systems are socially constructed and thus “hold the power biases of those who are able to propagate such systems” (136). However, after reading this chapter, I don’t think I felt incredibly enlightened by a deeper understanding of how search algorithms are drawing on the racist classification systems created by librarians of the past. I could see how such a criticism could be helpful in a course that is trying to address . But outside of this audience, it may not be as worthwhile.

To not be toooo negative, I will say that Noble’s connection of the information sciences concept of the “average reader” to the tech concept of the “universal human” is brilliant. This reference connects Chapter 5 and Chapter 1 topics neatly, and shows how both biased concepts fail to create non-oppressive search portals. If nothing else, Noble is helping to rid readers of any notion they have that tech choices are neutral. As she notes on page 148: “Commercial search, in the case of Google, is not simply a harmless portal or gateway; it is in fact a creation or expression of commercial processes that are deeply rooted in social and historical production and organization processes.” In other words, a corporate search algorithm isn’t just a blank vessel to our information journey, it’s a commodity customized to the interests of its shareholders and deeply influenced by the flawed organizational systems and society that comes before it. This is a helpful reminder, for sure!!!

Chapter 6: The Future of Information Culture and Conclusion
Noble concludes this work with more notes about how search engines are the “primary portals” to our internet access, and the internet is supposed to be the main communication source of the future. However, she shares her concern that these portals are relatively unregulated compared to other sources of media, and there is no real solution to this in sight. In addition to sharing these concerns about the future, Noble does include some minor references to the human rights injustices inflicted on Black people in other countries, like Congo and Ghana, in service of these tech conglomerates. Unfortunately, these references are “too little, too late”, and they are also followed by a callous comparison of child slavery to Black Americans’ misrepresentation within the algorithm. Like, someone not being able to Google their community accurately or having their dance stolen on TikTok is bad, sure, but I feel like those sorts of issues shouldn’t even be described in the same paragraph as the other injustices. But apparently, Noble disagrees. I just don’t get that at all!! Finally, I did enjoy the anecdote from the hairstylist that is discussed in the Conclusion. Unfortunately though, I wished Noble had found someone who was having similar challenges with Google and not Yelp—it just seemed like a bit of a diversion from the main platform she was criticizing in the rest of the book.

Final Thoughts
I was explaining my complaints with this book to my historian friend, and they made the great point that Noble intended this book as a warning to people who are incredibly tech-optimistic, and charging full-speed ahead into more entanglement with AI and commercial search algorithms. I think Noble does succeed in warning this group, but even outside of the audience, the main issue is that her points take sooooo much patience to wade through. I’m all for a challenge, but this just felt like an unnecessary challenge as a reader. So many times, I wanted to shake Safiya Noble by the shoulders and tell her to GET TO THE POINT, like seriously spit it out!!! Stop saying what you’re GOING to do and just DO IT IN THIS PARAGRAPH. Ugh!!!!

However, if you are a more patient reader than I am and don’t tend to be bothered by academic dillydallying, maybe check this one out. I can say that it does already have me switching up some of my search patterns, and trying to use routes to information that includes humans, physical books, and of course, TikTok. I still may not be succeeding in this entirely, but Noble certainly drove home the importance of not ceding our entire information processes to Google.

So, at the long end, I would say this wasn’t a complete waste of time, just a book that is not as groundbreaking in 2025 as it would’ve been in 2018. Most of the arguments Noble breaks down at an excruciating pace are self-evident to most critical internet users of today, and we now have other options to engage with these arguments. For example, I think Jane Pek’s 2022 mystery novel, The Verifiers, makes similar arguments in a much more enjoyable format. I understand that many people still appreciate an academic read, so I won’t say that no one should check this book out. However, if you aren’t a library sciences student or a glutton for punishment, I definitely wouldn’t start here.
Walter Ullon
318 reviews · 153 followers
January 8, 2023
TL/DR: bloated article about other people's thoughts on the matter. Zero understanding of search algos and popular culture in general. Ridiculous solutions. If you'd like to know more about bias in algorithms and modern tech, look into "Weapons of Math Destruction" and "Technically Wrong", which were actually written by qualified (tech) professionals in the field.

Seriously, I gave this one a chance but the book is a tease in the first half and a disappointment in the second.

The writing is obtuse, riddled with circular arguments, name-dropping, and the use of other people's better arguments about the very thing you're supposed to be an expert about. Which is a shame because it is an important subject that deserves better treatment. Noble's arguments ultimately devolve into "trust me, I'm right because such-and-such wrote about it already in...and I agree!" Don't believe me?
"Recent research on Google by Siva Vaidhyanathan...who has written one of the most important books on Google to date, demonstrates its dominance over the information landscape and forms the basis of a central theme in this research."
And here again,
"Frank Pasquale, a professor of law at the University of Maryland, has also forewarned of the increasing levels of control that algorithms have over the many decisions made about us, from credit to dating options...."
and again,
"The political-economic critique of Google by Elad Segev, a senior lecturer ... charges that we can no longer ignore the global dominance of Google and..."
and wait, there's more
"Molly Niesen at the University of Illinois has written extensively on the loss of public accountability by federal agencies such as the Federal Trade Commission (FTC), which is a major contribution..."

And here is this random jewel, just cause I can: "Ultimately, this book is designed to “make it plain,” as we say in the Black community..." Really?

Alright, I know some of you are going to say that it is ok to cite other people's work, but beyond her statements, no further exposition is offered. The whole thing is like this. I'm not sure I recall a single original argument.

What's worse, it takes Noble roughly half the book to end her long intro about her plan of attack. Thirty-seven pages in, she is still telling you:
"This work is addressing a gap in scholarship on how search works and what it biases, public trust in search, the relationship of search to information studies, and the ways in which African Americans, among others, are mediated and commodified in Google."
Can we get on with it?

OK, but let's say you've bought the argument that tech is decidedly out to get you, and appreciated the moral expounding. Surely the author is ready to drop some solutions "like it's hot" (as they say in the Black Community...) to this problem, right? Well, her solution for all this is a tad comical, if not heroically ironic:
“In my own imagination and in a project I am attempting to build, access to information on the web could be designed akin to a color picker tool or some other highly transparent interface, so that users could find nuanced shades of information and easily identify the borderlands between news and entertainment, entertainment and pornographers, or journalism and academic scholarship.”
Break out your crayons and stop your engineers Google, all you need is a color palette!

Ugh. If only the absence of color-blindness could be fixed with more color eh? Search results are not primarily the problem, lack of critical thinking skills is. Blocking misleading, inflammatory results for black-on-white crime cannot be the solution when there are people out there with racial anxieties worked to a frenzy that will keep looking until they find what matches their worldview.

If all this weren't sad enough, in a last-ditch effort to end on a strong note, she caps the book off with a piece about Yelp and its business model, but I thought we were talking about Google?

Cannot recommend.
Qurrat Ahmad
53 reviews · 5 followers
November 16, 2018
I LOVED the premise of this book and it definitely caused me to Google (ha) a lot of articles on the subject after. What I learned from the book is that algorithms are absolutely oppressive (not mind blowing but nice to see lots of examples to acknowledge this). What I didn’t learn was why, and what we can do about them. The author had all of the anger but none of the solutions (I don’t think sanctioning Google or forcing ethnic studies majors into tech is the answer). But I don’t know what is. Therefore, this could have been a much better article than a book.
Emily
687 reviews · 674 followers
May 23, 2018
This book is hard to evaluate because its author has certainly made a five-star contribution to our profession, but consuming this particular work, as a reader, left me wanting more. For Goodreads, I'll write this up as I reacted to it reading it on my sofa for my own interest, rather than as I would for a professional review. And in that light, I have to say that I'm more interested in technology and policy (which this is not about) than critical theory (which it is). Noble does a thorough job of situating her inquiry in Black feminist thought, but at every turn I found myself wanting her to pivot from what is happening to how.

The main point of the book is that Google results promote racist and misogynistic perspectives, especially on women of color, while being perceived as a neutral or authoritative source. The example she returns to repeatedly is of a search for "black girls," which until ca. 2012 returned porn sites or racist screeds instead of sites like "Black Girls Code" or "Black Girls Run."

The book treats Google as a sort of disembodied omnipresent leviathan, which is certainly true insofar as it manifests the id of millions of content creators while serving as a supranational monopolistic information resource. But it is also an organization with policies and practices, countable staff with ethics and motivations--and those could be interrogated. For example, we've learned (after the publication of this book, to be fair) about internal divisions over James Damore's memo and the company's participation in Project Maven. It is interesting to consider the interplay between the small number of Google staff and the vast number of non-employees whose content is hosted, amplified, or depressed by their systems. It is interesting that a group of humans came up with the tool Noble is critiquing and that they have been able to adjust it when challenged (i.e. they have some means of doing so) but have chosen to make changes as the result of narrowly identified problems instead of wholesale. That is, at least until the "Panda" release of 2011. That algorithmic change was billed as addressing content farms and SEO abusers but coincides with amelioration in some of the examples Noble illustrates. (It stands to reason, since porn sites were probably prolific SEO abusers.) It is also my understanding that search results are currently coming from layers or combinations of algorithms that are significantly more complex than the original citation-count system of Brin & Page--so thinking of it as a single algorithm is too simple. At any rate, Noble doesn't get into these questions, portraying Google instead as a black box. I do see how this makes sense for the type of analysis she is doing, but I also see it as far enough from the truth to not necessarily be a useful abstraction.

Searchers, too, have their own motives and information needs. Noble discusses the anti-Semitic results of a search for "jew" versus the results for "jewish people," which is how they are more likely to describe themselves. I never quite understood why someone with good intentions would search for only the term "black girls" without any other context--is this a search that someone would do about themselves, or a loved one? Meanwhile, given that Google is plainly involved in advertising, were porn sites the best-paying content for that search string? What I am wondering is whether the original, offensive search results are a successful manifestation of what Google is trying to do, which is to match up search intention with webpage in the way that makes Google the most money. I think Noble would agree that is what is going on, on a basic functional level. Then: Does Google have a social responsibility to work differently in certain circumstances? What are the parameters of those circumstances? And perhaps most importantly to me, can Google programmatically detect those circumstances instead of waiting on user backlash to inform them (and offending users in the meantime)? Or is there some other way to prevent offensive-results mechanisms from being deployed against content that we wish to protect (e.g. LGBT content for teens), in the style of Twitter-reporting wars?

Libraries willingly host material that is offensive (Mein Kampf being the classic example) and allow readers to come to their own conclusions, but certain barriers of respectability have historically been in play (like managing to get published). But that's no longer true, for example, when users are allowed to view pornography on public terminals. I'm not sure whether a critique of Google's offensive results is better stemming from the offensiveness per se or from a theory that users' information needs are not being met.

Can librarians hold Google to a stricter standard than we hold ourselves? I'm not sure Noble is asking for that. That's where I step away from my wonky complaints and get back to another of her central points, which is that Google's ubiquity makes it an opinion former as much as an opinion finder. She cites the example of Dylann Roof, whose searches on topics like "black on white crime" led him down a path of increasingly racist and violent content. The murders belong to him, but the content he found, shaped by Google's supposedly intention-free presentation, built his justifications. Google's level of influence goes far beyond libraries', and it's not clear that our approaches offer a solution that is sufficient either practically or morally. As our culture of misinformation reaches a crisis point, Noble is asking the big questions, such as whether we should seek to decouple search and advertising, break up Google, and/or replace Google with a public, noncommercial search option.

P.S. I recently read Everybody Lies, a book with a very different level of academic seriousness, but addressing related questions of what is revealed by searches people do in private.
Profile Image for Clare O'Beara.
Author23 books371 followers
June 25, 2018
Despite the author saying she intends this book as a practical project, her foreword has a first paragraph of 21 lines, and the introduction has a first paragraph of 23 lines; the first chapter includes a paragraph of 47 lines, and one of 37. The fact that I am studying journalism has doubly reinforced my belief that paragraphs that long won't get read by most people. I also think we could do without invented words like problematized.

Why make this kind of point about a book which describes race and gender bias in search engine algorithms? The author tells us Google said it was not responsible for what its algorithm did or how the results looked. Frank Pasquale tells us in Black Box Society that even coders don't entirely know how black box applications work. If something looks too complex, too dense a block, increasingly people, including students, think they can't get to grips with it. This is partly the twitter effect. Ask yourself if the Washington Post would present readers with a paragraph of 23 or 47 lines. Answer: not if it wanted them to keep reading.

I am not surprised that so many porn options came up for the author in her net searches. Porn does not come up in any of my searches. People started sending out mails randomly, advertising porn sites, which as we all should know, contain keyloggers, viruses and other malware from the first download. The stupid are being advertised to, and stupid, sexist terms are used. It's hard to legislate for stupid. Maybe ask search engines to require the term porn added to searches before porn will be produced. Google bombing and Gamergate are briefly referenced. I'm happy to have racists gather in their own little internet cave, where the FBI and other groups can watch them. Elect lawmakers who require a civil society. Meanwhile, do not put up with remarks in schools or workplaces. Run ad blockers and get a net nanny program on the family computer.

As for discriminating against black people by banking, discussed here, that would be banned where I live due to a law against discrimination on several grounds including race, family status, age, disability, gender etc. Thinking you're a victim of the process is easy. Proving it is harder. I may have been turned down for a mortgage by three banks due to being a single self employed woman. (Branches all said yes. Head offices said no.) But no doubt the head offices would say that all self employed people were being turned down that year. The joke is on them as I have now nearly finished my mortgage, paid some off early.
If you get enough such stories you can look at statistics, and big data is used by many firms now so it should be used by social justice campaigners too. The author admits that the EU has been more successful in enforcing anti discrimination laws than America. Economic redlining is mentioned in a quote from Cathy O'Neil, author of Weapons of Math Destruction.

Some interesting points are made, such as that France does not allow racial identity markers to be stored in databases. Screenshots are displayed, like Google's explanation for why searches for Jew or Jewish produce different results. Or for the black girls / Asian girls / white girls searches.
Even unprofessional hairstyles for work, which shows all black women, and professional hairstyles for work, which shows all white women. The author describes some of the recent history of media stereotyping and structural oppression for anyone not a white male; for instance, lack of access to venture capital.

As for white men opening fire on black worshippers; take the guns out of society. Police shootings; take the guns out of society. The Onion runs the same headline on every story of another US mass shooting. The author doesn't take this direction.

Another chapter looks at how a past in the porn industry can get women fired (we are not told about men) and about revenge porn. Also the EU legal 'right to be forgotten', which says we can have no longer relevant material removed. Not mentioned is that US freedom of speech is stronger than in the EU, so we have to wait for the metoo tweets from America; while a firm here exists solely to mail people in America informing them that they have been libelled on the net according to EU law and are eligible to sue. Swings and roundabouts.

Interested, I have just done a Google search for 'black girls'. No porn sites were offered, no distasteful terms, but the first several offers were dating sites. Next were Black Girls Code and Career Openings for black girls, followed by a running club. (These would have to admit boys and girls of all colours to comply with Irish law.) I suggest just typing in girls would produce similar results, with porn offerings in America, because that's where the rules make it easier for stupid people to get viruses from porn sites.

The later chapters look at net neutrality, broadband for everyone, opportunities for women with the net; terms like neoliberalism, neocolonialism trajectories, networked economy, abound. This is pretty dense as it mainly relates to running tech companies. Yet the author feels comfortable to say she has shined a light on... shined is not a word where I live. I would add that we can look at the history of computers and see where they were developed, who did the work, created the infrastructure and satellites. Whatever about being fair to everyone, be fair to the creators.
Mining in the Congo exploits people; so tell the Government of Congo to deal with it. Also, contact companies and ask them to make products with goods and labour that are fairly traded and environmentally responsible. They won't if you don't. (The environment is not mentioned that I noticed, though the REEs used to make tech goods are filthy to mine and refine.)

This book mainly relates to America and addresses Americans, but I am in many ways lucky to live in modern Ireland; where many women up until a few decades ago were enslaved for much of their lives by the church and forced to work in laundries without pay, freedom or social life to benefit the church, their babies taken and sold. They did not have to be black. They just had to be women. To women reading this book, I would say, read it, learn, and stop looking at makeup and clothes on the net. Stop using social sites that sell your data. Google makes an image of you related to your searches. An amusing anecdote from John Cheney-Lippold's book, We Are Data, not cited here, is that Google thinks a neuroscientist researcher who is a young woman, is actually an older man, because she spends all her time reading science articles written by older men. So I imagine they won't be advertising high heels to her. Give Google a more uplifting, intelligent image of women.

Notes and references P203 - 235 in my ARC. I counted three names which I could be sure were female, including Jezebel and Michelle Obama.
I see no reason for this paucity in presentation, as it's not good enough for an assistant professor in the department of information studies to say 'it's always been done this way' or 'that's how Zotero wrote it'. Credit women or they are invisible. Read the references in Naomi Klein's This Changes Everything. In We Are Data's references I counted 110 names that I could be sure were female.

I downloaded this book from Net Galley. This is an unbiased review of an ARC.
Profile Image for tiffany.
112 reviews14 followers
July 26, 2020
“we have automated human decision making and then disavowed our responsibility for it.”

this book really opened my eyes to how search engines like google search have become so ubiquitous in our daily lives, and yet the underlying algorithms that make choices about which results to filter, which to push to the forefront of the page, and which to exclude, are not at all impartial or apolitical. in fact, these algorithms reflect the underlying systemic racism and sexism prevalent in american society and perpetuate the spread of dangerous misinformation.

this book definitely highlighted critical examples of how algorithms perpetuate oppression, from yelp’s revenue-focused algorithm suppressing Black-owned small businesses to racial and gender stereotypes populating google search results to the spread of fake news sites that have cracked the code to game search engine algorithms. i found the author’s perspective as a Black feminist scholar particularly compelling when discussing issues of technology.

however, i did feel that the book got somewhat repetitive at times, especially in the middle, and i am still confused, as the call to action and the way forward are unclear. the author discusses a need for more salient public policy and a more focused critique of algorithms, but all in vague terms that are hard to actualize (though to be fair, the landscape of american politics and tech is so complex and polarized that any way forward seems murky in general).

overall, this book definitely made me much more critical of search engine algorithms, and i am frankly disturbed by how much i just took google search for granted as a fair, apolitical tool.

“Algorithms are, and will continue to be, contextually relevant and loaded with power.”
104 reviews
February 17, 2021
For a book with "Algorithms" in the title this didn't talk about them very much. I would say it was more about how control of information has been monopolized by tech companies, and how that control is exercised in unjust ways. In that way I feel like this book challenged the centrality of algorithms themselves, so the title seems kind of ironic. If the issue is "algorithms of oppression" that seems to invite a technosolutionism (just fix the algorithms) that I don't think Noble supports.

Overall this book was a bit muddled, but definitely had some interesting points.

One thing that was valuable was that Noble helped me think about the importance of diversity in tech in a new way. When you reduce everything to algorithms it can be hard to see why diversity matters because algorithms are "just math". For example, PageRank (the original Google algorithm) is a mathematical object that arises naturally from network theory. This means that saying "PageRank is racist" is like saying "addition is racist". So it seems like who the engineer is (e.g. their race) doesn't matter because they will ultimately derive the same algorithm, and algorithms are neutral.
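The "just math" framing the reviewer describes can be made concrete. A minimal sketch of the PageRank idea — iterating a linear operator over a link graph until the rank vector settles — might look like the following (the toy graph and function are illustrative, not taken from the book or from Google's actual implementation):

```python
def pagerank(links, damping=0.85, iterations=100):
    """Toy PageRank by power iteration.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict of page -> rank, summing to ~1.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets a baseline share; the rest flows along links.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: a links to b and c, b links to c, c links to a.
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
# "c" ends up ranked highest, since both "a" and "b" link to it.
```

The math here is indeed neutral in the sense the reviewer means; the bias questions enter through what gets linked, by whom, and what the ranking is used for.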

But imagine if Sergey Brin and Larry Page were black women: would it have taken them 15 years to notice that searching for "black girls" returns mostly porn? Maybe they would have come up with PageRank and then immediately said "oops, that doesn't work very well! Maybe we need some more manual controls, or we could try a different algorithm, or maybe a search engine isn't such a great idea at all?"

Tweet-sized thought: is the algorithm, as a "neutral" authority, the new scientific racism?
Profile Image for Andrew.
906 reviews
June 13, 2020
While this work is not an easy read, it covers an important issue that is not sufficiently being dealt with. We now rely heavily on search engines created by the largest technology corporations, and the fact that they are biased in ways that return sexist, racist, or fake-news-promoting results should give us pause. This book is a must-read.
Profile Image for Conor Ahern.
667 reviews212 followers
February 10, 2019
Sites like Google and Yelp have become so indispensable to our modern lives that it becomes easy to accept them as they are without interrogating their systemic flaws. As recently as this year, we saw people arguing in good (if benighted) faith that an algorithm . But Noble's experience and research showed that this is very demonstrably untrue: while searches for "white girls" bring up a sampling of mostly inoffensive images, the search for "black girls" brought up mostly porn until Google corrected the difference; typing in "This English major taught herself calculus" would draw Google's suggestion "Did you mean 'This English major taught himself calculus'?"; and suggested autotext for black women is much more racist and sexist than the equivalents for white men or women.

So much of the internet, functioning as it does as an aggregate, is assumed to be benign. But it doesn't take much critical thinking to understand that bigotry--being a thing that all societies share in varying degrees--will be instantiated by group behaviors, particularly when they are done anonymously, as is the case with most of our internet activity. Noble spends this book unpacking the ways in which we give the internet too much credit for neutrality, showing that inaction by sites like Google and Yelp has serious ramifications, and pointing out the ludicrousness of their claims that they are powerless to anticipate, prevent, or mitigate the harm that is caused by the reification of the bigotry that gets pumped out by their algorithms.

This book is short and has the tone of a doctoral thesis. It feels like a good start to this important topic, but one that needs to be expanded on in great breadth and depth.
Profile Image for laurel [the suspected bibliophile].
1,898 reviews692 followers
January 29, 2019
A must-read for library and archivist professionals, and those who are looking to work for Big Data driven companies or companies looking to use Big Data.

While I wish there had been more of a call to action on how to fix and engage, instead of repeating over and over what was wrong and that it needed to be fixed, this is a necessary read on how deeply systemic racism and oppression are built into society.

And there is the reminder that algorithms aren't infallible, because they are built by people and people have bias and prejudice hard-wired into them.

Also, the narrator is fantastic! I could listen to her voice all day.
Profile Image for Sharad Pandian.
425 reviews154 followers
November 12, 2021
Read charitably, this book documents certain issues that result from the algorithmic categorization and ranking carried out by search engines, and argues that these issues have to be understood against the background of historically and systemically problematic practices and representations.

The problem with the book, however, is that she's remarkably uninterested in the actual algorithms, meaning she keeps vaguely gesturing around the problem rather than engaging with its details. Ultimately, it ended up reading like a very long literature review for an analysis that never materialized.
Profile Image for dandelion.
283 reviews15 followers
Read
October 6, 2019
[This review is from one of my graduate classes so that's why it sounds different from my previous reviews. I just removed all the citations.]

In her book Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Umoja Noble describes the several ways commercial search engines perpetuate systemic oppression of women and people of color. Critical race theory (CRT) and Black Feminist Thought lay the foundation of Noble’s research. CRT examines the relationship between race and racism in society and analyzes how marginalized individuals challenge harmful societal perceptions. Black Feminist Thought investigates how racism and sexism combine to affect Black women and girls from an intersectional perspective, articulating their life experiences as a whole in a way that examining race and gender separately never could. Utilizing these approaches, Noble challenges the notion that search engines are objective and offers a new way of researching how intersectional identities are affected by systemic oppression, based on how information is classified and preserved by those with the power to do so.

The core of this book is Noble’s exploration of how the phrase “black girls”, when typed into the Google search engine, resulted in pornographic websites featuring Black women. It is a common misconception that search results are based on a person’s search history. Noble initially shared this belief, thinking her previous searches on Black feminist media would give her exactly what she desired; however, this was not the case. Instead, Google provided links leading to various pornographic websites, even though her original search included no terms related to the subject of sex. For what appeared to be an unknown reason, the search engine automatically associated the term “black girls” with pornographic images. In order for a search engine to associate one term with another in this fashion, webmasters utilize Google’s AdWords to connect phrases together that they believe to be the most popular choices among search engine users. The U.S. pornography industry has perfected this model of search engine optimization. Noble explains:

Many of these techniques include long-term strategies to co-opt particular terms and link them over time and in meaningful ways to pornographic content. Once these keywords are identified, then variations of these words, through what are called ‘long tail keywords,’ are created. This allows the industry to have users ‘self-select’ for a variety of fetishes or interests.

The hypersexualization of Black women has a long and dark history dating back to the era of transatlantic slavery. Several racial stereotypes regarding Black women born from slavery persist today, such as the Jezebel, portraying them as “sexually insatiable and gratuitous”. Besides the issue of reinforcing the image of Black women as hypersexual beings, commandeering keywords associated with Black women removes their ability to effectively share and control information about their community. Information communities exist based on the needs of a group, through creation and use of information. Noble sought information to share with members of her community, her daughter and friends. Unfortunately, companies such as those that operate in the pornography industry pay significant amounts of money in advertisement on Google to disrupt the effectiveness of information sharing among Black women.

Though a popular and trusted resource for information seeking, Google constructs its search results not based on what sources are credible to the user, but on what is most profitable to the company. Google remains the go-to search engine for people to use for information seeking. It has taken on the function of educator: it is common for people to say “just Google it” when someone has a query (Noble, 2018). Further, the search engine is granted authority on what is or is not credible based on the ranking of a web page: if it is among the first listed, it is believed to be the most trustworthy on principle.

Research has shown that in many cases users have “a (sometimes unjustified) belief in their ability to filter the good and valid information from the faulty, hence their tendency to undersearch to find the highest quality information available”. One can begin to understand the hazards of a trusted search engine such as Google retrieving harmful stereotypical images of Black women on its first page of results. Users may take what is given as an accurate portrayal of a racial group while others, Black women particularly, suffer the consequences of the misrepresentation of their identities.

Ultimately, Noble makes a strong case against the false premise that algorithms are unbiased and objective in the information they retrieve. She points out that users often put too much confidence in search engines, choosing to believe what is presented to be the best information. My biggest takeaway is that, as someone who advocates for people to educate themselves on topics focused on social justice, I need to do more than tell people to "Google it". Unintentionally, I expect people to just "get it" without taking the above faults into consideration.

I highly recommend this book. It's rich in content and gives you a different perspective on search engines as commercial products as opposed to accurate information retrieval systems.
Profile Image for Shomeret.
1,112 reviews249 followers
February 3, 2019
I subscribe to the list for the Progressive Librarians Guild, which recently established a group on Goodreads. They decided to select Algorithms of Oppression by Safiya Umoja Noble as their first book for discussion. I downloaded a copy for review from Net Galley last year. So I prioritized it and started reading it as soon as I could. I finished it toward the end of January, and this is my review.

The foundation of Noble's argument is her discussion of how search works. This was an eye opener for me. It shouldn't have been, but I hadn't examined the topic critically. Noble interrogates our assumptions about search. What is search engine optimization? It means that some have found ways to game the system. Advertising is also a factor. As users, most of us would say that we are willing to tolerate advertising in return for free services. Are there limits to this tolerance? What if the advertisers are offensive to users? What if they promote racist or sexist attitudes?

Algorithms of Oppression is a significant book. Information professionals and students in the field should definitely read it, but I think it's also illuminating to anyone who uses search engines.

For my complete review see
35 reviews4 followers
May 7, 2021
I wanted to like this book but I found it to be a bit frustrating to read. There isn't a lot of quantitative investigation in the book. The focus is more on analytical interrogation of algorithms and their impact, but this was highly abstract and I found it difficult to be convinced of any particular portion, or feel like I was gaining new perspective or insights. I certainly walk away still believing that algorithms have sociopolitical implications even when they claim supposed neutrality, something the book does hammer home.

The book ends on a phenomenal case study of the impact of Yelp on a black owned salon, which packed a lot of insight into a small amount of space. I wish more of the book had centered around things like that.
Profile Image for Lynne.
Author103 books221 followers
May 16, 2018
Painstakingly researched, accessibly written book about the structural racism that gets introduced into algorithms as they reproduce the structural racism of everyday life.

This should be required reading for anyone working in the information field.
Profile Image for Matt.
61 reviews1 follower
December 19, 2022
Extremely good information, but could definitely use an update for 2022. There's a lot that has happened since this book was published.