By an award-winning technology reporter for the Wall Street Journal, a behind-the-scenes look at the manipulative tactics Facebook used to grow its business, how it distorted the way we connect online, and the company insiders who found the courage to speak out.
Once the unrivaled titan of social media, Facebook held a singular place in culture and politics. Along with its sister platforms Instagram and WhatsApp, it was a daily destination for billions of users around the world. Inside and outside the company, Facebook extolled its products as bringing people closer together and giving them voice.
But in the wake of the 2016 election, even some of the company's own senior executives came to consider those claims pollyannaish and simplistic. As a succession of scandals rocked Facebook, they--and the world--had to ask whether the company could control, or even understand, its own platforms.
Facebook employees set to work in pursuit of answers. They discovered problems that ran far deeper than politics. Facebook was peddling and amplifying anger, looking the other way at human trafficking, enabling drug cartels and authoritarians, allowing VIP users to break the platform's supposedly inviolable rules. They even raised concerns about whether the product was safe for teens. Facebook was distorting behavior in ways no one inside or outside the company understood.
Enduring personal trauma and professional setbacks, employees successfully identified the root causes of Facebook's viral harms and drew up concrete plans to address them. But the costs of fixing the platform--often measured in tenths of a percent of user engagement--were higher than Facebook's leadership was willing to pay. With their work consistently delayed, watered down, or stifled, those who best understood Facebook's damaging effect on users were left with a choice: to keep silent or go against their employer.
Broken Code tells the story of these employees and their explosive discoveries. Expanding on "The Facebook Files," his blockbuster, award-winning series for the Wall Street Journal, reporter Jeff Horwitz lays out in sobering detail not just the architecture of Facebook's failures, but what the company knew (and often disregarded) about its societal impact. In 2021, the company would rebrand itself Meta, promoting a techno-utopian wonderland. But as Broken Code shows, the problems spawned around the globe by social media can't be resolved by strapping on a headset.
I just finished Jeff Horwitz's insightful book about critical years inside Facebook Inc., revealing internal research, debates, and the company's reluctance to address harmful systems. It is a heavy read - especially for someone who worked there for so long. I strongly recommend it.
I learned a lot about parts of the company that weren't visible to most employees. I left the company feeling like I was smelling a lot of smoke - and this book shows the fires that created that smoke. Especially at the end, when I raised my concerns to other leaders, I was admonished to "assume good intent," assured that I didn't know what I was talking about, and told that good people were working on it anyway and making good decisions. It turns out that simply wasn't true.
If you want to better understand how Facebook works - and especially where things stand with a sole decision-maker and a layer of execs who increasingly filter and shape the information that Mark sees - this book demonstrates it well. Jeff summarizes this painfully: "The broader picture that emerged was not that vile things were happening on Facebook - it was that Facebook knew."
I know some of these stories first hand - especially CrowdTangle and the work that Brandon Silverman was leading to champion greater transparency at the company. The work was ultimately compromised, favoring the creation of an opaque system to avoid public relations issues, rather than opting for greater transparency.
"You're told you're a wizard, that you will find the right answer, that the rest of the world just doesn't get it. I'd bought into that ever since I started in Silicon Valley, and when I looked back, I felt shame."
Reflecting on my time at Facebook, I realize now that the assurances I gave, based on the principle of 'assuming good intent,' were part of a broader narrative that sometimes hindered us from asking the necessary hard questions. This realization isn't about assigning blame, but about understanding how even well-intentioned beliefs can lead us astray. To those who shared their concerns with me then, know that your voices were important and have contributed significantly to my learning journey.
It is time for regulatory action. There is no accountability within the company. We learn in the book that despite not trusting the company, people still come back to use the apps.
It is time for our governing bodies to act. Taking action to better protect teens, creating better transparency through PATA (Platform Accountability and Transparency Act) and adding a duty of care provision to Section 230 are all steps that can help hold these largest platforms accountable.
Members of the public are encouraged to believe that Facebook and other technology companies are doing the best they possibly can to prevent or remove harmful content. This book shows unequivocally that that is simply not the truth. I am not a programmer and have only an intermediate understanding of how computers work, but this book is written in such a way that I could understand quite clearly how the different algorithms and moderation tools are used, as well as the decision-making that guarantees things will go on the way they are now. There is occasional offensive tech-bro language, which is only to be expected, I suppose. Very eye-opening read.
4 1/2 stars rounded up. Thoroughly reported, with a lot of interesting detail here. I came in with a good background on Facebook/Meta, but still got enough new information to stay hooked. I liked the latter part of the book best, when we get more on Horwitz’s reporting/publishing and on Frances Haugen.
That Facebook/Instagram et al. is a hellscape that is bad for anyone and everyone isn't exactly mindblowing news, but reading about how the philosophy that more content and engagement is law, and nothing else must come to pass, permeates every single facet of the business (and how Zuckerberg will find any possible way he can to pretend it is something more than just a business with eternal financial growth as its sole aim) is fascinating and terrifying in equal measure. Also, at one point FB accidentally deleted itself from the internet, and that was a much-needed bit of levity. p.s. unsurprised to learn that Nick Clegg is still spineless, boot-licking scum.
From Cambridge Analytica's collection of personal data, to Facebook's $5 billion FTC fine over the privacy failures that enabled psychological targeting in the 2016 presidential campaign, to Frances Haugen's 2021 Senate testimony, we have seen that the Facebook/Meta platform is rife with problems, scandals, and whistleblowers. The social impact of the platform is massive, multifaceted, and often infuriating, and the algorithm has been shown to spread misinformation, amplify hate speech, and worse. In Broken Code, Wall Street Journal reporter Jeff Horwitz picks up where his "The Facebook Files" series left off, showing not only the failures of Facebook but what the company knew and disregarded for the sake of growth and profit.
Facebook came on the scene in 2004 and quickly climbed to become a major media company with worldwide influence. It grew freestyle, with algorithms being developed and changed on the fly, while the platform filled with all kinds of bad actors. At one point it was stated that 50% of its content was toxic. Hate speech, revenge porn, and fake news made it a place where "intelligent" people advised their followers to reconsider using it. The number of members, the growth in value, and young leadership (Mark Zuckerberg) that loved the rise to fame came with a denied downside - subverted elections, brainwashing, and fueled genocide - which finally led to congressional investigations. The leaders were shocked to find out one day that the most transmitted content that day had been a photo of an anus. Not what they had desired, and it caused a pause; then COVID hit. Jeff Horwitz, a Wall Street Journal reporter, took on the task of examining the fast-moving media giant and undertook his investigative research. He gives a detailed explanation of the inner workings and the motivations that drove the company as it violated many of the principles with which it had begun. During the last few years, improvements have been made, but as the COVID crisis showed, medical advice came from non-medical personnel, with suggested cures ranging from Kool-Aid to bleach, and of 150,000 content postings, one-half came from a very small handful of anti-vaxxers. Social media is just that - one must limit trust to whoever has earned it socially.
Getting to see how some of these people were thinking, or at the very least how they presented their thinking to the people working on the products, is huge.
There are so many places in here where you see the decisions they made, and while some of them may only look bad in hindsight, there are more than enough that leave you wondering how the leaders could be so naive about what was happening on their platform.
This is a really good read and a good cautionary story for people who go into technology about thinking through how what you're creating can be abused. There are bad actors out there. There are people who will scam and swindle and abuse a public space like this to make money or just cause trouble. Refusing to acknowledge that and not taking active steps to address it is a huge misstep and a failure of leadership.
I hope at some point social media can become more of the public debate sphere it was dreamed to be, but until you step up and ACTIVELY WORK TOWARDS making it that, a few very loud and abusive voices are going to take control and game the system. The rage-bait is real, and until steps are taken to diminish it, Facebook, and most of the other social media platforms, are never going to be a healthy place to have a real conversation.
I was pretty excited to read this book and I was definitely not disappointed. I was a little worried that I wouldn't understand the code lingo but you don't have to be a techie or programmer to appreciate this book.
I really like how this book delves into some serious moral issues of our time. Should a social media company be responsible for the harmful content that is uploaded on its site? Should celebrities and famous people have more privilege than us common folk in cyberspace? Should a social media platform address its influence on serious issues like eating disorders?
I can tell that Jeff Horwitz invested a lot of time into this book and the evidence is compelling. The premise that all is well and great in the Metaverse is pretty disturbing. The fact that Facebook and Instagram know they have profound effects on society yet willingly ignore them doesn't sit right with me. I won't lie, I enjoy using Facebook and Instagram but I have noticed a lot of people have backed away from Facebook. I can certainly see why.
It seems that corporations never want to admit that they are wrong because most people are too proud to admit their faults. Facebook likes to think it's a utopia of some sort but if anything, this book proves that it's more of a dystopia.
This book is worth the read. At first, it won't seem like it's covering anything you don't already know (even if only intuitively) about Facebook’s business and operational practices. But then the documents and the statements and the research pummel the truth to the surface of how Facebook’s leadership continually endorsed growth of their platform over its integrity, even when focusing on its integrity would have led to more profits and growth-- and even when people were being manipulated, exploited, and killed across the globe. This book offers data and research to back up some of your hunches about the platform. I just wish it ended with some sort of note about where to go now with all of this information... it left me hanging (which I suppose might be the point, as Horwitz is a journalist)...
Highlights: Mindblowing 🤯 Nerdy (especially for journalists), but still easy to read 🤓 Techy 💻 Social awareness 👥
This has it all: a company way in over its head but so arrogant that it thought it could do it all. It’s a company that operated in countries where it did not employ a single person who could speak the local language. Facebook received thousands of complaints from young women who received gross pictures from men. Their solution? They made it more difficult to report problems so they could claim that complaints had decreased. Their own studies proved the damage Instagram inflicted on young people, and they hid it. It’s hard to say which is the biggest issue: a company that put growth above everything else, the staggering incompetence of its leadership, malfeasance, cultural imperialism, or a government willing to let them operate without restrictions or accountability.
It’s well written and reads like a thriller, but it is sadly all too real. It’s an important read.
This book is one of the exact reasons, among many, why I stopped giving my time to Meta and all of its subsidiaries... this and the ad-filled garbage newsfeed.
I still sadly have not deleted my account, because so many family members and friends use it, but once my experiment with Facebook Dating is over at the end of this year, I plan on staying logged out for good, so long as I don't need to change any settings for Facebook Messenger, the last and final Meta-owned property I will use and eventually get rid of as well.
I can't wait to be done with this company and their shady ways forever!
Interesting and discouraging information about the inside decisions made at Facebook. Many of those decisions chose profits over what would be best. The algorithms to keep people engaged were subject to hacking and not representative of what was really trending. The reading was a little dry, and I think it would have been helpful to have chapter titles. It was not always clear why a chapter ended, and there was a lot of bouncing back and forth in time.
Facebook sounds like Jurassic Park… so caught up with what could be done that it forgot to ask whether it should be done. And then, faced with its own culpability, it prioritizes growth as an inviolable criterion over ethical oversight.
This book is fascinating, but it's also very technical and dense. The real-world events covered in it also show off the worst of humanity. If you have ever wondered why Facebook is the way that it is, this book will tell you. I am glad that I read it, but I don't see myself reading it again anytime soon.
I've been fascinated by Meta for a while now and followed the Facebook Files quite closely when they came out. I've therefore been long awaiting this book, and I've thoroughly enjoyed it. I think it did a great job highlighting how much of a complex system a huge tech behemoth like Meta is. Working in a tech company myself, you often deal with unintended consequences or second- and third-order effects of changes that you didn't anticipate. For instance, in Meta's case, the change to meaningful social interactions (MSI) was one of those where a change in which metric to optimize for led to more and more extreme and provocative content being rewarded on the platform.
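To make that mechanic concrete, here is a minimal Python sketch of metric-driven ranking. The posts, the signal weights, and the assumption that MSI-style scoring prized comments and reshares over likes are illustrative only, not Meta's actual formula:

```python
# Minimal sketch of metric-driven ranking. All posts and weights are invented.

POSTS = [
    {"title": "calm news recap", "likes": 900, "comments": 40, "reshares": 20},
    {"title": "outrage bait", "likes": 300, "comments": 250, "reshares": 180},
]

def score(post, weights):
    """Weighted sum of a post's engagement counts."""
    return sum(w * post[signal] for signal, w in weights.items())

like_centric = {"likes": 1, "comments": 1, "reshares": 1}
msi_style = {"likes": 1, "comments": 15, "reshares": 30}  # hypothetical MSI-like weights

for name, weights in [("like-centric", like_centric), ("MSI-style", msi_style)]:
    ranked = sorted(POSTS, key=lambda p: score(p, weights), reverse=True)
    print(name, "->", [p["title"] for p in ranked])
```

Under equal weights the calmer post ranks first; under reshare-heavy weights the provocative post takes over, which is exactly the dynamic described above.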
In addition, I liked the content regarding Facebook's role on its own platform and the inherent conflict there. On the one hand, Facebook wanted to be a neutral party, letting the platform regulate itself. On the other hand, you could say that Facebook might have some notion of an editorial responsibility given the huge amounts of content produced and distributed on its platform. It's around these themes as well that Facebook's role in politics came to the fore. Republicans continued to accuse Facebook of liberal bias and of downranking conservative content. Given what type of content was rewarded by Facebook's algorithm, it might have been the opposite, as Facebook tried to maximize for engagement.
Lastly, I thought the notion of growth over everything was thought-provoking. Facebook wanted to prevent any engagement issues and would therefore, time and time again, pick the growth narrative over the integrity narrative. Eventually this led to more and more extreme content dominating the platform. Here it's also worth mentioning that it's extremely hard to measure the long-term effects of many changes that you make. Often you run two-week experiments and decide to roll something out based on the metrics observed in that period and the difference between control and variant. However, many effects (e.g. integrity effects) may be slower-moving and as such do not get sufficiently taken into account.
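A toy Python model shows why a two-week readout can green-light a harmful change. The metric names, numbers, and the linear trust-erosion model are all invented assumptions, purely for illustration:

```python
# Toy rollout decision based on a short experiment window. All numbers invented:
# the variant lifts engagement immediately but erodes a slow-moving integrity
# metric (here, "trust") a little each day.

def metrics(day, variant):
    engagement = 100 + (8 if variant else 0)    # instant, easy-to-measure lift
    trust = 50 - (0.2 * day if variant else 0)  # slow erosion, invisible early on
    return engagement, trust

for horizon in (14, 180):  # a two-week readout vs. a six-month one
    eng_v, trust_v = metrics(horizon, variant=True)
    eng_c, trust_c = metrics(horizon, variant=False)
    print(f"day {horizon}: engagement {eng_v - eng_c:+}, trust {trust_v - trust_c:+.1f}")
```

At day 14 the variant looks like a pure win (+8 engagement, barely any trust movement); by day 180 the trust cost dwarfs the lift, but by then the change has long since shipped.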
Ultimately, I thought this was a great book, laying out very well the intricacies of a huge tech company and the philosophical debates that took place. It also highlighted Facebook's priority calls and how, time and again, it promoted growth over user safety and over addressing other negative societal effects. The structure of the book did suffer at times, however, as chapters could be repetitive or duplicative. I've put a summary below of some notes that I took on the various chapters.
H1:
- Company's official code of conduct: assume good intent.
- A team had redesigned Facebook's reporting system with the specific goal of reducing the number of completed user (moderation) reports -> integrity work.
- Jeff started his reporting by talking to many insiders, among them Frances Haugen.
H2:
- Facebook was supposed to be a neutral platform, and that meant offering assistance to any major political party that wanted it.
- FB published research in Nature showing it could boost election turnout on a mass scale -> "I voted" stickers.
- Initial signs that something was off: the Philippines, where Rodrigo Duterte's campaign had thrived.
- Facebook offered both Democrats and Republicans the help of a dedicated staffer to help target Facebook ads. The Clinton people turned down the company's offer.
- Questions were raised after the Trump win: Did Facebook turn a blind eye to organized hate efforts on the platform? Did the company have an editorial responsibility to ensure factuality that it hadn't recognized? Were users trapped in filter bubbles?
- Zuckerberg said that fake news wasn't a problem, but didn't have the data to back it up. BuzzFeed, however, showed that partisan hoaxes were trouncing real news in Facebook's news feed. This brought up so many questions: Why had fake news started posting huge engagement numbers? Did its success reflect users' preferences, manipulation, or some flaw in Facebook's design? And was the problem worse on Facebook than on the rest of the internet?
H3:
- Network effects would make social media a winner-takes-all game, one in which rival platforms were both a threat and a hindrance to the free flow of information. Whenever a new social media mechanic emerged, the company would have to scramble to either copy it or acquire it. One way of looking at things is that they were really just buying time.
- Facebook's inspirational slogan: if we don't create the thing that kills Facebook, someone else will. And with every hot new competing product feature sparking an existential crisis, Facebook was bound to become a bit more of a Frankenstein hastily sewn together from rival platform parts (reshares from Twitter, Stories from Snapchat, group video chats a la Houseparty).
- As incentive structures were set up so that shipping features was rewarded and everything was about "done is better than perfect," quality control was an afterthought.
- Executives also thought that the platform would largely regulate itself, and wanted Facebook to remain neutral. Facebook's goodness was often used to give Facebook's quest for growth the sheen of moral imperative. Growth led to all sorts of questionable practices, such as importing people's contact lists.
- News Feed was built internally, which gave a huge boost to engagement. This led to investments in ranking and ultimately AI/ML, with advertising as Facebook's beachhead into that foray. The speed, breadth, and scale of Facebook's adoption of ML came at the cost of comprehensibility.
- Also, as the platform matured and its user base grew, Facebook scrambled to deploy the right metrics for success. From the start, DAU was prioritized. Later the focus was on sessions, time spent, and metrics around the consumption and production of content. After years of focus on some of these metrics, some of them (e.g. friending) were growing increasingly unequal.
- In 2008, Facebook jumped headfirst into international expansion. This push came just as Sandberg was trying to impose cost controls, and headcount freezes were ordered at, for instance, com ops.
- Just before the 2016 elections, articles came out claiming Facebook was suppressing conservative news. In reality, the company might have been engaging in very light-touch human curation. The response was to lay off the entire Trending Topics team under the guise that anything with a human in the loop had to be gotten rid of. This also led to the disabling of Facebook's defenses against hoaxes, which was part of the reason that fake news surged in the fall of 2016.
H4:
- In the wake of the 2016 election, people both inside and outside Facebook were asking a lot of questions that would have seemed implausible before. Russians had spent $100k on Facebook election ads, some of them page-like ads used to buy their way to mass scale and serve as an on-ramp to misinformation.
- Cambridge Analytica broke in Q1 2018, revealing that Facebook had let outside developers access data from the profiles of Facebook users who used the developers' products. But apparently Facebook had long provided major advertisers with similar data to target users. Moreover, the data that CA had was two years old, a lifetime for ad-targeting purposes.
- Around the same time, Facebook was working on content recommendation systems. This led to rethinks of what Facebook was measuring and how it defined engagement. Originally it was based on likes; later FB refocused on time spent, but this led to more consumption of video and decreased overall content consumption. From then on, FB wanted to be on the lookout for 'one-way doors'.
- During this rethinking, it came up that Facebook's most intense users might not be good for the platform. A class of FB power users could be trolls, but many also favored edgier content and were more prone to partisanship. FB did not limit the influence any individual user could have on recommendations, which was bizarre. This led to a proposal that would limit the influence of hyperactive users. It would hit far-right and far-left outlets hard and boost the distribution of mainstream news publishers.
- This approach was called 'Sparing Sharing' (see the sketch after these notes). But there was plenty not to like, according to the public policy team, as conservative publishers were already accusing FB of being biased against them. Changing the system now would upset an entire industry of FB-native publishers. What was FB's responsibility to the constituencies of the online society it had shaped?
- In reshaping their recommendations, the team responsible had run up against a question of ideology. FB had been built to produce maximum engagement. The idea that the platform needed to be protected against the excesses of its most enthusiastic users wasn't welcome.
- In the wake of the Cambridge Analytica scandal, FB's user trust measure (CAU: cares about you) was tanking. At the same time, a lot of other scandals got jumbled together. Conservatives especially continued to accuse Facebook - which had 87% of its political donations go to Democrats - of anti-conservative bias. This eventually led to Kaplan, a Republican lobbyist, being put in charge of adjudicating rules and opining on their mechanics.
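Here is a minimal Python sketch of the general idea behind 'Sparing Sharing' as I understand it from the book: damp each user's repeated contributions so hyperactive accounts can't dominate distribution. The log-dampening rule and the data are my own illustrative assumptions, not the proposal's actual math:

```python
# Sketch of capping hyperactive users' influence: instead of every reshare
# counting equally, each user's repeated contributions get diminishing returns.
# The dampening rule (log1p per user) and the data are invented for illustration.

import math
from collections import Counter

reshares = ["power_user"] * 500 + ["casual_1", "casual_2", "casual_3"]

raw_score = len(reshares)  # every reshare counts equally: 503

per_user = Counter(reshares)
damped_score = sum(math.log1p(count) for count in per_user.values())

print(raw_score, round(damped_score, 1))  # 503 vs. roughly 8.3
```

One hyperactive account goes from supplying 99% of the raw score to counting only a few times more than a casual user, which is why such a change would have hit hyper-partisan outlets hard.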
H5:
- Early on, FB would evaluate goal metrics by looking at averages. Similarly, it was only really looking at topline metrics, and no one was looking under the hood.
- FB didn't want to manually review every post a classifier flagged as offensive, and an intermediate solution was downranking (a rough sketch of the idea follows these notes).
- There was an ongoing debate about whether FB was responsible for what content it recommended or whether it was a neutral platform providing users with personalized content. Maximalists didn't think FB should enforce against all potentially unwholesome content, and minimalists didn't think FB needed to honor, for instance, racist memes.
- FB established the 'broad trust' metric, which worked at fighting sensationalistic content. Behind the scenes, however, the downranking again became watered down.
- In 2018, FB changed the goals given to product teams and had them focus on a new metric: meaningful social interactions (MSI). Under this metric, resharing became much more valuable than liking, as did commenting. The concern around this metric, though, was that it was going to make people fight.
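The downranking idea in these notes can be sketched in a few lines of Python. The threshold and the demotion curve below are invented for illustration and are not Facebook's actual implementation:

```python
# Sketch of downranking as a middle path between ignoring flagged content and
# removing it: a classifier's confidence scales the post's ranking score down
# instead of triggering manual review or deletion. Numbers are invented.

def adjusted_score(base_score: float, p_flagged: float) -> float:
    if p_flagged < 0.5:                 # low confidence: leave the score alone
        return base_score
    # Linearly scale the demotion from none at p=0.5 up to 90% at p=1.0.
    demotion = 0.9 * (p_flagged - 0.5) / 0.5
    return base_score * (1.0 - demotion)

for p in (0.3, 0.6, 0.9):
    print(f"p={p}: score {adjusted_score(1000, p):.0f}")  # 1000, 820, 280
```

The appeal is obvious: no human review is needed. So is the weakness: the demotion factors are just dials, and the notes above describe them being quietly watered down.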
H6:
- Many publishers were exploiting communities on Facebook, funneling their audiences to content farms and collecting payouts from Facebook revenue-sharing programs.
- With the change to meaningful social interactions, Facebook started to reward more and more clickbait and junky content, incentivizing publishers to go down that path.
Facebook's algorithm has become opaque, and the engineers no longer understand their own toy. But the signs keep piling up that something is terribly wrong in there. Yes, something has broken.
In this day and age, many, if not all, of us struggle with some form of addiction to our phones. Most of that time is spent on various social media apps, where we find ourselves at the mercy of an algorithm, being fed targeted ads and streamlined content directed at us, all run by some supercomputer. Our moods, interests, and opinions are formulated through short-form content, which is a dangerous concept. Facebook, now under the Meta umbrella, has unbridled influence across generations, and after reading the book, I can't help but feel worried about the ethics of the company.
Jeff Horwitz's book, "Broken Code," carefully breaks down the complicated workings of Facebook. It gives us deep insights into how the company knew its platform could be harmful and tended to ignore important issues in order to profit and grow. Throughout the story, Horwitz describes instances where Facebook's systems for moderating content were misused and discusses how Facebook played a big role in shaping politics. The book takes a close look at the January 6th riot and shows how Facebook helped create an environment where political trouble brewed and eventually caused irreversible harm. Through stories and numerous examples, the author stresses the responsibility these platforms can have in shaping how people talk and think about politics, as seen in numerous events over the last 10 years, including the 2016 and 2020 elections. He also talks about Facebook's questionable dealings with powerful leaders and its ongoing struggle with its own infrastructure, which prevented it from growing. The book delves into the internal dynamics of Facebook, but it doesn't seem to provide explicit conclusions or actionable suggestions to address the platform's problems. Instead, Jeff states the facts and his experiences and leaves the interpretation up to the reader. This doesn't diminish the significance of the facts presented, and I found I preferred it: I was able to form my own opinion and comprehend the situation. The book illuminates the potential societal harm stemming from unchecked growth and algorithmic power, emphasizing the profound responsibility these platforms bear in shaping public discourse.
I would choose the adjectives 'alarming' and 'illuminating' to describe the book because they capture the dual nature of my experience with it. The terms have some similarities, but one represents a feeling of worry, and the other more of an acknowledgement of how much I learned, not necessarily negative. The term 'alarming' reflects my reaction to the discoveries the book presented in unravelling the inner workings of Facebook. The discussion of content moderation, algorithmic manipulation, and the platform's role in shaping political landscapes was worrying, given my susceptibility to those things as someone who spends a significant amount of time on Meta. The other term I chose, 'illuminating', signifies the learning aspect of my discovery, separate from the feelings of nervousness and worry I harboured. The book shed light on how a big tech firm like Facebook operates, something of which I previously had little knowledge. It was an enlightening experience, making me acknowledge the complexity and depth of the digital world I navigate daily, which I know so little about.
I found that understanding the complicated technical jargon was a particularly challenging aspect of reading the book; I often felt lost, given my previous lack of knowledge in the field. However, it wasn't too pressing an issue. I appreciated that in the passages where things would get confusing, Horwitz always made an effort to explain in the simplest terms for the reader to comprehend.
An implication from the book is that the unbridled growth and influence of tech companies like Facebook pose significant risks to society. The book serves as a wake-up call, showcasing how decisions made over coffee and donuts can have an enormous effect on us, and how we are influenced every day. We have seen previously, in the Cambridge Analytica scandal, the lengths companies are willing to go to maximize profit and growth, and we have to be vigilant so as not to have our privacy violated. In a new age of technological advancements and quickly developing Artificial Intelligence, I find myself a little worried about how vulnerable I may be to manipulation in my life, including how my content is sorted, how vulnerable my information is, and how safe I am from cyber hacks. However, I believe that if a precedent is set by holding companies accountable for actions that violate the public's privacy, we can have a much brighter future as a global population.
I would recommend this book to anyone interested in understanding the internal workings behind these big tech firms and the impact their decisions can have on our society. It provides valuable insights into the challenges posed by platforms like Facebook and encourages readers to critically examine the role of technology in shaping our digital landscape. This book is great for anyone, but having a little background knowledge of algorithms and other aspects of the technology behind social media would certainly make the reading easier to follow.
The book taught me the intricate ways in which social media platforms can influence political events and public discourse. It showcased the need for increased scrutiny and regulation to ensure these platforms prioritize ethical considerations over unchecked growth. To end, I believe this quote represents the significance of Jeff's discoveries.
"Efforts to engineer growth had inadvertently rewarded political zealotry... And the company knew far more about the negative effects of social media usage than it let on."
In the book Broken Code, Jeff Horwitz explores the various dangers and harmful secrets hidden by Facebook from its beginning to now. The book explores themes such as human trafficking and drug cartels, teen mental health, suicide, political interference and backlash, the spread of misinformation, favouritism towards high-profile individuals or ‘VIPs’, and the internal struggles of combatting these issues with limited resources. Each section focuses on a different aspect of Facebook: one section covers the aftermath of the 2016 United States election, the 2020 United States election, fake news, Donald Trump, ‘stop the steal’, and foreign hyperactive users. There is a section on mental health, a section on human trafficking and drug cartels, a section on AI and recommendation algorithms, and multiple sections on the Civic Integrity Team.

If I had to use two adjectives to describe this book as a whole, I would use ‘shocking’ and ‘mind-blowing’. The fact that there was so little care put into maintaining quality and fairness by the executives of Facebook, and the fact that they only cared once public opinion of them went sour, is mind-blowing to me.

I learned a lot about the failings of AI and algorithms. Because we humans are the ones who configure and apply the values, these algorithms still carry human error and are thus still capable of failure. This is evident in the many incidents throughout the book. One issue highlighted in the book was that the algorithm team pushed their algorithms out to the main branch with very little testing and used untested numbers for weighting interactions. This led to disproportionate recommendations that valued hyperactive bot accounts more than real people, which in turn led to weird and sketchy posts being displayed more often to regular people. These algorithms, and the attempts to manipulate recommendations, are a constant throughout the book in many different forms.

I did find the book slightly hard to read. It was not difficult to the point of being unreadable, but I do think the number of technical terms and dates can be a bit off-putting to some. I personally am fine with it, but it does make the book harder to read. I also understand that it is necessary for the book to explore everything Facebook has done, because a lot of the issues are technical.

An implication of what happened with Facebook is that this could be, and most likely is, happening in all the major social media companies without our knowledge. We just do not know and will not be able to know without insider access. I recommend this to anyone who likes to understand the inner workings of huge corporations.

My favourite quote from the book is “The story of Facebook’s integrity work is, in many respects, the story of losses” (308). It is my favourite quote because of how it summarizes the entire book in one sentence. The Civic Integrity Team faced challenge after challenge and were hindered by the executives and Mark Zuckerberg every step of the way.
It's no surprise that Facebook has knowingly and repeatedly done things in the name of driving growth, profits, and engagement that are damaging to people. This includes emotional damage to teenagers, promoting anger and polarization, turning a deaf ear to election tampering, permitting the proliferation of hate speech and misinformation, and, in more than one country outside the United States, actually causing deaths by allowing people to use Facebook to plot and execute campaigns of murder and oppression. It would be shocking if it were not old news. This book shows how, in case after case, when Facebook has been presented with a choice between a strategy that would promote calmer, happier, more constructive use of the platform and more engagement, the company has always chosen more engagement. Every move in the opposite direction has been a lame, watered-down choice that was more about PR than combatting the huge and obvious problems. This has been going on for years, and it gets worse and worse. Maybe it isn't the result of evil intent by bad actors, but following a capitalist imperative to maximize profits or hiding behind a claim of defending free speech when it is convenient to do so cannot excuse what has happened.
Notwithstanding the parade of horribles presented by Mr. Horwitz, he remains optimistic that the problems can be fixed. I have my doubts. First of all, the company needs to be broken up and forced to be interoperable with other platforms. Facebook's monopoly position allows it to disregard much of the damage that it does. It seems to me that there is clear consumer harm directly related to the company's monopoly position, so that even if it isn't price harm, Facebook is in violation of the Sherman Act even under the prevailing consumer harm standard. Second, there have to be limits on free speech, and they have to be consistently and fairly enforced. I'm not talking about curbing defamation or not shouting fire in a crowded theater. It's more than that. This is something new and different and even more compelling. In a case where a monopolistic company actively promotes forms of speech that tear at the fabric of society, it has to be curbed. If you want to be a Federalist Society originalist in interpreting the Constitution, I don't think that it is much of a stretch to say that Hamilton and Madison and Jefferson would have been appalled by Facebook and would not have hesitated for a second in saying that it needs to change its ways or face consequences.
Horwitz provides an illuminating history of how Facebook got to be so toxic, the engineers who devised strategies to counteract it, and the executives who prioritized corporate interests over harm prevention. For people who followed the Facebook Files and the congressional hearings very closely, there may not be a whole lot of new material here. However, as I had severely limited my news intake during that time for mental health reasons, I missed most of the coverage. This book clarified the why and how behind the Facebook scandals and filled in the details. Broken Code is also useful because it discusses Facebook's scandals and missteps in (mostly) chronological order and puts those events in context. Even people who have already read all the news coverage may benefit from this insight.
The saddest thing for me was to learn that Facebook has known all along how dangerous its platform is and has had the tools to stop it from breeding such viral vitriol. Before reading this book, I had been under the impression that we still didn't have the knowledge or tools to address the problem of social media. But no - we do. It is just a bunch of selfish executives in denial that stand between us and fixing the issue. While it is certainly not the only reason, I directly blame Facebook for contributing to world instability and the vicious national discourse currently going on, and it makes me very angry. I can only hope that we will be able to course-correct in time to stop as much harm as possible. Sadly, I am a little too cynical to believe it will actually happen.
I've been a Facebook junkie for literal years. I've fallen into the dopaminergic scrolling trip more times than I care to admit. Now that I know how the system works, I see that more than likely, I was consuming content by foreign actors and having arguments with international troll farms. That Facebook does not consider algorithmically-fueled election denial a risk to democracy anywhere in the world is enough of a jolt to boot my butt off of these platforms. That Zuckerberg and his lieutenants chose to use the company's massive resources to cover their tracks instead of cleaning up the dumpster fire they made of their platform gives me the same level of disgust for them that I have for the harmful garbage they let loose on the world.
I really encourage every person that consumes social media to read this. It is fast-paced and clicks along like a spy novel.
Broken Code by Jeff Horwitz pulls back the curtain on the inner workings of Facebook. Horwitz details how decisions made by Facebook's leadership affected real-world politics, communities, and misinformation.
It's interesting to see how many of the decisions at Facebook are made against the backdrop of company politics. For example, the ad sales department is continually working against the policy and moderation departments: Sales' goal is to increase views on ads to increase profit, while the policy department's goal is to decrease the chances of misinformation and violence being spread through the platform's algorithm. As you can guess, the company chooses profit over people more often than not.
My main criticism of this book is that it reads like a history textbook: "this person did this, this person did that," and violence ensued. The way Horwitz presented the information left out the human impact of these decisions and left me, the reader, feeling cut off from the consequences of Facebook's choices.
I recommend this book if you are looking to better understand Facebook's history and how social media algorithms can have an oversized impact on all aspects of current life.
This was an expansive book about the state of the "move fast and break things" culture and the inherent structural issues of Facebook. I thoroughly enjoyed reading this, and the information is consistent with other books I have read, matching discussions of Facebook's role in teen body-image issues and polarization in The Anxious Generation by Jonathan Haidt, and of the role of Facebook's meaningful social interactions metric in fanning the conditions that led to the Rohingya genocide and anti-Muslim attacks in India.
What really gets me is that this book points to several network-based, bias-neutral fixes to the algorithm that would restrict the rampant spread of misinformation. All of them were shot down in the name of quarterlies and user-base growth. Given Mark Zuckerberg's consistent prioritization of growth, reluctance to limit bots, and current rightward trajectory, I doubt we will get any fixes to the features that drove the stark polarization we see today.
At some moments I would have rated it four stars, at others only one or two, so three is nicely in the middle. The storyline is compelling, leaving you with astonishment at the view of so many people entering Facebook to do good but failing every time, all of them reluctantly admitting that it's worse than they had anticipated. For the message this book brings, a full five stars. However, the fuzzy writing leaves you wondering at the end of every chapter what its message was, introducing yet another player with no clue why. The vague timeline progressing through the book is abandoned in every chapter to go back in time. I found myself losing my grip on what I was reading, how I got there, and where in the story we were. It could have been written more clearly and in a more composed way.
Fascinating investigative journalism based upon a wealth of insider documents and interviews with current and former Facebook staffers which describes the many ways that the platform has been exploited, misused, and undermines user experience. It shows how warring factions within Facebook were able to squelch integrity measures. The inescapable subtext throughout this book is that not even Facebook understands its own product and vulnerabilities.
It also exposes a fundamental flaw of Mark Zuckerberg, namely that more connections/interactions are better and that human nature is inherently good. The scandals that Facebook has endured and will continue to endure show that neither is an absolute.
"Everyone" knows how to use Facebook, right? Perhaps, but this book focuses on the behind the scenes workings of various utilities. Not all that easy to understand. Or, maybe it is the stunning ability of outsiders to leverage Facebook to spew their really crappy goals. Can Facebook stop this garbage? Maybe, but there was an ultimate measure of success handed down by senior management. The measure was users. The mandate was to do nothing that would impact user counts. After all...more users generated more ads, which was the ultimate measure of success. This led to really harmful information populating Facebook.
This book confirmed my bias that Facebook (and Instagram) is toxic, that they are aware of their toxicity, and that they don't care. They are completely aware that people with nefarious intents use the platform to spread propaganda that ultimately affects world events: inciting riots, influencing American politics and even elections, hate speech against marginalized communities worldwide, spreading fake news to provoke emotional responses, and depression and anxiety among many users, especially middle-school-aged girls... the list goes on. Facebook's leadership is gross, and to the whistleblower who released "the Facebook papers" on her way out the door: we thank you.
Reading this book was a mistake. I'm now convinced that the world is 100% on fire and nothing will help.
However, the book was informative and well written so it's not the fault of the book that the world sucks.
I'm on Instagram but not Facebook, and the algorithm only sends me stories about which wedding dress someone should choose and about people who have suffered catastrophic injury or the unexpected death of a spouse or the untimely death of an infant born with debilitating health conditions, so no, I don't get poor self image from Instagram but only confusion as to who the algorithm thinks I am.