The main message of the book is that Facebook is a uniquely irresponsible social media company. In the big wide world of social media, we can assign a lot of blame to the corporate leaders for putting profits ahead of social responsibility. Other social media sites, such as Twitter, Reddit, and TikTok, have been used to spread lies, hate speech, and other malicious content around the world, but the authors single out Facebook as the worst player in the social media ecosystem.
Their main points are: First, the business imperative shared by the leaders of Facebook is to keep the company growing: more users across more countries combined with constantly growing advertising revenues. The company’s overall strategy and most of the critical platform policy decisions are all focused on growth, to the exclusion of any responsible blocks on potentially dangerous platform content.
Second, Facebook hoards information: about its users, its customers, and any data about its platform usage and policies. We have no idea how many people have politically radical or racist content delivered in their Facebook news feed, and there is a similar blackout on who receives targeted misleading advertising (particularly political advertising).
Third, the amount of content posted on Facebook is so massive that there is no way Facebook is able to police it worldwide. For example, they don't have enough language experts in many of the countries where Facebook operates to track down and block hate speech, especially within subcommunities that speak a minority language in that country. (This problem was documented in Myanmar - Facebook posts containing hate speech and fabricated misinformation against the Rohingya Muslim minority were rampant. “Facebook wanted to discourage people from bullying across the platform, they said, and they believed that the same set of tools they used to stop a high school senior from intimidating an incoming freshman could be used to stop Buddhist monks in Myanmar from spreading malicious conspiracy theories about Rohingya Muslims.” One problem: at the start of the crisis, Facebook had exactly one Burmese-speaking contractor on its worldwide team supporting the Myanmar operations - for a country where hundreds of languages are spoken.) Facebook might or might not have appropriate rules in place to prevent abuse of the platform - but it is clear that their ability to enforce any rules is totally inadequate to the task.
Fourth, Facebook has acquired other social media companies as part of its growth strategy, and after each acquisition, Facebook has consistently remolded the operating models of those platforms. Instagram and WhatsApp didn’t have advertising businesses as part of their operating models, but after acquiring them, Facebook leadership wanted to incorporate the two platforms in a way that would better “monetize” their operations. Of course, at the time of each merger, Facebook promised the founders of Instagram and WhatsApp that it would keep the apps independent, partly to secure regulatory approval and forestall any antitrust questions from US government agencies.
Fifth, as Facebook continues to refine its algorithms, its incentive is to keep people watching Facebook content, *not* to run a responsible social media platform. During the 2016 election campaign, Facebook changed its algorithms to weight the content of family and friends above all else - which had the unintended consequence of “deprioritizing content from accredited news sites like CNN and the Washington Post. Users no longer saw posts from news sites featured prominently on their News Feeds, but they continued to see the false and hyper-partisan news that their family members and friends were posting.” One interesting political question: “Does Facebook favor populists?” Both Modi in India and Duterte in the Philippines used Facebook prominently in their political campaigns, and Trump followed that pattern in 2016. In the final months of the 2016 US election campaign, Facebook was inundated with “vitriolic and polemical content that sought to divide Americans.”
Sixth, privacy concerns always took a back seat to growth. The famous Cambridge Analytica scandal of 2018 has, of course, been widely documented, highlighting the ability of Facebook customers to harvest Facebook data without the permission of users. The authors trace the dangerous privacy holes back to 2012, when Facebook staff alerted senior executives to vulnerabilities created by the implementation of a system called Open Graph (built to provide data to external app developers). The original security alerts were completely ignored. In fact, the source of the original security alert, Sandy Parakilas, explained in 2018 that in his sixteen months working at Facebook, he never saw “a single audit of a developer where the company inspected the developer’s data storage.” He said, “Facebook didn’t want to make the public aware of huge weaknesses in its data security.” In other words, we have no idea what other security and privacy issues are present in Facebook’s platforms.
Finally, the book documents case after case where Facebook’s senior leaders seem afraid to stand up to Mark Zuckerberg. It is clear that Zuckerberg’s total financial control (he holds a majority of the voting stock) means there is no hope of consistent, rational company policies. Everything is at the whim of one man. Lower-level rebellion gets sidelined (employees who disagree with policy decisions are reassigned or fired), and senior leadership is mostly very quiet publicly.
In the final chapter of the book, there is some reporting about Facebook’s new Oversight Board - an independent panel created by Facebook to “adjudicate the thorniest freedom of expression cases.” The authors point out that this external panel has just become another way for Facebook to abdicate responsibility. Facebook leaders asked the Oversight Board to rule on the ban of former president Trump, but that delegation allowed them to ignore pleas to ban certain autocratic world leaders from the platform (such as the presidents of Turkey and Venezuela).
The authors also explain the ongoing antitrust suits against Facebook, and the ongoing defense strategies used by Facebook: asserting that Facebook has plenty of competition, keeping employees as quiet as possible, spending a lot of money on Washington lobbyists, and resisting any attempts at government regulation of its business.
The bottom line for the authors: “One thing is certain. Even if the company undergoes a radical transformation in the coming years, that change is unlikely to come from within. The algorithm that serves as Facebook’s beating heart is too powerful and too lucrative. And the platform is built on a fundamental, possibly irreconcilable dichotomy: its purported mission to advance society by connecting people while also profiting off them. It is Facebook’s dilemma and its ugly truth.”