‘An Ugly Truth’ Lays Bare Facebook’s Murky Business Practices
In their ground-breaking book, An Ugly Truth: Inside Facebook’s Battle for Domination (The Bridge Street Press, an imprint of Little, Brown Book Group, 2021), award-winning journalists Sheera Frenkel and Cecilia Kang lay bare all that Facebook has done to dominate the social networking space.
Based on “more than a thousand hours of interviews with more than four hundred people”, including employees, advisers and investors and “never-reported emails, memos, and white papers involving or approved by top executives,” this book demonstrates how Facebook keeps its agendas and algorithms hidden.
Facebook CEO Mark Zuckerberg declined to be interviewed by Frenkel and Kang, and COO Sheryl Sandberg “cut off direct communication” as well. That is unbecoming of a company whose leaders claim to be committed to bringing people “together”.
The making of Fakebook
During his run for the United States presidency, Donald Trump called for a “total and complete shutdown of Muslims entering the US.” Frenkel and Kang note that the post got “more than 100,000 ‘likes’ and was shared 14,000 times.” Parmy Olson, a former Forbes staff member, has likewise argued that Facebook helped Trump become president in more than one way. Zuckerberg, Olson notes, denied the claim, saying the American people “made decisions based on their ‘lived experiences,’ not false news they read online.”
It is worth recalling what Zuckerberg wrote in 2015, informing his fans and staff: “In recent campaigns around the world—from India and Indonesia across Europe to the United States—we’ve seen the candidate with the largest following on Facebook usually wins.”
Frenkel and Kang find in their investigation that top Facebook executives were divided on the proposal to pull down Trump’s post. Facebook’s hesitation and defence can be best represented by what its ace developer and the vice president of augmented and virtual reality, Andrew Bosworth “Boz”, said, “We connect people. Period.”
In an internal memo titled ‘The Ugly’, Boz wrote, “Maybe it costs someone a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect people more often is de facto good.”
The ‘de facto good’ Boz talks about had—and continues to have—severe consequences. Facebook’s role in inciting violence in Myanmar is not hidden from anyone; the firm has admitted it was too slow to act there. Over the years, the platform has not only become a breeding ground for manufacturing and spreading fake news but has also harmed its users’ mental health. The Wall Street Journal, in a series of articles, documented that “Facebook knew harm of its services, including teenage girls saying Instagram made them feel worse about themselves.” The New York Times notes that these articles are based on a “trove of Facebook documents” leaked by a whistle-blower.
How does Facebook make money?
Frenkel and Kang critically and minutely cover the coming together of Zuckerberg and Sandberg, their shared vision of dominance, and the journey Facebook has traversed in its 17 years. In light of their claims and findings, it is necessary to bring things into perspective.
Zuckerberg ridicules allegations that Facebook’s users buy into fake news to make voting decisions. He seems convinced that Facebook’s monumental innovation ‘News Feed’—Frenkel and Kang dissect its conception and roll-out exceptionally well—does not play any tricks to make sensational and violence-inciting news go viral. Then why do he and his top leadership not come clean on the algorithm that selects the ‘news’ and ‘updates’ floated on users’ News Feeds, the authors ask. Why does Facebook never share its processes, or their intended outcomes?
Facebook, whose ad revenue is greater than that of all American newspapers combined, “has never answered how many content moderators they have onboard,” the writers observe. Olson shares the sentiment, writing that people have been demanding “Facebook to take a greater responsibility for its growing role as an ‘editor’ of web content. They say Facebook should start holding itself to editorial standards—making its newsfeed less personalised (gasp) by showing users opposing ideas and comments.”
In his introduction to The Real Face of Facebook in India: How Social Media have Become a Propaganda Weapon and Disseminator of Disinformation and Falsehood, (Paranjoy Guha Thakurta, 2019) by Cyril Sam and Paranjoy Guha Thakurta, NewsClick founder-editor Prabir Purkayastha writes, “For Facebook, virality is money. The more people see, the more potential users they have, and the more they can sell to political parties and business.” It is a dangerous precedent. Michael Nuñez, a writer who covers Facebook and social media for Forbes, cites Frenkel and Kang, and believes the same. He says, “Facebook made it seem like it was a black box, like there were systems making these decisions, but actually there were employees who were making critical decisions without any oversight.”
Frenkel and Kang write: “While Facebook sat on a repository of data encompassing every article, photo, and video shared on the platform, and while it could, if it wanted to, calculate the exact number of minutes users spent looking at those pieces of content, no one at the company had been keeping track of false news.”
Isn’t it baffling that a corporation keen to know ‘what’s on your mind’—and eager for you to share it all—attaches no disclaimer to any of its ‘organically curated’ content on News Feed? Interestingly, Facebook cannot keep tabs on fake news, but when it comes to muzzling dissent, it does have a ‘rat catcher’—Sonya Ahuja.
“Ahuja’s department,” Frenkel and Kang observe, “had an eagle’s-eye view into the daily workings of Facebook employees, with access to all communications. This kind of surveillance was built into Facebook’s systems: Zuckerberg was highly controlling of Facebook’s internal conversations, projects, and organisational changes.”
What news does Facebook not want outed? How it controls your personal data, and what it does with it, for example. Remember Cambridge Analytica? A whistle-blower revealed that the firm, “funded by Trump supporter Robert Mercer and led by Trump’s senior adviser Stephen K. Bannon, had created a new level of political ad targeting using Facebook data on personality traits and political values.” Other threats also define Facebook’s work culture; the authors reveal, for example, instances of sexual harassment at work.
It is easy to argue that the hate Facebook spews and sells to make money is deeply rooted in its policies, ethics, and culture. Its compromised business standards reflect its single-minded agenda to dominate the social networking space. This has, however, not been lost on United States federal agencies looking to rein in tech firms. The Federal Trade Commission (FTC), in August 2021, “refiled its antitrust case against Facebook, arguing the company holds monopoly power in social networking.” It was the second such attempt in a single year. But will the FTC be able to hold the organisation to account? The real question facing Facebook users is whether they are comfortable using a platform that puts people’s lives in danger, cripples democracies, and muzzles dissent.
Saurabh Sharma is an independent journalist. The views are personal.