However bad the situation looks on the surface, it's even worse underneath.
That's the unavoidable takeaway from the Facebook Papers, a trove of internal company documents disclosed by whistleblower Frances Haugen and reviewed by a consortium of 17 news organizations. Their reporting paints a picture of a company broken beyond repair, one that, even after scandal upon scandal, still has the power to shock.
Here are the key revelations, summarized below, that show just how bad things are:
1. Facebook's executives ignored their own employees' calls for reform
The people who work at Facebook are not a monolith, and as The Atlantic reports, internal company documents show some employees calling out real harm caused by the company, only to be brushed aside by higher-ups.
In the aftermath of the Jan. 6 attack on the U.S. Capitol, one Facebook staffer wrote: "How are we expected to ignore when leadership overrides research-based policy decisions to better serve people like the groups inciting violence today." The employee added: "Rank and file workers have done their part to identify changes to improve our platform but have been actively held back."
2. Mark Zuckerberg personally authorized censoring anti-government posts abroad while posing as a free speech advocate in the U.S.
Facebook CEO Mark Zuckerberg has repeatedly claimed that he doesn't want to be in the business of censoring political speech. And yet, according to The Washington Post, he has personally done exactly that when it suits his company's bottom line.
The Post highlights a particularly egregious example of this hypocrisy in Vietnam, where, according to people familiar with the decision, Zuckerberg himself made the call in 2020 to censor anti-government posts on behalf of the ruling Communist Party.
Vietnam is a significant market for Facebook. A 2018 Amnesty International assessment found that Facebook earned nearly $1 billion in annual revenue from the country.
3. Facebook's own researchers were stunned by its algorithm's recommendations
That Facebook's algorithm surfaces divisive content is by now well established. Even so, the sheer awfulness of that content still had the power to shock the company's own researchers.
According to a report in The New York Times, on Feb. 4, 2019, a Facebook researcher set up a test account to see what the social network was like for a user living in Kerala, India. For the next three weeks, the account followed one simple rule: accept every recommendation generated by Facebook's algorithm to join groups, watch videos, and explore new pages on the site. According to the leaked internal documents, the experiment laid bare just how off-kilter Facebook's recommendation systems are.
The researcher wrote: "Following this test user's News Feed, I've seen more images of dead people in the past three weeks than I've seen in my entire life total."
4. Facebook puts politics front and center when enforcing its own rules
Reporting has long indicated that Facebook's leadership, worried about angering the company's conservative users, would privately intervene on behalf of right-wing pundits and publishers. Leaked papers included in the Facebook documents and highlighted by Politico show that even Facebook's own researchers were aware of this and frequently called it out internally.
In 2020, a Facebook data scientist wrote in an internal presentation titled "Political Influences on Content Policy": "Facebook routinely makes exceptions for powerful actors when enforcing content policy. The standard protocol for enforcement and policy involves consulting Public Policy on any significant changes, and their input regularly protects powerful constituencies."
Notably, the Public Policy team the researcher refers to is the team that includes Facebook's lobbyists.
What's more, Facebook researchers confirmed that Zuckerberg himself was often involved in deciding whether a post should stay or go, implying a two-tier system of enforcement governed by unwritten rules:
In multiple cases, the final judgment about whether a prominent post violates a certain written policy is made by senior executives, sometimes Mark Zuckerberg. If our decisions are meant to be an application of a written policy, then it's unclear why executives would be consulted. If instead there is an unwritten aspect to our policies, namely to protect sensitive constituencies, then it's natural that we would want executives to have final decision-making power.
5. It took a threat from Apple to prompt a full-court press from Facebook against human trafficking
Human traffickers have used Facebook's tools to power their operations. As CNN reports, a 2020 internal Facebook document makes clear that the company has long been aware of this fact.
The internal Facebook report reads in part, “[Our] platform facilitates all three phases of the human exploitation lifecycle (recruitment, facilitation, exploitation) through complicated real-world networks.”
And while human trafficking has long been explicitly banned on Facebook, it took Apple threatening to pull Facebook and Instagram from its App Store in 2019 for the company to mount the kind of response one might have expected much earlier. The document reviewed by CNN reads: "Removing our apps from Apple platforms would have had potentially severe consequences to the business, depriving millions of users of access to IG & FB."