Frances Haugen shook the public’s perception of social media when she anonymously filed complaints with federal law enforcement about Facebook Inc., now known as Meta Platforms, Inc. Haugen worked as a product manager at Facebook before quitting in May over concerns about the company’s power, its motives, and the harm it imposes on society.
In October, Haugen disclosed tens of thousands of pages of internal documents to the Wall Street Journal, which she believed proved that Facebook knew about, and chose to ignore, the harm its platforms cause. She has testified before Congress twice, most recently on December 1.
A specific issue Haugen has with Meta is its use of algorithms to drive popular features, like the main feeds in Facebook and Instagram, in ways that influence people’s decisions and mental health. Facebook and Instagram users have begun to recognize the repetition of ads and genres of content in their feeds. This repetition is no coincidence; it is the work of algorithms. Algorithms, at their simplest just sets of instructions, are taken to an extreme at Meta, which spends billions of dollars refining them to make even more money for the company. Meta’s advertising revenues reached an all-time high of $84 billion in 2020. While Meta’s algorithms have proved financially successful, Haugen argues they have also produced exactly the outcomes Meta appears to want.
One of the many ways Haugen saw Meta’s apparent neglect of its users’ well-being was its unwillingness to take steps that would limit polarization and prevent the spread of misinformation. “Friction,” as she describes it, is one way Meta could slow the spread of misinformation and false news. Friction is any intervention that slows down the sharing of information, such as a pop-up or prompt asking users to read an article before sharing it. Haugen believes these interventions could dramatically reduce the number of false articles being spread.
The main focus of Haugen’s complaints is that Meta fosters an extremely negative environment for its Facebook and Instagram users. Photos by Instagram influencers, whose content is often boosted by the platform’s algorithms, can easily be altered, giving viewers an unrealistic idea of the human body. Five Israeli Ph.D. students created the photo-editing app Facetune in 2013, which allows people to edit their faces and alter their bodies in an image. Facetune is so popular among social media users that it currently serves as Meta’s case study on user acquisition.
“Within two years, their company, called Lightricks, had generated about $18 million in revenue from the 4.5 million downloads of Facetune, which in 2015 cost between $3 and $4,” according to estimates by Business Insider. These figures illustrate just how large the photo-editing boom has become.
Meta’s algorithms control what people around the world see every day. The algorithms exist to keep users entertained, but to what extent is that entertainment manipulated? According to Haugen, “The mechanics of our platform are not neutral.” She believes that Meta increasingly polarizes people’s opinions, and she connects this polarization to politics in America and the divide between the “right” and the “left” in the media and on social media.
“User-generated content is something companies have less control over. But they have 100% control over their algorithms,” Haugen stated. “Facebook should not get a free pass on choices it makes to prioritize growth, virality and reactiveness over public safety.”
Haugen advocates for changes to Section 230 of the Communications Decency Act, which states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In essence, this law currently shields Facebook from being sued over what its users post.
In his 1907 autobiography, Henry Adams describes his fear of technology’s power: “At the rate of progress since 1800, every American who lived to the year 2000 would know how to control unlimited power.” This sentiment from more than a century ago foreshadows the widespread control that Facebook and its social media platforms now exert over society.