A Meta Analysis of Facebook’s “Perfect Storm”

Harper Treschuk


The past month morphed into the "perfect storm" for Facebook, or, as I should now say, Meta, the renamed parent company of the Facebook family of apps. On September 27, following intense scrutiny from parents and lawmakers, Facebook postponed its plan to release Instagram Kids, a version of Instagram curated for children aged 10 to 12. On October 4, Facebook and its associated networks suffered a widespread outage lasting six to seven hours. Then, after the Wall Street Journal reported on internal research demonstrating Instagram's toxic effect on the mental health of teen girls, Frances Haugen, a former Facebook employee, testified before the United States Senate Committee on Commerce, Science, and Transportation on October 5. Haugen's testimony emphasized the extent to which Facebook has placed its "immense profits over people."

On October 28, Facebook founder Mark Zuckerberg announced in a keynote speech that Facebook and its subsidiaries would be rebranded under the parent company Meta. He laid out a forward-thinking vision of future platforms in which users could attend a concert with friends halfway across the world, experience immersive gaming, and step into virtual rooms that engage all five senses. The brand change, he claimed, reflects how Facebook is expanding beyond its social networks to focus more heavily on this vision of a "metaverse" that would enhance social experiences. Delivering his speech against a rather disconcerting projection of what the metaverse would look like, Zuckerberg stated that the "next platform" will be "an embodied internet, where you are in the experience, not just looking at it."

I cannot see into Mark Zuckerberg's subconscious, but I beg to differ that the envisioned metaverse will benefit social connections. After all, media already profoundly shapes our reality. Internal research revealing that 66% of teen girls experience negative social comparisons on Instagram is a sign that this "embodied experience" has already arrived. If a curated Instagram feed pushes teen girls toward worsening mental health and eating disorders, as Ms. Haugen testified, they are not merely passive consumers of content. We have already reached a point where users are part of the experience, sustaining a business model that generates appallingly high profits. Facebook must act on its own internal research and think honestly about how its algorithms and products reinforce detrimental feedback loops for its most vulnerable users.

Announcing the new vision of the metaverse in the wake of scrutiny over its lack of transparency and its failure to take action on its own internal research seems the most overt example yet of how Facebook has not earned our trust. As Ms. Haugen testified to the United States Senate, Facebook has prioritized its profits over the safety of its users, specifically in the area of teen mental health. 

During the hearing, Ms. Haugen compared Facebook's tactics to those of Big Tobacco: an industry that knew its products were harmful but withheld that research from the general public, regulators, and stakeholders around the world. Facebook's research has revealed trends of "problematic use" among Instagram users: an experience similar to substance addiction, in which users turn to the platform to self-medicate even though it makes them feel worse. Especially for teenage users, whose brains are still developing and who lack the self-regulation of adults, opening the app intensifies feelings of self-doubt, inadequacy, or self-loathing, yet they struggle to stop using it.

Addiction to technology may stem from underlying issues or mental health struggles, but Facebook's own research reveals that the unique features of Instagram create the "perfect storm" that pushes teenagers into darker places. While other social media platforms outside the Facebook universe, such as Snapchat and TikTok, reinforce different harmful behaviors, Instagram's algorithmic, metric-based platform can be particularly problematic.

Following Facebook's acquisition of Instagram in 2012, the company changed the Instagram feed (the posts users see when they open the app) from a chronological order to an algorithmic ranking. Each time users interact with a certain kind of post, the algorithm gains insight into what content to show them the next time they open the app. While Facebook claims that the algorithm promotes meaningful interactions, learning what content you, as the user, supposedly wish to see, these algorithms have introduced a variety of problems.

For one, Facebook's artificial intelligence detects only 10 to 20 percent of problematic content, which means the algorithm can unintentionally amplify a person's preference for risky behaviors and misinformation. Additionally, by prioritizing content that is more likely to provoke intense emotional reactions, the app can push users toward unhealthy extremes. In the internal research, a 13-year-old girl who began looking at healthy recipe content on the app was then shown posts promoting anorexia and a thin body image. Not only do some teenagers in the study trace eating disorders to Instagram, but 13% of British users and 6% of American users trace thoughts of suicide to the app. This problematic use is truly heartbreaking.

The platform's likes, comments, and reshare functions also reinforce negative social comparisons, which 66% of the teen girls in the study said they experience. Perhaps the most fundamental issue with Instagram is the dissonance between reality and what we perceive on the app. We look at the images shared by our friends, casual acquaintances, or even celebrities, and we feel insecure or inadequate because what flashes before our eyes is often a filtered, carefully selected sliver of reality. Perhaps this is the darkest truth of the metaverse: it influences our perceptions in ways the general public is not fully aware of, because Facebook has operated in opacity for so long.

But the future of Facebook and its subsidiaries is not black and white. Ms. Haugen articulated that while "Facebook wants you to believe in false choices," the reality is more nuanced and a "safer, more enjoyable social media is possible." Social media can allow friends and family members to connect and share experiences across distances, and can help small businesses advertise their products in the digital marketplace. It has connected people from the farthest reaches of the world to a network of ideas and information. Facebook can accept more Congressional oversight without either sacrificing the principle of free speech or letting virulent misinformation circulate; it can release more data for public scrutiny instead of insulating itself behind an opaque wall; and it can move toward human-centered rather than algorithm-centered interactions on its platform without sacrificing all of its profits or an enjoyable user experience.

At the highest level, Facebook is presenting a "false choice" between profits and people. I can envision a more sustainable version of social media in which algorithms do not push teenage girls into eating disorders. That vision is not compatible with Mr. Zuckerberg's metaverse unless his company is willing to look beyond the metric of short-term profits. Without transparency, I cannot trust Facebook when it says Instagram Kids was proposed to make social media safer for the hundreds of thousands of children aged 10 to 12 who misrepresent their age on Instagram, a platform designed for users 13 and older. As my younger sister turns 8 this year and more of her friends express an interest in phones and technology, I worry that this initiative is designed to hook children from ever-younger age brackets on the app, safeguarding its next generation of consumers.

Ironically, one of the ideas Instagram head Adam Mosseri proposed in the wake of the backlash against the Wall Street Journal articles was a "Take a Break" feature, in which problematic users would be encouraged, by the app itself, to reconsider their time on social media. In an even more metacognitive fashion, Instagram recently launched a global branding campaign called "Yours to Make," highlighting the authentic connections that can be formed on Instagram. But a campaign intended to showcase the good in Instagram appears superficial unless Facebook at large addresses the negative findings of its own research. And as someone who maintains a small-scale presence on Instagram to connect over shared activities, I am beginning to question the time I spend on a platform that recognizes its own capacity for toxic relationships yet has failed to make structural changes.

The solutions Ms. Haugen proposed include increased Congressional oversight, which would require Facebook to be more transparent in releasing data and more forthcoming about how that data influences company policy. Additionally, many lawmakers at the hearing expressed support for amending Section 230 of the Communications Decency Act, regarded as the most influential law governing big tech today. Section 230 limits the liability of online services such as Facebook for third-party content on their sites, essentially granting them legal immunity for the content they display and moderate.

According to Statista, 3.51 billion people worldwide use at least one Facebook family product each month. A company with such overwhelming access to people's thoughts and perceptions, as Ms. Haugen noted, should receive this scrutiny from Congress and the general public alike.

Following a month of postponed projects, Congressional hearings, journalistic coverage, and outages, founder Mark Zuckerberg gave an optimistic speech announcing the company's new name, Meta, stating, "for me, it symbolizes that there is always more to build and always a next chapter to the story." There is a next chapter to the story, but whether that narrative is built on long-term sustainability or short-term profits remains to be seen. Perhaps the name Meta reveals the deeper truth: as long as algorithms control the perceptions of users, social media will continue to exert adverse influences on our society. Facebook should conduct a thorough meta analysis of its own research now that this reality has been brought to light.