
California Sues Meta For Damaging Kids’ Mental Health

Is the controversy over social media just another moral panic, or something to really panic about?

PUBLISHED NOV 5, 2023 5:17 P.M.

How does social media affect the minds and well-being of children? According to California and other states, not well. (Ri_Ya / Pixabay, Pixabay License)

Attorney General Rob Bonta—along with attorneys general from 32 other states—filed a lawsuit against Meta Platforms Inc., alleging that the company poses a danger to children and, in the interest of further profits, does nothing about it. Eight other states and the District of Columbia filed their own, similar lawsuits against Meta on the same day.


Meta, the multinational technology corporation that owns the dominant social media sites Facebook and Instagram, as well as several other sites and online services, is the fourth largest company in California in terms of revenue, according to the 2023 Fortune 500 rankings. The tech behemoth raked in $116 billion in revenue for the 2022 fiscal year. 

Meta—founded as Facebook in 2004—“has repeatedly misled the public about the substantial dangers of its social media platforms,” according to the lawsuit filed on Oct. 24 in the U.S. District Court for the Northern District of California. “It has concealed the ways in which these platforms exploit and manipulate its most vulnerable consumers: teenagers and children.”

In fact, the lawsuit asserts, Meta deliberately designed its social media platforms in ways that it knew would damage kids, choosing to “exploit” the “vulnerabilities” of children by creating “dopamine-manipulating” algorithms, employing visual filters that “promote young users’ body dysmorphia,” and using “social comparison” features such as “likes” that the company knew to “harm young users.”

Just Another Moral Panic? Or Something to Panic About?

Of course, moral panics about children being corrupted by various manifestations of pop culture are at least as old as pop culture itself. (A moral panic is widespread fear that some specific, presumably evil thing poses a threat to a culture’s shared values and safety.)  


In 1954, German-born psychiatrist Frederic Wertham published his book Seduction of the Innocent, in which he argued that reading comic books led to juvenile delinquency and other types of deviant youth behavior. His book was praised by some of the leading intellectuals of his day, including Clifton Fadiman, sociologist C. Wright Mills, and groundbreaking child psychologist Bruno Bettelheim. Seduction of the Innocent caused a crisis in the then-thriving comic-book industry and led to the creation of the “Comics Code,” a set of rules for self-censorship in comics publishing. 

A few years later, along came a new phenomenon called rock and roll music, which came with a series of moral panics of its own, over such perceived evils as, in segregated 1950s America, “race mixing.” Late in the 1960s, rock music spawned fears of “free love” and illicit drug use. 

In the decades that followed, video games became the youth phenomenon that sparked fear that teenagers were being turned into a generation of crazed killers, especially following the 1999 mass shooting at Columbine High School in Colorado. 

Are claims, as articulated in the lawsuit, that social media—Instagram and Facebook in particular—has caused “sweeping damage” to “the mental and physical health of our nation’s youth” just another in a historical parade of moral panics? Or is there evidence that Meta is more like the big tobacco companies, engaging in a carefully orchestrated campaign of deliberate deception to protect its massive profits by lying about the danger posed by its product?

What Is Social Media? A Brief History

The term “social media” first appeared in Tokyo, Japan, back in 1994—used in relation to something called Matisse, which was described by one of its creators as “an internet-connected multimedia text-based VR client which restores primacy to a spatial and social metaphor, to create an inhabitable space within which any internet-accessible simple or hypermedia object may easily be embedded, viewed and otherwise manipulated.”


Phew. 

Fortunately, the definition of social media, like social media sites themselves, has been simplified and streamlined since then. While it’s difficult to come up with a single, precise definition, in general social media is any type of online technology that allows people to instantly share their thoughts in real time with multiple other users, while also receiving thoughts and ideas from those others.

Social media sites, unlike, for example, news websites like this one or business sites, consist primarily of “user-generated content.” That is, what you read, view and listen to on social media does not come from the creators or owners of the site itself, but from the site’s users, who post new content frequently and almost always for free.

It should be obvious why this is an attractive business model. A huge number of people willing to work for you voluntarily, for no pay, is a hard proposition to resist. In 2004, a platform known as MySpace became the first social media site to reach one million monthly users, proving that the model could, in fact, be viable. But MySpace was merely the winner in a crowded field. Social media was already seen as the proverbial next big thing. Before MySpace there was Friendster, created by Canadian programmer Jonathan Abrams. Back in 2002, Abrams’ idea was revolutionary—an online app that allowed people to quickly communicate with friends, reconnect with old friends, and even make new ones.

A group of venture capitalists thought it was a cool idea as well, and poured millions into Abrams’ site. But as venture capitalists often do, they pushed for the company to grow as fast as possible rather than making sure it worked as well as possible. Abrams was soon ousted, and the app’s bugginess doomed Friendster.

The Rapid Rise of Facebook

But the rapid rise of Friendster inspired an online marketing company called eUniverse to come up with its own social media site. That turned out to be MySpace, which became so popular so quickly that in 2004 a precocious Harvard student by the name of Mark Zuckerberg, who had developed his own, similar app, offered to sell his creation to eUniverse for $75 million.


The company’s CEO, Chris DeWolfe, spurned Zuckerberg’s offer, and a year later eUniverse sold MySpace for $580 million to News Corporation, the multinational media conglomerate run by Rupert Murdoch that was then the parent company of Fox News. (Murdoch split News Corp from 21st Century Fox in 2013. The latter kept ownership of Fox News.)

Zuckerberg’s app, Facebook, took what MySpace had started and simply did it better.

“The world had been trained by MySpace that social networking was interesting, but the actual product had been perfected by Facebook,” Mike Jones, a former MySpace CEO from the News Corp era, told Business Insider in 2015.

As MySpace faltered by 2008, Facebook skyrocketed to 608 million users by the end of 2010, the year Jones was hired in a vain attempt to revitalize MySpace, and to more than a billion two years later. By 2023, Facebook had nearly 3 billion users and was valued at $59 billion.

As Facebook was quickly conquering the social media world, a 27-year-old Stanford grad named Kevin Systrom was teaching himself how to write programming code so he could create an app that would allow people to share photos and stories about one of his favorite activities—drinking bourbon.

Once he secured about $500,000 in venture capital for his proposed app, called Burbn, he realized that one of the app’s features was much more valuable than any other—it allowed people to quickly and easily share photos. With his first employee, 25-year-old Mike Krieger, Systrom stripped out all of the bourbon-related features, refashioning it as the first social media app strictly for sharing photos.

When he did that and launched the app, now called Instagram, in 2010, he caught Zuckerberg’s attention. The Facebook creator, who by then was the company’s CEO and the most powerful man in social media, saw Instagram as a “threat” to Facebook’s market dominance. So in 2012, he bought it—for $1 billion.

As of 2023, Instagram—headquartered, like Facebook, on Meta’s Menlo Park campus—had almost 2 billion users.

Social Media’s Real Business: Data

One of the most scathing allegations against Meta (i.e., Facebook and Instagram) in the multi-state lawsuit is that the company “unlawfully collect(s) the personal data of its youngest users without their parents’ permission.” Under the Children’s Online Privacy Protection Act, a federal law that took effect in 2000 and was updated in 2013, online sites may not collect personal information from users under 13 years old without first obtaining permission from the kids’ parents.


Facebook is and has always been free to sign up for and use. But as early as 2011, author and media theorist Douglas Rushkoff observed that “We are not the customers of Facebook, we are the product.” The reason Facebook and other social media sites hoover up large volumes of personal information on their users is simple: that data is worth a lot of money. Social media companies monetize it by offering advertisers the advantage of targeting specific consumers based on that data.

That is, Facebook and Instagram do not, strictly speaking, “sell” personal data. Instead, the Meta companies, and other social media firms, use your data to hyper-target users for advertisers. If, for example, a company that makes organic, vegetarian, gluten-free gerbil food wants to advertise only to consumers most likely to be extremely health conscious and also own pet gerbils, Facebook can instantly pore through its vast bank of user data to identify only those people. It then charges the organic, vegetarian, gluten-free gerbil food company for the privilege of having its ads placed in those users’ Facebook feeds, maximizing the customer’s ad dollars—and Facebook’s revenue.
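Conceptually, that targeting step works like a filter run over stored user attributes. The following Python sketch is a purely hypothetical illustration; the profile fields, interest labels and matching rule are invented here, and it is not Meta’s actual data model or ad-delivery system.

```python
# Hypothetical illustration of interest-based ad targeting.
# The profile fields, labels and matching rule below are invented;
# this is not Meta's actual data model or ad-delivery logic.

from dataclasses import dataclass, field

@dataclass
class UserProfile:
    user_id: int
    interests: set = field(default_factory=set)  # inferred from likes, follows, posts
    pets: set = field(default_factory=set)       # inferred from photos, group memberships

def build_audience(users, required_interests, required_pets):
    """Return only the users who match every criterion the advertiser paid for."""
    return [
        u for u in users
        if required_interests <= u.interests and required_pets <= u.pets
    ]

users = [
    UserProfile(1, {"organic food", "yoga"}, {"gerbil"}),
    UserProfile(2, {"fast food"}, {"dog"}),
    UserProfile(3, {"organic food", "running"}, {"gerbil", "cat"}),
]

# The gerbil-food company pays to reach only this narrow slice of users.
audience = build_audience(users, {"organic food"}, {"gerbil"})
print([u.user_id for u in audience])  # -> [1, 3]
```

In reality the matching runs over thousands of inferred attributes rather than two, which is precisely why the underlying trove of personal data is so valuable.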

How much is your personal information worth? Practically speaking, that’s impossible to say. One rough calculation by the magazine Popular Mechanics in 2018 put the value at $33 per Facebook user, annually. In 2022, the online data privacy company LetAlone estimated the price of a single user’s data at $900 per year. A 2019 “deep dive” by researchers who published their findings in Washington Monthly determined that same value to be $202 per year.

The Washington Monthly study concluded that, at least based on Facebook’s 2018 balance sheet on file with the Securities and Exchange Commission, Facebook appeared to take in $35.2 billion by monetizing users’ personal information. That would be 63 percent of all the money Facebook earned that year, according to the study.

And how much money did Facebook users earn from the sale of their personal data? Zero.
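The wide spread among those estimates is largely a matter of what gets divided by what. Here is a back-of-the-envelope version of the arithmetic, using only the Washington Monthly figures cited above; the user counts are round-number assumptions added for illustration, not figures from the study or from Meta.

```python
# Back-of-the-envelope arithmetic only; not a reproduction of any study's method.
# The revenue figures come from the Washington Monthly study cited above.
# The user counts are round-number assumptions added here for illustration.

data_revenue_2018 = 35.2e9   # revenue the study attributes to monetized personal data
data_share = 0.63            # the study's estimate: 63% of everything Facebook earned

total_revenue_2018 = data_revenue_2018 / data_share
print(f"Implied total 2018 revenue: ${total_revenue_2018 / 1e9:.1f} billion")  # ~$55.9 billion

# Per-user value depends entirely on which user base you divide by.
assumed_user_bases = {
    "worldwide users (assumed ~2.3 billion)": 2.3e9,
    "U.S. users only (assumed ~240 million)": 2.4e8,
}
for label, users in assumed_user_bases.items():
    print(f"{label}: about ${data_revenue_2018 / users:.0f} per user per year")
```

None of these round numbers will line up exactly with the published estimates, which use different methodologies; the point is only that “what your data is worth” depends heavily on the assumptions behind the division.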

What You Give Up When You Log On

Of all the social media apps that collect and monetize personal data, the Meta companies do the most of it. And it should be noted that it’s not only social media apps—52 percent of all apps collect and share personal data, according to a 2021 study by the privacy firm pCloud.

According to the same study, Facebook and Instagram are the most prolific data merchants. Instagram, pCloud found, shares 79 percent of all the data it collects with third parties. Facebook shares 57 percent. No other app shares more than 50 percent.

What is this “data” that Facebook and other sites scoop up and turn into big bucks? The most obvious is the info you give up voluntarily simply by signing up—your name, age and email address. And if you choose to answer all of the questions typically asked by Facebook, you also give the company your employment history, whether you’re married, single or dating someone, and your personal interests (e.g., your favorite bands, sports teams and so on).

But Facebook certainly does not stop there. If you have location services turned on on your phone or computer, the company tracks where you are, where you go and what businesses you visit. Through the content you post, Facebook can also determine and catalog your religious affiliations, your political views, who your friends are, where they live and what their interests are.

And of course, if you have ever purchased anything through Facebook (a Facebook ad or “promoted” post, for example), the company has a wide range of information on your financial status and activity.

Because most Facebook and Instagram users post photos, Meta also knows what you look like and what your friends look like. Facebook’s facial recognition software used that information to track its users’ activities even more accurately, and the company accumulated one of the world’s largest image databases from its user data. In 2021, however, Facebook announced that it would shut down its facial recognition operation, according to a New York Times report.

The company said that it deleted about one billion face “prints” in its facial recognition database.

Though the lawsuit alleges that Facebook and Instagram collect data from users under 13 years old, Facebook’s terms of service have long banned users that young. In 2018, the company said it was cracking down on underage users, saying it would cancel their accounts. Previously, the company’s policy had been only to “investigate” accounts that were reported to belong to underage users.

In July of 2021, Meta announced that it would restrict advertisers from targeting its youngest users with any data beyond age, gender and location. It would no longer allow targeting of children based on “interests and activity,” Meta’s catch-all term for the massive trove of other data it collects on each individual user. In 2023, the company said it was placing another restriction on advertisers, preventing them from targeting users 16 and under by gender as well.

Pay No Attention to That Algorithm Behind the Curtain

The Oct. 24, 2023, lawsuit alleges that Facebook and other Meta apps use their algorithms to feed kids material that is dangerous to their mental and even physical health, all in the name of increasing “engagement.” In social media jargon, engagement is more or less the same as “paying attention.”


Engagement is measured by “likes,” the number of comments a piece of posted content receives, and other less visible metrics such as the amount of time a user spends scrolling through Facebook, watching videos or reading posts.

Why is engagement such an all-important commodity for social media sites? Because the more a user is “engaged” (that is, the more time and attention the user devotes to a social media site), the more likely that user is to absorb advertising and other targeted content that pays the company money. In other words, Facebook and Instagram are designed to maximize engagement, which is a fancy way of saying they’re designed to addict you to their sites, even if you’re a child.

What is this mysterious “algorithm” that hooks social media users young and old? An algorithm is a set of procedures used by a computer to perform a task. For social media sites, that task is keeping people on the site. The algorithm does that by deciding which pieces of content it shows to each individual user—automatically curating what shows up on your screen.
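In practice, that curation amounts to scoring each candidate post by how much engagement it is predicted to generate for a particular user, then showing the highest-scoring posts first. Here is a deliberately simplified, hypothetical Python sketch of that idea; the signals, weights and data shapes are invented for illustration and are not Meta’s actual ranking system.

```python
# A toy engagement-ranking loop. The signals, weights and data shapes are
# invented for illustration; this is not Meta's actual ranking algorithm.

def engagement_score(post, user):
    """Estimate how likely this user is to linger, like or comment on a post."""
    score = 0.0
    score += 2.0 * len(post["topics"] & user["interests"])   # overlaps the user's interests
    score += 1.5 * post.get("likes_per_hour", 0)             # already attracting attention
    score += 0.05 * post.get("typical_dwell_seconds", 0)     # content people stare at longest
    return score

def rank_feed(candidate_posts, user, feed_length=10):
    """Curate the feed: highest predicted engagement first."""
    ranked = sorted(candidate_posts, key=lambda p: engagement_score(p, user), reverse=True)
    return ranked[:feed_length]

# Example: a fitness-obsessed teen's candidate posts get reordered accordingly.
user = {"interests": {"fitness", "music"}}
posts = [
    {"id": "vacation-pics", "topics": {"travel"}, "likes_per_hour": 3},
    {"id": "workout-video", "topics": {"fitness"}, "likes_per_hour": 40, "typical_dwell_seconds": 90},
]
print([p["id"] for p in rank_feed(posts, user)])  # -> ['workout-video', 'vacation-pics']
```

Nothing in a loop like this asks whether the high-scoring content is good for the user; it only asks what will keep them scrolling, which is the design choice at the heart of the lawsuit.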

The algorithm does this in part based on a user’s own preferences, but also by feeding content that it thinks will keep the user engaged. Of course, the algorithm does not really “think.” The thinking is done by the human beings who write the algorithm. And according to internal Meta documents leaked by former Facebook engineer—and algorithm specialist—Frances Haugen, the two largest Meta apps—Instagram in particular, where American teens spend 50 percent more time than on Facebook—hook young people by feeding them a flood of pictures of what one young user called “chiseled bodies, perfect abs and women doing 100 burpees in 10 minutes,” according to a Wall Street Journal exposé based on Haugen’s revelations. (A burpee is an intense full-body exercise that combines a squat, a pushup and a jump.)

How the Algorithm Harms Kids

Facebook’s own internal research, according to the Journal report, showed that this barrage of unrealistic body images not only hooked young people, girls in particular, but also damaged their mental health by making them feel bad about themselves.

According to a 2023 study published by the American Psychological Association, teens who cut back their social media use by half for even just a few weeks experienced “significant improvement” in their self-image when it came to their weight and overall physical appearance. But according to the internal Facebook research reported by the Journal,  “32 percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.”

Of American teens who said they had suicidal thoughts, 6 percent said that Instagram was the reason they felt suicidal. For British teens, the number was 13 percent.

The lawsuit by Bonta and his fellow state AGs alleges that the Meta algorithms, “based on maximizing the time that young users spend on its Social Media Platforms,” have not only harmed kids, promoting “eating disorders and body dysmorphia,” but that Meta knew it was causing this harm—and kept right on doing it.

“Meta still refuses to abandon its use of known harmful features—and has instead redoubled its efforts to misrepresent, conceal, and downplay the impact of those features on young users’ mental and physical health,” the lawsuit alleges.

The lawsuit seeks unspecified monetary damages from Meta, but perhaps more importantly, California and the other states in the suit want the court to order Meta to stop breaking the law by pursuing its scheme to hook kids.

As for Meta, in a statement the company said that it was “disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”
