Is Meta Legally Accountable for Harming Children on Social Media?

Yes. According to 41 states and the District of Columbia, the tech giant that owns Facebook and Instagram has harmed children through its social media apps. The 233-page federal complaint alleges that Meta has “profoundly altered the psychological and social realities of a generation of young Americans.” The plaintiffs argue that Meta is motivated by profit and that, in order to maximize its financial gains, it has “repeatedly misled the public about the substantial dangers of its Social Media Platforms.”

State officials involved in the suit claim that Meta violated consumer protection laws by deploying changes to its apps designed to keep children on its platforms longer, fostering addiction and harming children’s well-being and mental health.

A joint lawsuit was filed by 33 states in the U.S. District Court for the Northern District of California, while the attorney general for the District of Columbia and eight other states are filing separate complaints in local, state, and federal courts.

Recent Controversies and Lawsuits

The lawsuits are the culmination of fallout from a 2021 Wall Street Journal article that detailed leaked internal Facebook research suggesting that Instagram worsened body image issues in teen girls. The leak prompted an onslaught of efforts by legislators to scrutinize Meta’s safety practices and to pass privacy and safety regulations for children who engage with social media.

However, such lawsuits have struggled to gain traction with the courts, running up against First Amendment concerns. There are also obstacles in Congress: the House is trying to broaden data privacy legislation for all consumers, while narrower Senate bills focus on boosting protections specifically for children.

Meta says it is “disappointed” with the current lawsuit, stating that “instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path…”

This isn’t the first time a major social media company has been in the hot seat over privacy concerns for children. Back in 2019, the FTC and the state of New York reached a $170 million settlement with YouTube (owned by Google) over the company’s practice of illegally collecting data from users younger than 13. The complaint alleged that YouTube violated the Children’s Online Privacy Protection Act by illegally collecting personal information from children in order to bolster its behavioral advertising business.

In 2022, following the Wall Street Journal’s reporting on the leaked research, more than 1,200 families filed lawsuits against social media companies including TikTok, Snapchat, YouTube, Roblox, and Meta. All of these lawsuits echo the same sentiment: social media companies prioritize profit over the mental health of children. Those cases are still being litigated today.

Responsibility in the Digital Age

This raises the question: Are tech companies responsible for exacerbating anxiety, depression, and other mental health issues in teens? States argue yes; social media sites mislead consumers and expose them to inappropriate content. Some states, such as Arkansas and Utah, have gone so far as to pass laws banning kids younger than 13 from social media altogether and requiring teens younger than 18 to get parental consent to access these sites.

Things get murky when research concerning the connection between social media usage and mental health problems is taken into account. Some research suggests there is a correlation between excessive social media use and poor mental health in children, while other research suggests that social media is not inherently harmful to children.

In addition to the lack of conclusive research, there are questions surrounding the role of parents. Is it the responsibility of the tech companies that run social media apps to moderate their content so as not to expose young people to inappropriate information and dangerous messaging? Or is it parents’ responsibility to protect their children from the dangers of the internet and educate them about social media? Perhaps the ongoing litigation will give courts the opportunity to provide clarity on this question.

Critics argue that companies have shirked their responsibility to protect their most vulnerable users: children. Big tech companies like Meta reject this argument, stating that they have taken numerous steps intended to make apps safer for children, “including giving parents tools to track kids’ activity, building in warnings that urge teens to take a break from social media, and implementing stricter settings by default for young users.” Does this go far enough? The 41 states and the District of Columbia are not convinced, but only time will tell how this matter plays out in the courts.

Interested in cases like this? Want to see other complaints that states are filing against large tech companies? You can search through thousands of county court documents and filter for cases similar to this one.