Last year, Congress considered, but didn’t pass, the Kids Online Safety Act and failed to update the Children and Teens’ Online Privacy Protection Act. The Kids Online Safety Act would have required companies to undergo regular external audits of the risks their platforms create for minors, implement stronger privacy settings for minors and bear the burden of ensuring they mitigate foreseeable harms like posts boosting substance abuse, eating disorders or suicide.
There is plenty of evidence to indicate that time spent on social media is linked to increased depression, anxiety and self-esteem issues for kids and teens. In some cases, cyberbullying and harassment have even been linked to children’s deaths. Congressional inaction has exacerbated the dangers faced by children and teens who use social media and led to a predictable vacuum.
Now more than three dozen states are seeking to fill that vacuum by suing Meta, the parent company of Facebook, Instagram, WhatsApp and Messenger. They accuse Meta of violating a federal children's privacy law and their own state consumer protection laws. Separate lawsuits make nearly identical claims against Meta based solely on state laws.
California and dozens of other blue and red states have alleged that Meta lied about the safety of its social media sites and therefore violated state consumer protection laws. Specifically, the states allege that the social media sites are products designed to “induce young users’ compulsive and extended use.” In essence, the lawsuit claims that Meta designs its platforms to addict minors, and others, much as slot machines do, and that once those users are hooked, the algorithms present minors with dangerous and harmful content.
The states claim that Meta knows full well that its platforms are causing children harm. Two years ago, a former Facebook employee leaked research showing that using Instagram can directly harm teenage girls. The research specifically involved content seen to detrimentally affect girls’ body images and self-esteem. CEO Mark Zuckerberg responded by saying that the research had been misconstrued and that his products weren’t designed to promote harmful or angry content.
The plaintiffs have also claimed that Meta collects the personal data of its minor users, in violation of the federal children’s online privacy law. Meta’s guidelines provide that it collects from a minor’s account only the data “needed for their device to work properly.”
Ideally, minors would stay off social media entirely, or at least for the vast majority of their time. But we all know that isn’t going to happen. Social media sites are too tempting and ubiquitous. The next best option would be for companies to take real and concrete steps to implement safety and privacy protections for younger users. And while Meta has implemented some reforms, like expanding parental controls, adding age verification technology and removing certain sensitive content, there is more that can be done. Content moderation that quickly removes posts promoting bullying, harassment or suicide should be more robust.
“We want this activity to stop using its misleading algorithms,” Nebraska Attorney General Mike Hilgers said at a news conference Tuesday. “We want to make sure it complies with COPPA,” he said, referring to the Children’s Online Privacy Protection Act.
Not every problem requires a legislative fix. But when children are facing real harm and private companies don’t appear to be remedying that harm fast enough, lawmakers should step in. This is also a situation that cries out for federal regulation, because we need one uniform set of standards throughout the country. Because the internet doesn’t stop at state lines, a patchwork of laws that vary by state would present real administrative hurdles.









