Meta Platforms must face a lawsuit brought by Massachusetts accusing the social media giant of intentionally designing Instagram features to addict young users and of misleading the public about the platform’s impact on teenage mental health. A ruling by Suffolk County Superior Court Judge Peter Krupp, made public on Friday, October 18, 2024, rejected Meta’s attempt to dismiss the lawsuit filed by the Massachusetts Attorney General, which claims that the company violated state consumer protection laws and contributed to a public nuisance.

Meta had argued that the lawsuit was invalid under Section 230 of the Communications Decency Act of 1996, a federal law that protects internet companies from being held responsible for content posted by users. However, Judge Krupp ruled that Section 230 did not shield Meta from accusations of making false statements about Instagram’s safety, its protections for young users, or the effectiveness of its age verification systems meant to block users under 13 from accessing the platform.

This lawsuit sheds light on why Instagram has recently ramped up efforts to present its platform as safe and secure, particularly for young users. The company has been making a strong push to promote its parental control features, age verification tools, and privacy settings, all aimed at giving parents more oversight of their children’s activity on the platform. As discussed in other blog posts, these measures include the ability to limit screen time, monitor who their children interact with, and restrict access to certain content. Instagram’s public-facing narrative has emphasized protecting younger users’ well-being, a stark contrast to the allegations in the Massachusetts lawsuit, which claims the platform’s design is purposefully addictive and harmful to teens.

Instagram has faced backlash from mental health advocates, researchers, and regulators, all pointing to growing evidence that social media use, especially among teenagers, can contribute to anxiety, depression, and other mental health issues. In response, Instagram has highlighted its commitment to safety, releasing features such as the “Take a Break” reminder and notifications promoting mental well-being, as well as rolling out campaigns designed to encourage positive social interaction.

The Massachusetts ruling follows a similar decision in federal court in California earlier that week, where a judge rejected Meta’s request to dismiss lawsuits brought by more than 30 states. Those states accused the company of contributing to the mental health crisis among teens by making its social media platforms intentionally addictive. Massachusetts, which filed its lawsuit in October 2023, was one of the states that pursued legal action in state court rather than joining the federal lawsuits. The case gained attention for claims that Meta’s CEO, Mark Zuckerberg, had disregarded concerns about Instagram’s potential harm to users. The lawsuit argues that Instagram’s features, including push notifications, post “likes,” and infinite scroll, were specifically designed to exploit teens’ psychological vulnerabilities, chiefly by capitalizing on their “fear of missing out” to boost profits.

The contradiction between Meta’s public messaging and the internal practices alleged in the lawsuit provides deeper context for why the company has been increasingly vocal about protecting young users. The litigation from Massachusetts and other states may be driving Meta to proactively showcase its commitment to safety as a way to mitigate growing regulatory pressure and public outcry. If these lawsuits succeed, however, they could push Instagram to overhaul not only its public messaging but also the platform’s fundamental design in ways that genuinely prioritize the mental health of its younger audience.