Social media companies YouTube, TikTok, Facebook, Snapchat and Instagram lost a key battle this week in a national lawsuit over their role in the youth mental health crisis.
The suit, which represents hundreds of parents and children as plaintiffs, alleges the platforms were designed to be addictive, subsequently harmed young users and failed to adequately warn parents of the risks involved. Those dangers include anxiety, depression, suicidality, body image issues and eating disorders, according to Lieff Cabraser, the law firm representing the plaintiffs.
The defendants, which include the platforms’ parent companies Alphabet, Google, Meta, Snap and ByteDance, argued in response that they are legally immune from the plaintiffs’ claims and asked the court to dismiss the litigation. They cited Section 230 of the Communications Decency Act of 1996, which has long shielded internet companies from legal liability in many circumstances for third-party content posted on their platforms.
On Tuesday, a federal judge in California partially rejected the companies’ request in a lengthy ruling, meaning critical aspects of the case will move forward.
This includes claims that certain features of the platforms, such as filters that blur imperfections and photos that have been edited but not labeled as such, are product defects for which the companies should be held liable. Lawyers for the plaintiffs argue that such features expose young consumers to unrealistic body ideals and lead them to compare themselves negatively to others.
In addition, U.S. District Judge Yvonne Gonzalez Rogers found that the absence of protective limits on the duration and frequency of use, of robust processes to verify a user’s age, and of effective parental controls and notifications could also constitute product design defects for which the companies may be held liable.
“Today’s ruling is a significant victory for families who have been harmed by the dangers of social media,” the plaintiffs’ attorneys said in a statement. “The mental health crisis among America’s youth is a direct result of these defendants’ deliberate design of harmful product features.”
Rogers noted that social media companies owe a duty of care to their users when they design products that may be defective, and that they can be sued for negligence related to those defects.
However, Rogers rejected some of the plaintiffs’ claims, including that social media companies can be held responsible for private messaging features, the timing and grouping of notifications about other users’ content, and algorithmic recommendations that connect minors with adults. She ruled that Section 230 shields the companies from liability for these features because they fall squarely within the realm of publishing activity, which federal law largely protects.
Reuters reported that a spokesman for Alphabet, the parent company of Google and YouTube, described the allegations in the lawsuit as “simply false,” adding that it had always worked to protect children. A TikTok spokesperson told Reuters the platform has “robust safety and parental control policies.”
In addition to this nationwide lawsuit, a group of 41 states and the District of Columbia sued Meta last month, alleging the company intentionally hooked young users on its platforms, which include Facebook, Instagram and WhatsApp. School districts in the U.S. have also sued social media companies on similar grounds. In June, a Maryland school district sued the parent companies of Instagram, Facebook, TikTok, Snapchat and YouTube for designing harmful product features that have fueled a “mental health crisis among America’s youth.”