A jury in Los Angeles found Meta and Google liable in a case brought by a woman identified in court as Kaley, who claims she became addicted to Instagram and YouTube after using the platforms from childhood. The case is one of the first lawsuits to hold social media companies accountable for the design of their platforms rather than for user-generated content.

The jury determined that both companies acted negligently and failed to warn users about the risks of prolonged platform use. It also found that specific design features, such as recommendation systems, notifications, and autoplay, contributed to the mental health issues reported by the plaintiff.

Compensation Paid in Meta and Google Social Media Addiction Case

The jury awarded three million dollars in damages and found grounds for additional punitive damages. Liability was apportioned between the two companies, with Meta assigned the larger share. The final punitive amounts have not yet been confirmed.

In a separate case, a jury in New Mexico ordered Meta to pay 375 million dollars after finding violations related to child safety protections. Mark Zuckerberg testified at the Los Angeles trial, and internal company documents were presented as evidence.

Legal Arguments from Both Sides in the Los Angeles Trial

The plaintiff's legal team argued that the platforms are designed to promote compulsive use, making it harder for younger users to disconnect. Kaley testified that after years of near-constant use of both platforms, she developed body dysmorphia, depression, and suicidal thoughts.

Attorneys representing Meta and Google argued that the plaintiff's mental health issues stemmed more from her personal circumstances than from platform use. They also questioned whether social media addiction is an officially recognized medical condition.

Wider Legal Context and Future Implications of the Ruling

This case is part of a broader legal strategy that targets how platforms are designed rather than what users share. The approach appears aimed at sidestepping Section 230 of the Communications Decency Act, which generally shields platforms from liability for third-party content.

Across the United States, hundreds of similar lawsuits have been filed by parents, school districts, and state officials. Some previous lawsuits involving TikTok and Snap were settled before this trial concluded.

The ruling in Los Angeles may influence how courts assess the connection between platform design choices and user harms in other ongoing cases. Meta and Google have not yet indicated whether they will consider appealing this decision.