Addicted By Design?
A LANDMARK VICTORY
A Los Angeles jury delivered an unprecedented verdict in favour of a young woman, Kaley, who sued Meta and YouTube over childhood social media addiction.
Jurors concluded that Meta and Google intentionally designed addictive platforms that harmed Kaley’s mental health. The court awarded her $6 million in damages, split equally between compensatory and punitive damages, citing “malice, oppression, or fraud” in how the platforms operated.
WHO PAYS WHAT?
Meta, which owns Instagram, Facebook, and WhatsApp, will bear 70% of the damages, while Google, YouTube’s parent company, will cover the remaining 30%.
Both companies have said they disagree with the verdict and plan to appeal.
KALEY’S STORY
Kaley began using YouTube at age six and Instagram at nine, encountering no effective age restrictions. She testified that her usage replaced time with family and contributed to anxiety, depression, and body dysmorphia, including obsession with appearance influenced by filters.
THE CASE AGAINST PLATFORMS
Mark Lanier, her lawyer, compared the platforms’ design to dopamine-seeking “slot machines”, arguing that YouTube and Meta operated like “digital casinos”, with endless-scroll features fuelling dopamine hits and, in turn, addiction.
He also argued that the companies deliberately targeted young users because children were more likely to stay on the platforms long term.
INSIDE THE COURTROOM
During testimony, Instagram head Adam Mosseri described Kaley’s 16-hour daily usage as “problematic” but denied it proved addiction.
Meanwhile, Mark Zuckerberg defended Meta’s policies, including restrictions on users under 13, while acknowledging challenges in enforcement.
A SECOND VERDICT INTENSIFIES PRESSURE
A day before the Los Angeles ruling, a New Mexico jury found Meta liable under Consumer Protection Law in a case linked to child sexual exploitation.
The court concluded Meta had misled the public about platform safety while reducing protections. Damages were set at $375 million.
Evidence cited internal concerns over expanding end-to-end encryption, which limited detection of abuse material and led to a sharp drop in reported exploitation cases.
PUSH FOR REGULATION
Experts say the back-to-back verdicts in Los Angeles and New Mexico reflect growing public frustration with social media companies.
In the US, proposed legislation like California’s Assembly Bill 1709 aims to block access for users under 16.
Australia has introduced restrictions on children’s social media use, while the UK is exploring a ban for under-16s.
TAKEAWAYS FOR TECH COMPANIES
Audit Product Design for Safety
Platforms must review features that could be seen as addictive and clearly document how user safety was considered in design decisions.
Strengthen Protections for Minors
Robust age verification, parental controls, and limits on algorithmic feeds are essential as regulations tighten and legal risks grow.
Evaluate and Justify Encryption Choices
Privacy features like end-to-end encryption must be assessed for child safety risks, with clear documentation of decisions and trade-offs.

