A Los Angeles jury has awarded $6 million to a young woman, identified as Kaley, who sued Meta and Google over social media addiction that severely affected her mental health. Kaley revealed she started using Instagram at age nine and YouTube at six, eventually developing anxiety, depression, and body dysmorphia. At times she spent as many as 16 hours a day on the platforms.
The jury determined that both Meta and Google intentionally created addictive features, such as Instagram’s infinite scroll, and failed to safeguard minors from harm. Meta will be responsible for 70% of the damages, with Google covering the remainder. Both companies have announced plans to appeal, arguing that teen mental health issues are complex and cannot be solely attributed to one platform.
In a related development, a New Mexico state court jury recently ordered Meta, the parent company of Facebook and Instagram, to pay $375 million for knowingly violating state laws designed to protect children online. The 2023 lawsuit, filed by New Mexico Attorney General Raúl Torrez, followed an undercover operation where a fake profile of a 13-year-old girl attracted extensive contact from online predators.
The jury found that Meta’s platforms breached the state’s unfair practices act by prioritizing profits over user safety. Torrez described the verdict as “a historic victory for every child and family,” citing internal documents that revealed company executives were aware of the risks. Meta stated it would appeal and highlighted the safeguards it has implemented across its apps.
The trial’s next phase, set for May, will address whether Meta created a public nuisance and should finance programs to mitigate the harms caused. Officials are also advocating for app design reforms, including stricter age verification processes and enhanced measures to prevent predator activity.
These consecutive rulings mark a significant turning point in holding social media companies accountable for harms to minors. Experts have likened the cases to the Big Tobacco lawsuits of the 1990s, referencing internal documents showing that executives prioritized growth and engagement over user well-being.
Outside the Los Angeles courthouse, parents and advocacy groups celebrated Kaley’s victory as a landmark moment for corporate accountability. Similar lawsuits involving platforms like Snap and TikTok are ongoing, testing legal approaches that circumvent Section 230 protections by focusing on app design flaws.
Analysts suggest these rulings could prompt systemic reforms, including mandatory age verification, stricter safety standards, and restrictions on addictive features. Internationally, countries such as Australia have already imposed limits on social media use by children, while the UK is trialing bans for users under 16.
Torrez emphasized that New Mexico’s legal strategy could establish a national and global precedent, demonstrating that tech giants cannot evade responsibility regarding child safety. With additional trials planned in California and other states, social media companies may face increased scrutiny, heavier financial penalties, and mounting pressure to redesign platforms to better protect younger users.
