The Future. Los Angeles Superior Court Judge Carolyn Kuhl refused to dismiss a series of product liability lawsuits brought by hundreds of government officials and parents of minors against Meta, TikTok, Snap, and Google. That’s surely making Silicon Valley sweat, opening the industry up to the same type of multi-billion dollar settlements that kneecapped Big Tobacco and Big Pharma. While a jury will ultimately decide the outcome of the cases at this level, expect the lawsuits to appeal their way to the Supreme Court.
Big Tech is on the path to getting a big slap on the wrist.
- The heart of the lawsuits is a “public nuisance theory,” which argues that the design of the social platforms themselves, not just the third-party content they host, harms kids.
- That includes TikTok’s continuous scrolling feature and the inability to turn off autoplay on videos, Instagram’s filters and lenses that amplify body image issues, and every platform’s lack of robust parental controls.
- Kuhl also points to Meta’s own buried internal research on how Instagram’s UX harms teen mental health.
By letting the lawsuits advance, Judge Kuhl is defining the limits of Section 230 — the legal shield that protects platforms from being liable for the content hosted on them. When the “provider manipulates third-party content in a manner that injures a user,” Section 230 no longer applies. She called the allegedly addictive features “defective design.”
Kuhl even cites a federal lawsuit that found Snap could be liable for damages after a speedometer feature on the app allegedly encouraged speeding and contributed to a fatal crash.
Unfortunately for Big Tech, the consequences may be baked into the code.