
Algorithms are about to take the stand in the Supreme Court

Illustration by Kate Walker

The Future. The upcoming Supreme Court case Gonzalez v. Google asks whether social media platforms lose the protection of Section 230 of the Communications Decency Act, which says platforms won’t be “treated as the publisher or speaker of any information provided by another information content provider,” when their algorithms recommend something illegal, even accidentally. That may seem straightforward in theory but is trickier in practice, since content moderation is an imperfect science at scale. Expect the whole industry to be watching this closely.

Publisher or speaker?
Gonzalez v. Google, set to be heard in February, could radically change how social media algorithms interact with users, according to Fast Company.

  • Filed by the family of American woman Nohemi Gonzalez, who was murdered in an ISIS attack in Paris in 2015, the case places part of the blame on YouTube for hosting and recommending ISIS videos that were used to radicalize and recruit some of the terrorists.
  • The plaintiffs argue that recommending the content violates the Antiterrorism Act of 1990 because the videos were created by a foreign terrorist organization.

In a big win, the Biden Administration agrees with the plaintiffs’ thinking. The U.S. Solicitor General filed an amicus brief in support of the plaintiffs, saying that, per Fast Company’s summary, “Section 230 shouldn’t be interpreted to give platforms like YouTube legal cover for aiding and abetting a foreign adversary like ISIS.”

Recommendation tightrope
But the case takes an even broader view, saying that YouTube acts like a “publisher or speaker” anytime it algorithmically and selectively suggests specific content to specific users. That’s pretty much what social media is built on.

There would be big repercussions if that argument is upheld. If platforms feel they’re open to lawsuits for anything their algorithms push, they may become more hesitant to host or recommend content that could be considered controversial in any way, shape, or form… even if it’s not necessarily illegal. The headache may not be worth the liability.
