The Third Circuit's ruling is a serious threat to Big Tech's Section 230 legal shield. Good to see it happening.
Excerpts:
TikTok, Inc., via its algorithm, recommended and promoted videos posted by third parties to ten-year-old Nylah Anderson on her uniquely curated “For You Page.” One video depicted the “Blackout Challenge,” which encourages viewers to record themselves engaging in acts of self-asphyxiation. After watching the video, Nylah attempted the conduct depicted in the challenge and unintentionally hanged herself. Nylah’s mother, Tawainna Anderson, sued TikTok and its corporate relative ByteDance.
Third Circuit decision:
Because TikTok’s “algorithm curates and recommends a tailored compilation of videos for a user’s FYP based on a variety of factors, including the user’s age and other demographics, online interactions, and other metadata,” it becomes TikTok’s own speech. And now TikTok has to answer for it in court. In essence, the court ruled that when a company chooses what to show kids and elderly parents, and works to keep them addicted so it can sell more ads, it can’t pretend it’s everyone else’s fault when the inevitable horrible thing happens.
One of the three judges invoked common law in the ruling:
Instead of arguing that TikTok was the publisher of the challenge videos, Matey made the point that TikTok was their distributor. Drawing on common-carriage traditions, he pointed to an older, pre-internet legal distinction between “publisher” liability for speech, which is to say the author’s or publisher’s responsibility, and “distributor” liability for speech, which is to say the role of the bookstore, library, or newsstand.
The linked article includes speculation on how the SCOTUS justices may vote. Could see strange bedfellows on this one.
Click here to read the details.
12:02 pm on August 29, 2024