A San Francisco federal judge overseeing a lawsuit from a woman who alleges a sex trafficker used Meta Platforms Inc.'s Instagram app to sell her for sex said at a Tuesday hearing that she was wrestling with the scope of the tech industry's federal immunity law.
Judge Rita F. Lin said differing legal precedents from appeals courts about the powers of Section 230 of the federal Communications Decency Act left her conflicted about whether that legal shield applies to Meta.
“I’m having a hard time reconciling these cases in some ways,” Lin said of the different rulings.
Section 230, passed by Congress in 1996, immunizes internet platforms from civil lawsuits that stem from actions and content posted by users.
The plaintiff, who goes only by Jane Doe, alleged that Meta has “knowingly created a breeding ground for human trafficking” on Instagram by failing to properly verify the identities of user accounts.
The victim alleged that in 2017 she was contacted on Instagram by another user who groomed her by gaining her trust and then publicly posted photos of her that “obviously advertised Jane Doe for sale for sex,” according to the complaint filed in 2022.
The suit was first brought in Texas state court but later transferred to the US District Court for the Northern District of California.
Meta moved to dismiss the case on Section 230 grounds, arguing that the harm in the case ultimately stems from the trafficker and his communications with the plaintiff, not from any conduct by Meta.
Lin said the Ninth Circuit case Lemmon v. Snap Inc., which found that Section 230 doesn't apply, "is probably the best case" for the plaintiff. In that case, the parents of two boys who died in a car crash sued Snap over the Snapchat app's "speed filter," which shows how fast a user is moving and allegedly encouraged the boys to speed while driving.
But Lin also said that a more recent Ninth Circuit ruling, Estate of Bride v. Yolo Technologies Inc., appears to go against the plaintiff’s legal theory. In that case, the family of a child who was cyberbullied and died by suicide sued YOLO, a platform that allows users to send each other anonymous messages. The appeals court found that Section 230 does apply because the family sought to hold YOLO liable for the speech of its users, even if the harm to the child occurred offline.
Meta attorney Kristin Linsley of Gibson Dunn & Crutcher LLP said the ruling in YOLO should resolve the entire case. A "straightforward" analysis shows that the victim was ultimately harmed by the communication from the trafficker, she said. "This is in line with many other Section 230 precedents that say it's not about artful pleading," Linsley said.
The plaintiff’s attorney, Walter Simons of Bracewell LLP, countered that the victim is seeking to hold Meta liable for not properly verifying the identities of its users, which has nothing to do with online content. “Checking someone’s identity is not a traditional function of a publisher,” he said.
Lin said what she found most "challenging" about the plaintiff's argument was determining whether the harm ultimately stems from online content created by users, in which case Section 230 immunity would apply.
The plaintiff’s theory of liability “appears directed at the content that is produced when you have anonymous conduct, much like cyberbullying,” Lin said.