As Twitter Sex Trafficking Case Proceeds, Platforms Face an Impossible Dilemma
A federal lawsuit accusing Twitter of sex trafficking can move forward, says U.S. Magistrate Judge Joseph C. Spero, in a decision that could portend a dangerous expansion of how courts define “sex trafficking.”
The case is one of the first to invoke the controversial 2018 Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), which made it a federal crime to host digital content that facilitates prostitution. The legislation also tweaked Section 230 of the Communications Decency Act, which protects digital entities like Twitter from being held liable for most of what their users post.
The case—filed in the U.S. District Court for the Northern District of California—was brought by two teenagers and the National Center on Sexual Exploitation (NCOSE), a conservative activist group, formerly known as Morality in Media, that also opposes Netflix, Amazon, and Cosmopolitan magazine. Spero’s August 19 decision hints at what similar sex trafficking claims against social media companies may look like in a world where Section 230 isn’t an obstacle.
It’s a worrying vision.
“We’re starting to see…what the actual impact of [FOSTA] is going to be, and are courts going to interpret it kind of in a more broad or a more narrow manner,” says Caitlin Vogus, deputy director of the Free Expression Project at the Center for Democracy & Technology. “Here is an example of a court interpreting it more broadly, and that raises a lot of concerns for the impact that it might have on platforms when they’re making decisions about how to respond to all speech on their platforms going forward.”
Traditionally, the crime of sex trafficking must involve “commercial sex acts”—a.k.a. prostitution—and there must be minors involved or an element of force, threatened force, fraud, or coercion. In short, someone must pay someone else (or give them something of value) in a quid pro quo that involves an attempted or completed nonconsensual sex act.
In the case against Twitter, the plaintiffs suggest that soliciting a sex video from someone under age 18 amounts to sex trafficking. Unwittingly providing a platform for a third party to post or link to that video makes one part of a sex trafficking enterprise, they argue. Thus, Twitter is allegedly guilty of participating in a sex trafficking venture by temporarily and unknowingly hosting links to a pornographic video featuring two teenagers.
Several years ago, the teens—who were 13 or 14 years old at the time—recorded themselves engaging in sexual activity and used Snapchat to share these videos with a third party initially believed to be a peer. The recipient turned out to be an adult, who allegedly blackmailed one of the teens (John Doe #1) into providing additional sexual content. Doe #1 blocked the Snapchat predator, and “communications ceased.”
The perpetrator could have been held individually responsible, since blackmail and soliciting sexually explicit material from minors are both crimes. Instead, NCOSE is going after a bigger, richer, and much more high-profile target—albeit one much less culpable for criminal activity.
At some point, a compilation of the videos was posted elsewhere online. In 2019, Doe #1 discovered that two Twitter accounts had shared links to it. Doe—and then his mom, separately—reported these to Twitter and were told Twitter would look into it. Twitter also advised that they report it to the National Center for Missing and Exploited Children and to law enforcement, their complaint says. Later, Twitter responded to one of Doe’s messages saying “no action will be taken at this time. … If the content is hosted on a third-party website, you’ll need to contact that website’s support team to report it.” Meanwhile, Doe’s mom had reached out to a Department of Homeland Security agent, who reached out to Twitter.
Nine days after Doe #1 first made contact, the tweets were deleted and the accounts that shared them suspended.
About a year later, Doe sued Twitter, accusing it of direct sex trafficking, benefiting from participation in a sex trafficking venture, receipt and distribution of child pornography, negligence, and violation of California’s product liability law.
What happened was clearly wrong, but it’s hard to see how it qualifies as sex trafficking. Yes, minors were involved, but no one paid them for sex, nor did they (or some third-party trafficker) get something of value for sending or posting the videos. But it’s crucial to NCOSE’s case that what happened be labeled as illegal commercial sexual activity and not some other criminal offense—otherwise, FOSTA wouldn’t apply. And if FOSTA doesn’t apply, then Section 230 does.
Section 230 says that for certain liability purposes—civil lawsuits, state (but not federal) criminal charges—computer service providers shouldn’t be treated as the speaker or publisher of user-generated content. If I defame you on Facebook, it’s me, not Facebook, who’s legally culpable. If I meet a Match.com date who assaults me, the assailant is guilty, not Match. And so on. But FOSTA exempted many claims involving illegal commercial sex from this rubric. Now, if someone deemed guilty of sex trafficking is found to have used a platform to meet or market a victim, that platform isn’t automatically shielded from various legal liabilities.
“Due largely to FOSTA, civil sex trafficking claims can now be brought against online platforms that had no direct involvement with the sex trafficking venture or the victims,” says First Amendment attorney Lawrence Walters, head of the Walters Law Group.
Twitter argues that FOSTA’s exception to Section 230 protection was meant to apply narrowly: only where a platform’s own conduct amounts to knowing participation in sex trafficking, not where a service unknowingly hosted content posted by third parties.