Is It TikTok’s Fault When Children Die Making Dangerous Videos?
Earlier this month, the parents of two girls filed suit against the social media platform TikTok and its parent company, ByteDance. According to the suit, the girls died while imitating behavior from videos that TikTok knew were dangerous and should have removed. The suit asks for a jury trial and unspecified monetary damages.
One of the largest social media outlets on the planet, TikTok features short videos created and submitted by users. As with many platforms, it uses an algorithm to determine what users see. Based on individual users’ demographics and the videos they engage with, the site curates content to display on their “For You Page” (FYP).
Unique among platforms, TikTok features "challenges," in which users film themselves doing certain tasks or activities and then encourage others to do the same. Typically, participants all tag their videos with a shared hashtag so the entries can be easily cataloged.
In this case, the parents allege that their children, an 8-year-old girl and a 9-year-old girl, each died while taking part in the “blackout challenge” in which participants film themselves holding their breath or asphyxiating until they pass out. In fact, in just over 18 months, at least seven children have died after apparently attempting the challenge.
The story is truly tragic. But it’s not clear that TikTok is uniquely responsible, nor that it should be held legally liable.
The parents are being represented in part by attorneys from the Social Media Victims Law Center (SMVLC), which bills itself as “a legal resource for parents of children harmed by social media addiction and abuse.” The lawsuit accuses TikTok of inadequate parental controls and of doing too little to prevent the proliferation of dangerous content. It alleges, “TikTok actively tries to conceal the dangerous and addictive nature of its product, lulling users and parents into a false sense of security.”
The parents refer to the platform’s algorithm as “dangerously defective,” “direct[ing users] to harmful content” in a “manipulative and coercive manner.” Specifically, they say the algorithm “directed exceedingly and unacceptably dangerous challenges and videos” to each child’s FYP.
Of course, children imitating dangerous behavior did not originate with TikTok. In the early 2000s, after the premiere of MTV’s