A judge dismissed a grieving Pennsylvania mother’s lawsuit against the social media app TikTok and its parent company.
Plaintiff Tawainna Anderson blamed TikTok’s algorithm for showing daughter Nylah Anderson, 10, a video of the deadly “Blackout Challenge,” but Judge Paul S. Diamond ruled that Section 230 of the Communications Decency Act shielded the defendants from claims.
“Nylah Anderson’s death was caused by her attempt to take up the ‘Blackout Challenge,'” Judge Paul S. Diamond wrote in a Tuesday dismissal. “Defendants did not create the Challenge; rather, they made it readily available on their site. Defendants’ algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it. In thus promoting the work of others, Defendants published that work—exactly the activity Section 230 shields from liability. The wisdom of conferring such immunity is something properly taken up with Congress, not the courts. I will thus grant Defendants’ Motion on immunity grounds. In light of my decision, I need not address Defendants’ contentions respecting jurisdiction and failure to state a claim.”
According to the complaint, Nylah participated in the “Blackout Challenge” at her home on Dec. 7, 2021. She asphyxiated herself. Her mother found her hanging from a purse strap in a bedroom, Diamond wrote in his ruling. Tawainna rushed her daughter to the hospital, but the child died on Dec. 12 after lingering for days in the pediatric intensive care unit.
“She was a butterfly,” Tawainna told WPVI in a December report. “She was everything. She was a happy child.”
The complaint described the “Blackout Challenge” as a viral trend that “encourages children to choke themselves until passing out,” made all the more dangerous by an algorithm that introduced the content into Nylah Anderson’s feed.
“TikTok is programming children for the sake of corporate profits and promoting addiction,” plaintiffs wrote.
From the lawsuit:
The viral and deadly TikTok Blackout Challenge was thrust in front of Nylah on her TikTok For You Page (“FYP”) as a result of TikTok’s algorithm which, according to the TikTok Defendants, is “a recommendation system that delivers content to each user that is likely to be of interest to that particular user…each person’s feed is unique and tailored to that specific individual.”
“Make sure you’re checking your kids’ phones,” Tawainna said. “Just pay attention because you never know what you might find in their phones or the things they’re trying that you think 10-year-olds wouldn’t try. They’re trying because they’re kids, and they don’t know no better.”
Defense attorneys argued that the Pennsylvania court had no jurisdiction over TikTok or its parent company, ByteDance. They also said that Section 230 granted them immunity for content posted by a third party on the platform.
“Defendants have no legal duty of care to protect against third-party depictions of dangerous activity that would give rise to a negligence claim,” the company said in an answer to the plaintiff claim of negligence.
Diamond agreed on the matter of Section 230.
“Because I conclude that Section 230 precludes Anderson’s products liability and negligence claims—on which her wrongful death and survival claims depend—I will grant Defendants’ Motion,” he wrote.
“The Anderson family will continue to fight to make social media safe so that no other child is killed by the reckless behavior of the social media industry,” plaintiff attorney Jeffrey P. Goodman of Saltz Mongeluzzi & Bendesky P.C. said in a statement to Law&Crime on Wednesday. “The federal Communications Decency Act (CDA) was never intended to allow social media companies to send dangerous content to children, and the Andersons will continue advocating for the protection of our children from an industry that exploits youth in the name of profits.”
[Photos of Nylah Anderson via lawsuit, image of Tawainna Anderson via WPVI screengrab]