Mother alleges TikTok’s algorithm recommended the challenge that led to her daughter’s death

A young woman is sitting on the sofa, holding a phone with the TikTok logo.

Photo: phBodrova (Shutterstock)

The mother of a 10-year-old girl who died last year is suing TikTok and its parent company ByteDance over allegations that the company’s algorithm promoted a so-called ‘Blackout Challenge’ in the child’s feed.

In a lawsuit filed Thursday, Tawainna Anderson, of Pennsylvania, said her daughter Nylah died last year after suffocating while attempting the so-called ‘Blackout Challenge,’ which, according to the initial complaint, encourages people to record themselves holding their breath or choking themselves until they pass out. Anderson said she rushed her daughter to a local hospital on December 7, but Nylah died of her injuries on December 12.

Court documents claim the challenge was recommended to her through the algorithm, which “determined the deadly Blackout Challenge to be well-suited and likely to be of interest to 10-year-old Nylah Anderson.”

The Blackout Challenge is said to have circulated on the platform for years, and similar choking games have been part of schoolyard culture for decades. Nylah’s death nonetheless follows a string of similar, high-profile incidents linked to the challenge over the past few years: a 10-year-old girl in Italy died after attempting it last January, and a 12-year-old Colorado boy died in April 2021 after doing the same.

In an emailed statement, a TikTok spokesperson said, “This disturbing ‘challenge’, which people seem to learn about from sources other than TikTok, long predates our platform and has never been a TikTok trend. We remain vigilant in our commitment to user safety and will immediately remove related content if found. Our deepest sympathies go out to the family for their tragic loss.”

The platform has explicit rules against content that promotes self-harm. The app also has a curated version for users under 13 that limits the personal details they can share and restricts their ability to comment on or post content, but it is unclear how its automated systems would prevent such content from appearing in users’ feeds.

The platform is rated 12+ on the Apple and Google app stores, but, as with most apps, all it takes to create an account is claiming to be above the age limit. The company says it deleted more than 15 million suspected underage accounts last year.

During a press conference on Thursday, one of Anderson’s lawyers, Bob Mangeluzzi, said: “TikTok is one of the most powerful and technologically advanced companies in the world, so what did TikTok do once it learned that [...] [they] used their app and algorithm to deliver a Blackout Challenge video to a 10-year-old child.”

The complaint describes the app’s algorithm as intentionally designed to “maximize user engagement and addiction,” encouraging children to return to the app repeatedly. The lawsuit targets TikTok both as the designer of the algorithm and as the distributor that promoted the content to Nylah.

“It’s time for these dangerous challenges to end,” Anderson said at the press conference. “Something has to change, something has to stop because I wouldn’t want another parent to go through what I’m going through.”

This isn’t the only legal action against TikTok over allegations that it promotes content harmful to children. In March, news broke that several attorneys general are investigating whether TikTok harms young users and whether the company is aware of the content those users see.

TikTok has quickly become one of the most popular social media platforms available, and it is expected to generate more advertising revenue this year than Twitter and Snap combined.
