
The lawsuit alleges that malicious content on TikTok caused the deaths of two teenagers

What just happened? The impact of social media platforms on the mental health of young users is under renewed scrutiny after the French families of seven teenage girls filed a lawsuit against TikTok. They claim the platform exposed their children to harmful content, leading two of them to take their own lives at the age of 15.

The lawsuit, filed in Créteil, alleges that TikTok's algorithm served teenagers videos promoting suicide, self-harm, and eating disorders.

“The parents want TikTok’s legal responsibility to be recognized in court,” lawyer Laure Boutron-Marmion told franceinfo. “This is a commercial company that offers goods to consumers who, moreover, are minors. Therefore, they must be responsible for product defects.”

In September 2023, the family of 15-year-old Marie filed a criminal case against TikTok after her death, accusing the platform of "inciting suicide," "failing to help a person in danger," and "promoting and advertising methods of suicide," Politico reports. TikTok's algorithm allegedly trapped Marie in a bubble of toxic content related to the bullying she suffered because of her weight.

TikTok has faced numerous lawsuits in the US over claims that it is harmful to the mental health of young people. In 2022, the families of several children who died while attempting a dangerous TikTok challenge sued the company and its parent, ByteDance, after the app allegedly recommended "blackout" strangulation videos to minors, all of whom were ten years old or younger.

Last month, a group of 14 state attorneys general filed lawsuits against TikTok, accusing it of harming children's mental health and violating consumer protection laws. TikTok allegedly uses manipulative features to keep young users on the platform longer, including infinite scrolling, autoplay videos, and frequent push notifications.

It's not just TikTok that is in the spotlight over the potential harm it can cause to young people; all social media platforms face similar scrutiny. Last October, the attorneys general of more than 40 US states sued Meta for harming the mental health of children.

During a Senate hearing on child safety in January, Meta CEO Mark Zuckerberg apologized to parents in the audience who said Instagram had contributed to their children's suicide or exploitation.

The impact of social media on the mental health of not only children but also adults has prompted the US Surgeon General to urge Congress to require cigarette-style warning labels on these sites and apps, cautioning users about the potential harm they can cause.

Social media companies typically shield themselves behind Section 230 of the Communications Decency Act of 1996, which protects them from liability for content posted by users.

TikTok is still facing a potential ban in the US. Amid national security concerns over China’s ownership, President Joe Biden signed legislation in April requiring ByteDance to divest its US operations by January 19, 2025, or face a nationwide ban.