Seven French families have sued TikTok, alleging the platform exposed their teenage children to harmful content linked to two suicides

The families say TikTok’s algorithm steered their teenage children toward harmful material, with tragic consequences.

Seven French families have sued TikTok, alleging that the social media platform’s algorithm exposed their teenage children to harmful content and that, as a result, two of them took their own lives at the age of 15.

The case, brought before the Créteil judicial court, alleges that TikTok’s algorithm pushed videos promoting suicide, self-harm, and eating disorders. Laure Boutron-Marmion, the families’ lawyer, says it is the first case of its kind in Europe seeking to hold TikTok accountable.

“Parents want TikTok’s legal liability to be recognized in court,” Boutron-Marmion told the broadcaster franceinfo. She said the company should be held responsible for the harm that results from targeting young people with its product. According to the complaint, TikTok’s content moderation failed to protect young users from harmful content.

TikTok and other social media platforms have faced intense scrutiny over how they moderate content and how that content affects young users. In the US, hundreds of lawsuits have been filed against TikTok and against Meta’s platforms, Facebook and Instagram, alleging that their algorithms have damaged children’s mental health by fostering addiction and harming millions of young users.

TikTok did not immediately respond to a request for comment on the case. The company has previously said that it prioritizes children’s mental health, and CEO Shou Zi Chew told US lawmakers earlier this year that TikTok has invested in measures to protect young users.
