TikTok accused of ignoring warnings about eating disorder videos hitting billions of views

Leading charities have accused TikTok of ignoring warnings about a stream of eating disorder content on the social media platform after a report showed it failed to remove the harmful videos.

A coalition of 25 leading organizations, including the NSPCC and the Molly Rose Foundation, has issued a call to TikTok’s head of safety, Eric Han, for immediate action “to prevent further tragedy.”

They accused TikTok of “denying the issue, evading responsibility and delaying meaningful action” after expressing serious concern last year that the social media platform was being flooded with videos promoting suicide and eating disorders.

“The harms have been clearly evidenced, yet you continue to turn your back on the younger users you claim to be protecting. Your silence speaks volumes,” they wrote.

This comes after the Center for Countering Digital Hate (CCDH) published a study in December alleging that TikTok served eating disorder-related content to teens within eight minutes of logging into the video-sharing platform.

The report says the Chinese-owned social media platform recommends body image and mental health videos to teens every 39 seconds on average, and that its algorithm is “more aggressive” towards vulnerable users.

The researchers also found that eating disorder content easily evades moderation through the use of coded hashtags, in some cases borrowing the name of British singer Ed Sheeran.

The study identified 56 TikTok hashtags hosting eating disorder videos on the platform, which had amassed more than 13.2 billion views as of November last year.

However, a new analysis by CCDH released today shows that TikTok has only removed seven of those hashtags since concerns were raised three months ago, with 49 remaining active.

It shows that content about eating disorders has since grown by billions of views, with videos tagged with the 49 remaining hashtags amassing a total of 14.8 billion views by the end of January – an increase of 1.6 billion views since November.

Hashtags related to recovery were excluded from the study; the remaining hashtags link to videos featuring graphic imagery, discussions of body statistics, and users asking questions about calories.

The report also accused TikTok of “failing to adequately protect users attempting to access harmful content.”

It showed that 66 percent of the coded eating disorder hashtags carried a health warning linking to advice in the US, a figure that dropped to just 5 percent in the UK.

TikTok’s own analytics tools also show that, for almost half of the coded hashtags, 91 percent of views come from users under 24.

CCDH CEO Imran Ahmed accused TikTok of “putting profit before people.”

“Three months later, despite outrage from parents, politicians and the general public, this content continues to grow and spread unhindered on TikTok,” he said.

“TikTok’s algorithm is the social media equivalent of crack cocaine: it’s sophisticated, addictive, and leaves a trail of damage that its makers don’t seem to care about.

“Each image represents a potential victim – someone whose mental health could be harmed by negative body image, someone who could restrict their diet to dangerously low levels… The stakes are too high for TikTok to keep doing nothing, and too high for our politicians to sit back and do nothing.”

In a letter to Mr Han, the coalition of charities and advocacy groups also accused TikTok of ignoring “the detrimental impact of your platform’s content algorithm on the mental health and well-being of children.”

“We believe it is your responsibility to take immediate and decisive action to address this issue,” they added.

The group, which includes US organizations such as the American Psychological Association, called on TikTok to strengthen its content moderation policy to better protect young people with eating disorders from harmful content.

They also called on the social media platform to increase transparency and accountability by “reporting regularly on the steps you are taking to address these issues and the impact of those efforts.”

“We call on TikTok to take immediate and effective action to address this issue and prevent future tragedies,” they wrote.

Molly Russell died after viewing harmful content on social media, a coroner ruled last year (Image: PA)

Ian Russell, the father of Molly Russell, the 14-year-old British schoolgirl who took her own life after viewing self-harm posts on Instagram, was among those who signed the letter.

He previously told i that aggressive algorithms on social media platforms such as Instagram and TikTok “could offer more dangerous content as soon as a vulnerable teenager notices it”, putting young people’s lives at risk.

The inquest into Molly Russell’s death marked the first time that social media sites were formally blamed for the death of a child. Last September, a coroner ruled that the schoolgirl had died from “the negative impact of online content” on sites like Instagram and Pinterest.

Nearly a third of British teenagers are thought to use TikTok daily, with Ofcom’s 2021-22 News Consumption Survey showing that more UK teenagers get their news from the video platform than from the BBC or ITV.

According to the same Ofcom report, around 16 percent of three- and four-year-olds in the UK regularly watch TikTok content, rising to a third of five- to seven-year-olds.

Senior Tory politicians have warned that vulnerable adults on the platform are also at risk of serious harm, telling i that the proposed law to address the problem would not protect them.

The Online Safety Bill, currently in the House of Lords, provides for fines and possible jail time for tech bosses who fail to protect users under the age of 18 from “legal but harmful” material, including eating disorder-related content.

However, experts have accused the government of setting an arbitrary age limit for the sanctions, even though research shows that eating disorders are also common in adulthood.

A recent report from the Eating Disorders Genetics Initiative (EDGI) found that 53 percent of people with eating disorders began binge eating regularly by age 18, and 58 percent had reached their lowest weight by age 18.

Baroness Nicky Morgan, a former culture secretary and Conservative peer, told i this week: “Vulnerability doesn’t stop just because you’re 18.

“Why can these powerful platforms show this content to potentially very vulnerable people and not regulate it?”

Tom Quinn, director of external affairs at eating disorder charity Beat, said: “The government must do more to protect adults who have, or are vulnerable to, eating disorders online.

“Eating disorders affect people of all ages, and the government should develop a policy to protect people over 18 so that every person with an eating disorder feels safe online.”

A TikTok spokesperson said: “Many people struggling with an eating disorder or in recovery turn to TikTok for support, and we want to help them do so safely.

“Our Community Guidelines clearly state that we do not tolerate the promotion, normalization or glorification of eating disorders, and we have removed any content mentioned in this report that violates these guidelines.

“We remain open to feedback and scrutiny, and look forward to working constructively with partners who have expertise in these complex issues, just as we work with NGOs in the US and UK.”

Source: i News
