
TikTok recommends eating disorder videos to teens within minutes of signing up, report finds

TikTok shows teenagers eating disorder content within eight minutes of joining the video-sharing platform, according to a new report.

A study released Thursday by the Center for Countering Digital Hate (CCDH) found that TikTok recommends body image and mental health videos to teens every 39 seconds on average, and that its algorithm is “more aggressive” towards vulnerable users.

The study claimed that users who paused briefly on or liked body image and mental health videos were recommended eating disorder videos within eight minutes of joining the platform. They were also shown suicide-related content within 2.6 minutes.

The nonprofit’s researchers used a method similar to the one Instagram employed in a 2019 internal audit of its algorithms. Leaked presentation slides from that audit showed Facebook’s parent company knew the social network was exacerbating body image issues for one in three teenage girls.

CCDH researchers set up eight new TikTok accounts in the US, UK, Canada and Australia, each registered as belonging to a 13-year-old, the minimum age allowed by most social media sites.

Half were “standard” accounts, while the other half were designated “vulnerable” because their usernames contained phrases such as “lose weight” that indicated pre-existing vulnerabilities.

The decision was informed by a study from Reset, a technology reform initiative, which found that users with eating disorders often choose usernames containing similar words on Instagram.

Mobile phone records of Molly Russell, a 14-year-old British schoolgirl who took her own life after seeing self-harm posts on Instagram, showed that her social media username indicated underlying mental health issues.

CCDH said “vulnerable” users were quickly identified and “capitalised on” by TikTok. The study found they were recommended about 12 times more self-harm and suicide videos than the standard accounts, including content featuring razor blades and scissors.

The researchers also found that eating disorder content easily evades moderation through the use of coded hashtags, in some cases co-opting the name of British singer Ed Sheeran.

The study identified 56 TikTok hashtags hosting eating disorder videos, with more than 13.2 billion views in total.

The report claimed that TikTok failed to moderate or label videos that appeared to promote self-harm and body image issues, and that “health-promoting content is freely mixed with content that promotes eating disorders”.

CCDH chief executive Imran Ahmed told i that “young people’s feeds are being bombarded with harmful and distressing content” on the Chinese-owned social media platform. He added that TikTok “recognises the vulnerability and responds to it by increasing the amount of harmful content sent to that person”.

Almost a third of British teenagers are believed to use TikTok daily. Ofcom’s 2021-22 News Consumption Survey shows that more UK teenagers are getting their news from the video platform than from the BBC or ITV.

Meanwhile, around 16% of three- and four-year-olds in the UK regularly watch TikTok content, according to the same Ofcom report, with that figure rising to a third of children aged five to seven.

Ian Russell, father of Molly Russell, said the findings of the CCDH report were “shocking but not surprising”.

“These platforms are able to serve up ever more dangerous content as soon as a vulnerable teenager views it,” he told i.

“Social media platforms are very good at saying they take safety very seriously. But again and again… we find that some of the world’s largest commercial companies are not investing enough in the safety of their users, especially young and vulnerable teenagers who are at high risk of viewing harmful content.”

He urged the government to strengthen internet safety legislation, after ministers were accused last month of watering down the proposed law by removing a provision that would have forced social networking sites to take down “legal but harmful” material.

Mr Russell told i: “I think we’re driving down the information superhighway in a car designed to keep us driving, but no one has thought of seat belts, airbags or speed bumps.

“Safety was an afterthought, and the only way to make a difference is to put effective rules in place.”

Tom Quinn, director of eating disorder charity Beat, said: “It is extremely disturbing that TikTok is exposing teenagers to harmful eating disorder videos, with some users shown eating disorder content within as little as eight minutes of joining the platform.

“TikTok and other social media platforms must urgently take action to protect vulnerable users from harmful content. This includes working closely with eating disorder specialists, making their algorithms more accountable and transparent, and providing high-quality user support.”

A TikTok spokesperson said: “We regularly consult with healthcare professionals, address violations of our policies, and provide access to support resources to anyone who needs them.

“Recognising that triggering content is unique to each individual, we remain focused on fostering a safe and comfortable space for everyone, including those who choose to share their recovery journeys or educate others about these important issues.”

Source: i News
