TikTok is introducing screen time limits to address issues with underage use of the social media app.
Families are struggling to control the time their children spend on the Chinese video-sharing platform.
The changes come at a time when governments around the world are increasingly skeptical about the app’s security.
Who does this apply to?
As part of the changes, every account belonging to a user under 18 will default to a 60-minute daily screen time limit over the coming weeks.
Cormac Keenan, TikTok’s head of trust and safety, said in a blog post Wednesday that when minors reach the 60-minute limit, they will be prompted to enter a passcode and make an “active choice” to continue browsing.
For accounts where the user is under the age of 13, a parent or legal guardian must set or enter an existing passcode to grant an additional 30 minutes of viewing time after the initial 60-minute limit is reached.
TikTok said the 60-minute threshold was chosen in consultation with academic research and experts from the Digital Wellness Lab at Boston Children’s Hospital.
TikTok also said it would prompt teens to set their own daily screen time limit if they opt out of the default 60-minute cap. The company will also send weekly inbox notifications to teen accounts summarizing their screen time.
TikTok’s existing safety features for young users include accounts set to private by default for 13- to 15-year-olds, and direct messaging restricted to users aged 16 or over.
TikTok has also announced changes for all users, including the ability to set a custom screen time limit for each day of the week and to schedule times when notifications are muted.
The company is also launching a sleep reminder to help people plan when to go offline at night. Users set a time, and when it arrives a pop-up reminds them that it’s time to log off.
Concerns about using TikTok
In 2022, children in the UK spent an average of 114 minutes per day on TikTok.
A study released in December by the Center for Countering Digital Hate (CCDH) found that TikTok surfaced potentially harmful content to youth accounts within eight minutes of joining the video platform.
The report says the Chinese-owned platform, which has over a billion users worldwide, recommends body image and mental health videos to teens every 39 seconds on average, and that its algorithm is more “aggressive” towards vulnerable users.
Earlier this year, TikTok updated its community guidelines to toughen its stance on content related to eating disorders. Videos promoting unhealthy eating or related habits are now banned from the platform, and moderators work to remove them.
However, the CCDH found that users easily bypassed moderation by disguising content as recovery-related and using coded hashtags, in some cases co-opting the name of British singer Ed Sheeran.
Social media executives, including TikTok’s, have been called before the US Congress to explain how they protect young users from harm.
Beyond concerns about excessive use by minors, there is growing unease worldwide over the app’s security.
The European Parliament, the European Commission and the Council of the EU have banned the installation of TikTok on official devices.
This followed similar action by the US federal government, Congress and more than half of the 50 US states.
Canada has also banned its use on government devices.
Source: I News
