TikTok has added new tools designed to give parents more control over their children's use of the popular video sharing app.
The new tools were announced as several nations moved to ban TikTok on government devices and considered other possible actions.
The governments said they acted because of national security concerns. Several U.S. government agencies have warned that TikTok’s owner, Chinese company ByteDance, could be sharing user data with China’s government. Critics of the app have also said China could use TikTok to spread misinformation.
Legislators in the U.S. and Europe have also raised concerns about TikTok’s content, suggesting it can harm the mental health of young users.
ByteDance has long argued that it does not share data with the Chinese government and says its data is not held in China. The company also disputes accusations that it collects more user data than other social media companies, and it says it operates independently, without influence from the Chinese government.
More than two-thirds of American teenagers use TikTok. The app is so popular that it influences many areas of popular culture. But many parents have struggled to find an effective way to limit the amount of time their children spend on the app.
TikTok described the new safety tools in a statement published Wednesday on its website. The statement said the tools aim to assist young users and families in creating positive experiences as “people express themselves, discover ideas and connect.”
The changes include a default setting that limits TikTok use to one hour each day for users under the age of 18. When the tool takes effect “in the coming weeks,” young users of TikTok will receive a message after 60 minutes. They will then be asked to enter a passcode and make an “active decision” to keep watching.
For accounts where the user is under the age of 13, a parent or other responsible adult will have to set or enter an existing passcode to permit an additional 30 minutes of watch time.
TikTok said it decided on the 60-minute limit after receiving advice from child researchers and experts at the Digital Wellness Lab at Boston Children's Hospital.
TikTok and other social media apps have faced criticism for not doing enough to protect young users from inappropriate or harmful content.
A recent report by an anti-hate group suggested algorithms used by TikTok to keep users on the app can have harmful effects. The report, by the nonprofit Center for Countering Digital Hate, said some algorithms suggested videos about self-harm and eating disorders to young users.
TikTok also said Wednesday it will begin sending messages, called notifications, to young users suggesting that they set up a daily usage limit if they decide to opt out of the new 60-minute limit.
In addition, the app will expand offerings designed to give parents detailed information about their children’s overall usage. This tool includes data on how long a user spends on the app, the number of times TikTok was opened and a breakdown of total app usage during the day and night.
One existing TikTok safety tool sets accounts to private by default for those between the ages of 13 and 15. In addition, direct messaging is only available to accounts belonging to users who are 16 or older.
This week, the U.S. and Canada issued orders banning the use of TikTok on government-issued devices. The European Union has also barred use of the app on all employee devices.
Taiwan banned TikTok on government devices in December. And in 2020, India placed a ban on TikTok and a number of other Chinese apps because of privacy and security concerns.
I’m Bryan Lynn.