Social media platform TikTok said it removed nearly 7.3 million accounts suspected of belonging to underage children in the first quarter of this year. The platform said those profiles accounted for less than 1 percent of its global users.
TikTok, which is especially popular with young audiences, generally requires users to be at least 13 years old. This is the first time the platform has published such figures, which appeared in its Community Guidelines Enforcement Report.
TikTok promised more transparency
The company said it hopes the details on underage users will help the industry move forward on transparency and accountability for user safety.
The report also noted that more than 61 million videos were removed for violating the app's rules, more than 1 million ads were rejected for breaching its policies and guidelines, and more than 11 million accounts in total were removed for terms-of-service violations.
“To give more visibility to the actions we take to protect minors, we’ve added the number of accounts that were removed because they were potentially owned by a minor,” said Cormac Keenan, the company’s head of trust and safety.
TikTok emphasized that it will take various measures to protect young people on the platform, including restricting features such as private messaging and live streaming to users aged 16 and over. In a statement in January, the company also said that accounts of users under 16 would be set to private automatically.
In January, the Italian data privacy watchdog ordered TikTok to block accounts of minors after a 10-year-old girl died attempting a viral challenge on the app, warning the company that it could face bans and penalties otherwise.