TikTok published this Wednesday (30) a new transparency report, detailing how content moderation operated on the social network in the first quarter of 2021.
The platform has published similar documents since 2019, but it recently moved to a biannual schedule and began including more information in response to requests from the community and regulators. The report's purpose is to show the actions the company has taken to remove posts that violate its community guidelines or usage policies.
One of those actions was removing profiles that may belong to users under the age of 13 — hosting such accounts is illegal and could expose the platform to legal consequences. The main figures are as follows:
61,951,327 posts were removed for violating Community Guidelines or Terms of Service (less than 1% of all videos posted on TikTok).
82% of these videos were removed before they received any views, 91% before any reports, and 93% within 24 hours of being posted.
1,921,900 ads were rejected for violating advertising policies and guidelines.
11,149,514 accounts were removed for violating Community Guidelines or Terms of Service; of these, 7,263,952 were removed for potentially belonging to a person under 13 years of age. Together they represent less than 1% of all accounts on TikTok.
71,470,161 automated account creations were blocked.
The full transparency report for Q1 2021 is available on the social network's page.