TikTok last week published the Q2 2021 Community Guidelines Enforcement Report, revealing details about content that violates terms of service and accounts removed from the platform. In addition, the social network announced an update to protect users from abusive and toxic behavior in live broadcasts.
Between April and June of this year, more than 81.5 million videos were removed for violating the community guidelines or terms of service. According to the company, that figure represents less than 1% of all videos uploaded to TikTok. For comparison, 61.9 million videos were removed in the first quarter of 2021.
Of those removals, 93% happened within 24 hours of posting, 94.1% occurred before any user reported the content, and 87.5% of the videos were taken down before receiving a single view.
The company attributes the improvement in these metrics to systems that proactively flag hate symbols, abusive language, and other signs of abuse for review by its safety teams. For example, 73.3% of harassment and bullying videos were removed before any report was filed, up from 66.2% in the first quarter of the year.
The company also said that 4.6 million of the removed videos were reinstated after appeals from their creators, and that its team is working to reduce the number of false positives.
“Harassment in general, and hate speech in particular, are highly nuanced, contextual issues that can be difficult to detect and moderate correctly every time,” said Cormac Keenan, Head of Trust and Safety at TikTok.