TikTok Warns Users About a Viral Suicide Video


TikTok has been trying to remove a suicide video uploaded to the platform on Sunday. TikTok management said that a group of users has been repeatedly re-uploading the video or using clips of it in their own content, and announced that the video will continue to be taken down and that offending accounts will be closed.

TikTok has been in the spotlight for the past few days over a disturbing video. A suicide video posted on the platform on Sunday went viral within hours. Worryingly, a group of users has been actively working to spread the footage. In response, TikTok management took action and issued a series of warnings.

TikTok spokesperson Hilary McQuaide stated that the video was removed immediately after it was shared, but that some users have uploaded it again and again. Noting that some users have embedded snippets of the suicide in their own content, McQuaide said such content is being carefully reviewed and that these videos will also be removed. Moreover, according to the spokesperson, the accounts of users who share the video are at risk of suspension.

The footage was not originally uploaded to TikTok. The incident took place last week, when a man living in the US state of Mississippi ended his life while broadcasting on Facebook Live; the recording then made its way to TikTok. Unfortunately, the video drew far more attention than expected, and the users trying to make it go viral largely succeeded.


Suicide footage is automatically detected and removed by TikTok’s algorithms

In their statements, TikTok officials noted that footage of the suicide itself, as well as content praising or encouraging suicide, violates the platform’s rules. Stating that every piece of content uploaded to the platform is screened by algorithms, the officials said that even if such videos are uploaded repeatedly, they will be removed immediately.

“Your accounts can be closed”

TikTok spokesperson Hilary McQuaide said that users who attempt to share the recently circulating suicide video will have their accounts closed immediately. Noting that many users have already reported the accounts sharing the video, McQuaide thanked them and said the company is grateful for their efforts to protect the platform.

This is not the first incident of its kind, and Facebook Live has frequently been the venue for such broadcasts. A 2017 investigation by BuzzFeed News identified at least 45 instances of sensitive content on Facebook Live in the roughly two years since its 2015 launch, including suicide, shootings, murder, torture, and child abuse. Although Facebook says its algorithms stop such content, some of it has evidently slipped through.

