TikTok is making three changes to its mobile app intended to make watching videos safer for its young audience. Last year, the UK communications regulator Ofcom issued new rules requiring platforms such as TikTok to protect users under 18. Similar rules have been discussed in the US, but no official decision has been made there. TikTok announced some of these changes a while ago, and its approach is not entirely unique.
For example, last July Instagram introduced its Sensitive Content Control, which lets users specify how much sensitive content appears in their Explore feed. Split into two stricter settings, Limit and Limit Even More, the idea is to let users choose the amount of sensitive content they are comfortable seeing. Facing sharp criticism from lawmakers, parents, and child-safety organizations, TikTok promised concrete changes in February of this year to how it handles and promotes content for its young audience, and has now revealed the results of that work.
First in line is a new behind-the-scenes categorization system that sorts videos by what TikTok calls "thematic maturity." The ByteDance-owned platform has not shared details about the signals that drive the categorization, saying only that it is similar to the maturity ratings audiences are used to seeing on movies, TV shows, and games. In the coming weeks, the company will layer on an automated system that blocks content with overtly mature themes for users aged 13 to 17. Beyond flagging videos with adult subject matter, the system will also target videos with complex themes unsuitable for viewers under 18. For example, if a video, even a fictional one, is deemed too frightening, it will receive a maturity rating that prevents users under 18 from seeing it.
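As a rough illustration of how such a movie-style rating gate might work, here is a minimal sketch. The labels, age cutoffs, and function names are assumptions for illustration; TikTok has not published how its system is implemented.

```python
# Hypothetical sketch of age-gating videos by a maturity label,
# in the spirit of the movie/TV-style ratings the article describes.
from enum import IntEnum


class Maturity(IntEnum):
    """Minimum viewer age implied by each (assumed) maturity label."""
    GENERAL = 0   # suitable for everyone
    TEEN = 13     # suitable for teens and up
    MATURE = 18   # overtly mature themes, adults only


def visible_to(user_age: int, video_maturity: Maturity) -> bool:
    """A video is shown only if the viewer meets its minimum age."""
    return user_age >= video_maturity
```

Under this scheme a video rated `Maturity.MATURE` is hidden from the 13-to-17 group automatically, with no per-video moderation decision needed once the label is assigned.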
Three steps to one noble goal
Another significant change coming to the video-sharing platform is the ability to filter content by specific hashtags or words. Much like Twitter's advanced mute options for problematic words and annoying hashtags, TikTok users will be able to do the same for videos appearing in the "For You" and "Following" feeds. TikTok says the feature will roll out to its global audience in the coming weeks. This goes beyond blocking sensitive or triggering content; it will also spare users an avalanche of similar videos they no longer want to see.
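Conceptually, this kind of muting is a straightforward filter over the feed. The sketch below shows one way it could work; the `Video` fields and matching rules are assumptions, not TikTok's actual implementation.

```python
# Illustrative sketch of muting videos by keyword or hashtag,
# similar in spirit to the feed filtering the article describes.
from dataclasses import dataclass, field


@dataclass
class Video:
    caption: str
    hashtags: set = field(default_factory=set)


def filter_feed(feed, muted_words, muted_hashtags):
    """Drop videos whose caption contains a muted word or whose
    hashtags overlap the muted-hashtag list (case-insensitive)."""
    words = {w.lower() for w in muted_words}
    tags = {h.lower().lstrip("#") for h in muted_hashtags}
    kept = []
    for video in feed:
        caption_words = set(video.caption.lower().split())
        video_tags = {t.lower().lstrip("#") for t in video.hashtags}
        if caption_words & words or video_tags & tags:
            continue  # muted: skip this video
        kept.append(video)
    return kept
```

A user who mutes the word "diet" and the hashtag "#pets" would simply stop seeing any video matching either, regardless of how the recommendation algorithm scored it.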
TikTok is also tackling the problem of videos on a single sensitive topic bombarding a user's feed once the algorithm detects that the user has liked or otherwise interacted with such content. The company says it will not recommend a run of videos on the same topic when that topic is specific and delicate, for example "dieting, extreme fitness, sadness and other well-being topics." TikTok says testing of the system among its US audience produced positive results. Work is also underway to correct algorithmic tendencies that push users toward a narrow range of content, locking them in a bubble of similar ideas that could prove harmful in the long run. The main goal here is to improve the diversity of both content subjects and creators.
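One simple way to break up runs of same-topic recommendations is to defer repeats until a different topic has appeared. The sketch below is a hypothetical illustration of that idea; the topic labels, the run threshold, and the overall approach are assumptions, not TikTok's disclosed algorithm.

```python
# Hypothetical sketch of diversifying a recommendation list so that
# no more than `max_run` consecutive items share the same topic.
def diversify(candidates, max_run=2):
    """Reorder (topic, video_id) pairs, deferring items that would
    extend a same-topic run past `max_run`. Deferred items are
    reinserted after a different topic appears; any that never fit
    are appended at the end as a fallback."""
    result, deferred = [], []
    for item in candidates:
        run = result[-max_run:]
        if len(run) == max_run and all(t == item[0] for t, _ in run):
            deferred.append(item)  # would make the run too long
        else:
            result.append(item)
            # a different topic just landed: try to reinsert one
            # deferred item that breaks up the previous run
            for d in deferred:
                if d[0] != item[0]:
                    result.append(d)
                    deferred.remove(d)
                    break
    return result + deferred
```

For instance, three "diet" videos followed by one "pets" video would be reordered so the pets video interrupts the diet run, which loosely mirrors the diversity goal the company describes.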