YouTube is working to curb the spread of false information about COVID-19, one of the defining problems of the moment. The platform recently updated its coronavirus misinformation policies to remove misleading content about vaccines for the disease.
With almost 2 billion monthly visitors, the video-sharing platform will now remove content that contradicts the "expert consensus" of the World Health Organization or local health authorities. Banned claims include assertions that the vaccine can cause infertility or death.
YouTube tightens its rules on COVID-19 content
YouTube previously banned only false claims about coronavirus prevention measures and treatments. With the recently published update, however, the company has extended that ban to misleading content about vaccines. According to figures announced by the company, more than 200,000 misleading coronavirus videos have been removed from YouTube since February. The policy change comes as big technology platforms face intense scrutiny over misinformation.
As researchers race to develop a vaccine against the virus, misinformation has spread across many platforms. Earlier this year, YouTube struggled to block the "Plandemic" video, which claimed that COVID-19 was a planned pandemic. Last Tuesday, Facebook announced in a statement that it will no longer allow ads that give people false information about the vaccine or discourage them from getting it. YouTube, for its part, stopped running ads on anti-vaccine videos last year and considers any anti-vaccination content harmful.
Governments, meanwhile, have already begun putting pressure on big technology companies. Last year, a Democratic representative from California wrote an open letter to Google's CEO asking the company to remove misleading content from its platforms.