
Deceptive COVID-19 Vaccine Videos Banned by YouTube


YouTube has vowed to remove any false or misleading claims about COVID-19 vaccines as part of its effort to tackle coronavirus misinformation. The video-sharing platform said it will remove any video that contradicts health authorities such as the WHO or the NHS. The announcement came soon after Facebook said it would ban any advertisement that discourages vaccination, although Facebook's restriction will not apply to comments or unpaid posts.

Impending Vaccine

YouTube had earlier restricted medically unproven claims about the coronavirus, and it is now explicitly expanding that policy to cover content about vaccines. In a statement, the Google-owned platform said that with a coronavirus vaccine potentially on its way, it needed to make sure the right policies were in place to remove misinformation related to the vaccine. It added that it will take down videos suggesting that the vaccine will cause infertility, kill people, or involve microchips being implanted in people who receive it.

According to YouTube, it has already removed around 200,000 dangerous or misleading videos related to the virus since February.

Fake Claims

Facebook has also changed its policies. The social media giant designed the new rules to avoid accusations that it profits from the spread of anti-vaccination messages. Previously, Facebook permitted ads that expressed disapproval of vaccines as long as they did not contain false claims. The company said the new rules would take effect over the next few days, although some existing ads may continue to run in the meantime. It also said it was launching a campaign to give users information about the flu vaccine.

In a blog post, the company said its goal was to ensure that messages about vaccine safety reach significant numbers of people, and that any ads containing misinformation that would undermine public health efforts will be prohibited. Anti-vaccination groups will still be permitted on the platform, as will unpaid posts and comments opposing vaccination. Earlier this year, Jason Hirsch, public policy manager for Facebook, told Reuters that the company believes users should have the freedom to express personal views against vaccination. According to Hirsch, aggressive control over content could push people who are hesitant about vaccines towards anti-vaccination camps.

The change is one of several the company has recently made to its approach to free speech. Facebook has also prohibited posts that deny the Holocaust and recently banned content related to the QAnon conspiracy theory.

The UK government, too, has faced criticism over how long it is taking to pass laws on online misinformation. Experts have raised concerns that, in the meantime, tech companies will have to self-regulate, and that the task will be difficult because the volume of misinformation far exceeds the capacity of the employees monitoring it. Unsah Malik, a social media advisor, also argued that publishing misinformation should be made unlawful and should carry a heavy fine.

Welcome Move

Given the growing number of conspiracy theories about a coronavirus vaccine, the steps taken by YouTube and Facebook are welcome. What matters now is how well these measures are enforced and how effective they prove to be.