YouTube will delete any misleading claims and information about coronavirus vaccines as part of a fresh bid to tackle Covid-19 misinformation.
The tech giant said any videos contradicting expert consensus from local health authorities or the World Health Organization will be removed.
“A Covid-19 vaccine may be imminent, therefore we’re ensuring we have the right policies in place to be able to remove (related) misinformation,” the Google-owned service said in a statement.
It said it would remove any suggestions that the vaccine would:
- kill people
- cause infertility
- involve microchips being implanted in people who receive the treatment
YouTube had already banned ‘medically misleading and unsubstantiated’ content related to Covid-19 from its platform.
However, it has now chosen to widen the scope of the policy to include content relating to vaccines.
YouTube said it had already removed 200,000 dangerous or misleading videos about the virus since February.
The announcement comes days after Facebook said that it would ban ads that discourage people from getting vaccinated.
Facebook’s new policy is designed to stop it facing accusations of profiting from the spread of anti-vaccination messages.
The social network had previously allowed ads to express opposition to vaccines if they did not contain false claims.
It said the new rules would be enforced “over the next few days”, but some ads would still run in the meantime.
It added that it was launching a campaign to provide users with information about the flu vaccine, including where to get flu shots in the US.
“Our goal is to help messages about the safety and efficacy of vaccines reach a broad group of people, while prohibiting ads with misinformation that could harm public health efforts,” Facebook said.