Facebook is finally banning vaccine misinformation
The company had previously made the content less visible.
Facebook is finally taking a tougher stance on vaccine misinformation. With help from the World Health Organization (WHO), the company has expanded the number of claims it will remove from its platform. You can see the full list on its Help Center, but some of the more notable ones include claims that suggest COVID-19 is man-made or that it’s safer to get sick from the disease than it is to get vaccinated against it. In a major step forward, the company says it will also remove claims that vaccines are toxic or that they can cause autism.
Facebook’s enforcement actions will initially focus on Pages, groups and accounts that violate its new rules, and the company says it will remove repeat violators. Meanwhile, moderators of groups that have previously violated Facebook’s policies against COVID-19 and vaccine misinformation will have to approve every post in their group. As an added layer of protection, third-party fact-checkers can still scrutinize claims that don’t outright violate the company’s policies on COVID-19 or vaccines. If those claims are determined to be false or misleading, Facebook says it will label and demote the posts.
“These new policies will help us continue to take aggressive action against misinformation about COVID-19 and vaccines,” the company said.
Besides taking a stricter stance on vaccine misinformation, Facebook says it will take additional steps to get accurate information to people. Like Google, the company will help people find out how they can get vaccinated. Starting this week, Facebook’s COVID-19 Information Center will include links to local health authorities whose websites detail who is currently eligible for a vaccine and how to go about getting one. As that information becomes more widely available elsewhere, Facebook will share it in other countries, in addition to making the Information Center available through Instagram.
To complement those efforts, the company will donate $120 million in ad credits to public health agencies, NGOs and the UN, as well as provide those organizations with training and support as they work to get authoritative information out to people.
Historically, Facebook’s efforts to curb vaccine misinformation on its platforms have been mostly ineffective, largely because the company stopped short of banning that type of content altogether. Even with Facebook pointing people toward authoritative sources, accounts promoting conspiracy theories and inaccurate information have dominated its search results.