Facebook to warn users who interacted with posts containing misinformation about COVID-19

MENLO PARK, Calif. – Facebook announced Thursday that it is taking additional steps to combat misinformation about the COVID-19 pandemic on its platform.

The social media giant says it will start warning people who have liked, reacted to, or commented on harmful misinformation that it has since removed.

Those messages will connect users to a World Health Organization (WHO) page that lists debunked coronavirus myths, including claims Facebook removed because they could lead to imminent physical harm.

“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” wrote Guy Rosen, the platform’s VP of Integrity.

Facebook says these messages will start going out in the coming weeks.

Additionally, Facebook says it recently added a new section called “Get the Facts” to its COVID-19 Information Center, intended to make it easier for people to find accurate information about the virus.

The section includes fact-checked articles from the platform's partners that debunk misinformation about the virus. The stories are selected by a curation team and updated every week.

These additional steps come after Facebook announced that it would be using fact-checkers to reduce the spread of false information on the site. Once a piece of content is rated false, Facebook reduces its distribution and shows warning labels with more context.

Facebook says a single fact-check allows it to kick off similarity detection methods that identify duplicates of debunked stories. For example, in March the company displayed warnings on about 40 million COVID-19-related posts, based on around 4,000 articles by its independent fact-checking partners. According to Facebook, when people saw those warning labels, 95% of the time they did not go on to view the original content.
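As a rough illustration of how this kind of similarity detection can work (a minimal sketch under simple assumptions, not Facebook's actual system), the toy Python example below breaks a debunked claim into word 3-grams and flags posts that repeat most of them:

    import re

    # Toy near-duplicate detector: shingle the debunked claim into word
    # 3-grams and flag posts that contain most of those shingles.
    # Purely illustrative; not how Facebook's systems actually work.

    def shingles(text, n=3):
        """Normalize text and return its set of word n-grams."""
        words = re.sub(r"[^\w\s]", "", text.lower()).split()
        return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 0))}

    def containment(claim_shingles, post_shingles):
        """Fraction of the claim's shingles that also appear in the post."""
        if not claim_shingles:
            return 0.0
        return len(claim_shingles & post_shingles) / len(claim_shingles)

    def find_duplicates(debunked_claim, posts, threshold=0.5):
        """Return posts similar enough to the debunked claim to warrant a label."""
        claim = shingles(debunked_claim)
        return [p for p in posts if containment(claim, shingles(p)) >= threshold]

    if __name__ == "__main__":
        claim = "drinking bleach cures the coronavirus"
        posts = [
            "My neighbor says drinking bleach cures the coronavirus. Is that true?",
            "Stay home and wash your hands regularly.",
        ]
        print(find_duplicates(claim, posts))  # flags only the first post

A production system would of course need far more robust matching, for example across languages, images, and paraphrased wording, than this simple word-overlap heuristic.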

To date, Facebook says it has also removed hundreds of thousands of pieces of misinformation that could lead to imminent physical harm, including harmful claims that drinking bleach cures the virus and theories that physical distancing is ineffective at preventing the disease from spreading.

“As this pandemic evolves, we’ll continue focusing on the most effective ways to keep misinformation and dangerous hoaxes about COVID-19 off our apps and ensure people have credible information from health experts to stay safe and informed,” wrote Rosen.