Facebook says it deleted 20 mn Covid-19 misinformation posts across Facebook and Instagram
Facebook Inc. said it has removed more than 20 million posts from its main social network and its photo-sharing app Instagram for violating rules on Covid-19 misinformation since the beginning of the pandemic.
The Menlo Park, California-based company also said it added information labels to more than 190 million Covid-19-related posts on Facebook that third-party fact-checking partners had rated as false or missing context.
The data, which covers actions taken through June, was released Wednesday as part of Facebook’s quarterly community standards enforcement report.
Facebook is seeking to address criticism that its platforms have been used to spread fear about vaccines and misleading information about the coronavirus. The company implemented new policies against Covid-19 misinformation, including banning repeat offenders who spread falsehoods and directing users to a central Covid-19 information hub.