Facebook assesses country risks for decisions on content removal
Facebook Inc. said it’s developed a strategy since 2018 to monitor and remove content that violates its policies, especially in countries most at risk of offline violence.
The factors used for such evaluations include social tensions and civic participation, as well as how the use of its social media tools affects that country, it said, citing recent elections in Myanmar, Ethiopia, India and Mexico as examples. It also considers how the information may shed light on current problems such as crime, elections, violence, and Covid-19 transmission and vaccination rates, it added.
“This allows us to act quickly to remove content that violates our policies and take other protective measures,” according to a Facebook blog post published Saturday by Miranda Sissons, director of human rights policy, and Nicole Isaac, international strategic response director.