Facebook has tightened its crackdown on misinformation in Sri Lanka, including by bringing on dedicated Sri Lankan policy and programs managers to engage with local stakeholders.
The Facebook Sri Lanka Team said that Facebook has deployed Sinhala-language hate speech technology that helps them proactively detect potentially violating content.
Facebook has also introduced changes to its platform architecture in the aftermath of crises like the Sri Lanka Easter bombings that limit virality by restricting the number of times people can reshare content.
The Facebook Sri Lanka Team said that they rely on a mix of user reports and their own technology to detect content that violates Facebook Community Standards, including its hate speech policy.
Once content is reported to Facebook as potentially violating, it is reviewed by its global team of content moderators.
Facebook said that its expanded global and regional team of more than 35,000 people working on safety and security – three times the number it had in 2017 – includes 15,000 content moderators who cover more than 50 languages.
For Sri Lanka specifically, Facebook has native-speaking content reviewers who support the official languages of Sri Lanka, including Sinhala and Tamil.
Facebook believes that for Sri Lanka they have the right number of moderators with the right language abilities in place.
Even during the coronavirus crisis, Facebook says it took aggressive steps to stop misinformation and harmful content from spreading, including by removing misinformation that could lead to real-world harm, such as false cures and prevention claims like the assertion that eating garlic cures the virus. (Colombo Gazette)