Facebook’s content reporting system granted to Kazakhstan to help eliminate harmful content on social media

December 2, 2021

Kazakhstan has recently been granted direct access to Facebook’s content reporting system by the social media platform’s parent company, Meta. The move is part of a joint agreement to remove harmful content from Facebook, Instagram, and other social media platforms.

Meta and the Ministry of Information and Social Development of the Republic of Kazakhstan have released a joint statement describing the agreement as the first of its kind in Central Asia, aimed at countering the circulation of illegal content across social media platforms more efficiently.

The agreement allows Kazakhstan to use the content reporting system to flag content that violates the social media giant’s global content policy or the local content laws in force in Kazakhstan.

Both parties will also maintain regular communication; an authorised Facebook representative will work with Kazakhstan’s Ministry on various policy issues.

Facebook’s regional public policy director said that the partnership with the Kazakh government could also enhance online safety for children on the platform. The content reporting system provided to Kazakhstan will help the government deal with harmful content more efficiently.

Furthermore, training will be offered to help Kazakhstan keep its cyberspace safe. The training will instruct the ministry’s specialists in the proper use of the content reporting system and in Facebook’s community standards policies.

According to a Kazakh government deputy, the agreement between Kazakhstan and Facebook is a win-win: citizens gain opportunities to protect their rights, while companies get a chance to improve their business. The deputy added that the government is prepared to eliminate harmful content on social media platforms without infringing on user or company interests.

Frances Haugen, the well-known Facebook whistleblower, reportedly warned the UK Parliament last week that the opaque algorithms social media platforms use to disseminate harmful content should be regulated. According to Haugen, these algorithms have fueled violent incidents, such as the attack on the US Capitol Building last January. Haugen’s testimony was part of a hearing in London on the Online Safety Bill draft submitted by the UK government earlier this year.

The Online Safety Bill proposes security measures requiring companies to protect users from harmful online content such as hate speech, misinformation, racism, and more.
