Meta Called to Boost Africa Content Moderation

FILE - The Meta Platforms logo is seen in Davos, Switzerland, May 22, 2022.

Rights groups are calling on Meta Platforms to seize the opportunity to improve its content moderation in Africa after its main third-party contractor in the region said it would no longer screen harmful posts for the social media giant.

Kenya-based outsourcing firm Sama said on Jan. 10 that it would stop providing content moderation services to the owner of Facebook, WhatsApp and Instagram in March, as it moves to concentrate on data labelling work.

The announcement comes as both Sama and Meta face a lawsuit in Kenya over alleged labor abuses and the blocking of workers' efforts to unionize. Another lawsuit accuses Meta of allowing violent posts to flourish on Facebook, inflaming civil conflict in neighboring Ethiopia. Both companies have defended their record.

Meta said it has strict rules outlining what is and is not allowed on Facebook and Instagram.

"Hate speech and incitement to violence are against these rules and we invest heavily in teams and technology to help us find and remove this content," the Meta spokesperson said.

"Our safety and integrity work in Ethiopia is guided by feedback from local civil society organizations and international institutions."

Digital rights campaigners said Meta's efforts to curb harmful content in African countries were woefully inadequate compared with those in richer nations, and called on the company to drastically improve its moderation processes.

"With the exit of Sama, now would be a good chance for Meta to put things right and ensure better labor conditions for African moderators in the region," said Bridget Andere, Africa policy analyst at Access Now.

"Meta should increase the number of moderators for the region to adequately cover local languages and dialects, and also be more transparent about their algorithms which are promoting harmful content," she told the Thomson Reuters Foundation.

Meta did not say whether it had found a new third-party contractor for East Africa, but said Sama's withdrawal would not adversely affect users of its social media platforms.

"We respect Sama's decision to exit the content review services it provides to social media platforms," a Meta spokesperson said.

"We'll work with our partners during this transition to ensure there's no impact on our ability to review content."

Sama said it would be laying off 3% of its staff - about 200 employees - to streamline its operations and boost efficiency. It will continue to provide data labelling services to Meta.

Last month, Meta was hit with another lawsuit accusing the company of allowing violent posts to flourish on Facebook, inflaming Ethiopia's civil war.

The lawsuit, filed by two Ethiopian researchers and Kenya's Katiba Institute rights group, argues that Facebook's recommendation systems amplified hateful and violent posts in Ethiopia, including several that preceded the murder of the father of one of the researchers.

The plaintiffs are demanding that Meta take emergency steps to demote violent content, increase moderation staffing in Nairobi, and create a restitution fund of about $2 billion for victims of violence incited on Facebook worldwide.