A Kenyan court has ruled that Meta (formerly Facebook) is the primary employer of the content moderators who were unlawfully dismissed in March 2023. The court also ordered Meta to reinstate the moderators and compensate them with back pay.

According to local media reports, the 184 moderators were employed by Sama, a third-party content moderation company that worked for Meta. In March, Sama fired the moderators without warning or severance pay.

The moderators sued Meta, alleging that the company was their real employer and that Sama was simply a subcontractor.

The court agreed with the moderators, finding that Meta had “direct control” over their work. The court also found that the tech giant had failed to provide the moderators with adequate notice of their dismissal or severance pay.

Judge Orders Meta to Refrain From Contract Terminations While Lawsuit Is Ongoing

According to the reports, Judge Byram Ongaya, in a comprehensive 142-page ruling, ordered that Meta and Sama refrain from terminating the contracts in question while a lawsuit challenging the legality of the dismissals is ongoing.

The judge further issued an interim order stating:

“An interim order is hereby issued that any contracts that were to lapse before the determination of the petition be extended” until the case is settled.

The ruling is a significant victory for content moderators, who often face difficult and stressful working conditions. It also sends a message to Meta and other tech companies that they cannot simply outsource their responsibility for employee welfare.

Additionally, Judge Ongaya prohibited Majorel, the Luxembourg-based outsourcing company recently engaged by Facebook, from placing the content moderators on a blacklist that would prevent them from reapplying for similar positions.


The ruling also mandated that Meta, the parent company of Facebook, Instagram, and WhatsApp, must ensure that adequate medical, psychiatric, and psychological support is provided to the petitioners and other content moderators.

Mercy Mutemi, the attorney representing the petitioners, said it was critical that the court found Facebook to be the true employer of its moderators, and that her clients were very pleased with the orders:

“This ruling matters not just for the petitioners but the entire social media and AI industry.”

Meta has come under scrutiny due to allegations regarding the working conditions of content moderators, who claim they are exposed to disturbing and hateful content for extended periods without adequate consideration for their well-being.

The company is currently involved in two other legal cases in Kenya. One case, as reported by Time, was filed by Daniel Motaung, a former Sama employee from South Africa, who accused Sama and Facebook of subjecting workers to poor working conditions and failing to provide mental health support.

The labor relations court in Nairobi confirmed its jurisdiction to hear Motaung’s case in February, but Meta has appealed this decision.

Another complaint has been lodged against Meta in Kenya by a local NGO and two Ethiopian citizens. They accuse the company of failing to take action against online hate speech in Africa, which they argue resulted in the murder of an Ethiopian university professor.

The complainants are calling for the establishment of a $1.6 billion fund to compensate the victims.

Content moderation is a vital but often thankless job. Content moderators are responsible for reviewing user-generated content for harmful or illegal material, such as hate speech, violence, and child sexual abuse material.

This work can be emotionally and psychologically demanding, and content moderators are often paid low wages and have few benefits.

The Financial Times reported that, according to Glassdoor, the typical yearly income for a content moderator in the UK is approximately £25,000.

Social networks such as Meta, TikTok, and YouTube hire external contractors to perform these tasks. Content moderators employed by these third-party contractors often receive wages close to the minimum and are frequently assigned the most disturbing content.

In recent years, there have been a number of reports of content moderators being abused by their employers. In some cases, moderators have been subjected to verbal abuse, threats, and even physical violence. In other cases, moderators have been denied adequate breaks, overtime pay, and other benefits.

The Kenyan court’s ruling is a reminder that content moderators are employees, and they deserve the same rights and protections as any other worker.

What does this mean for Meta?

The Kenyan court’s ruling is a significant setback for Meta. The company will now have to reinstate the 184 moderators and pay them back pay.

Meta will also have to change its policies on content moderation to ensure that its moderators are treated fairly.

The ruling could also have implications for Meta’s other content moderation partners. The social media giant may now be more reluctant to outsource its content moderation work to third-party companies.

This could lead to it hiring more content moderators directly, which could improve the working conditions for these employees.
