Elon Musk’s X may be in trouble with the European Union following the conflict in Israel and Gaza. The EU Commission warned the platform Wednesday that it was harboring illegal content and disinformation surrounding the Israel-Hamas war.

X’s CEO, Linda Yaccarino, responded with a high-level summary of the platform’s response, saying it had dedicated special content moderation teams to the problem and removed “tens of thousands” of posts.

The EU Commission apparently wasn’t satisfied with her reply and formally requested that X provide more information under the new Digital Services Act (DSA) late Thursday. This may prove to be the beginning of a full investigation into the company and its content moderation practices.

The new law came into effect on August 25th of this year, designating a handful of the largest online platforms in the world as “very large online platforms” or VLOPs. The DSA is meant to ensure that these platforms are keeping their users (and the world) safe by providing strict content moderation guidelines.

For years, little regulation has existed to ensure that social media sites and other large platforms make a reasonable effort to keep their users safe with strong content moderation systems. These platforms have long been required to delete obviously illegal content, but the DSA widens the scope of what must be moderated to include disinformation.

The DSA includes other provisions such as banning targeted advertising based on a user’s sexual orientation, political beliefs, ethnicity, or religion. It also greatly restricts targeted advertising to children.

In this case, however, the main focus of the EU Commission is disinformation and other forms of illegal content on X.

Yaccarino was right to act quickly on the disinformation problem and respond to the Commission, as non-compliance with the DSA could be devastating: it can trigger fines of up to 6% of a company’s global annual turnover. If the content issues persist, the EU could even block the service from the region entirely.

What Illegal Content Is Spreading on X?

The Israel-Palestine situation is uniquely complicated. Just about everyone in the world has an opinion on the decades-long conflict and many of them are incredibly passionate one way or the other.

This isn’t surprising as the conflict has claimed the lives of thousands of people, many of whom were innocent civilians. This war is also set to be the most deadly of any engagement between Israel and Hamas by a large margin.

All of these factors (and more) help explain why so much disinformation and illegal content is being spread on X and elsewhere. The DSA specifically requires VLOPs to respond effectively to reports of illegal content and to mitigate risks such as disinformation, gender-based violence, and negative effects on the exercise of fundamental rights.

Disinformation is shockingly common, with users relabeling years-old videos as footage of current events. One of the most widespread examples was a clip of a helicopter being shot down that turned out to be from the video game Arma 3.

The Commission found that a large amount of hate speech, terrorist content, and violent material was being spread on X. This conflict is a nightmare for any content moderation team, let alone X’s, much of which was laid off following Musk’s acquisition of the company.

Hamas is designated a terrorist organization in the EU, the US, and most Western countries, meaning that any posts supporting it could potentially constitute illegal terrorist content. Furthermore, the fact that the conflict is so rooted in religion and ethnicity, combined with the passion on both sides, means that hate speech is spreading like wildfire.

The disinformation problem is compounded by the limited media presence in the region (especially in Gaza, where power is currently out). Despite this, X reported over 50 million posts referencing the war in just two days.

This figure makes Yaccarino’s statement that the platform had already removed tens of thousands of posts seem almost pitiful. Admittedly, moderating and removing every post among those 50 million (and the many more following them) that includes hate speech, violence, or support of Hamas is a Herculean, if not Sisyphean, task.

Nevertheless, if X fails to remove the majority of the illegal posts streaming through its site, it may be punished dearly.