The European Union is seeking information from social media platform X about cuts to its content moderation resources as part of its first major investigation into the company under tough new laws governing online content.
The EU’s executive body, the European Commission, said in a statement on Wednesday that it is requesting the information from X under the bloc’s landmark technology law, the Digital Services Act.
The commission said it was concerned by X’s transparency report, submitted to regulators in March 2024, which showed that the company had cut its team of content moderators by almost 20% since its previous transparency report in early October 2023.
Citing the same report, the commission said X had also reduced the linguistic coverage of its content moderation within the EU from 11 languages to seven.
The commission said it is seeking further details from X on risk assessments and mitigation measures related to the impact of generative artificial intelligence on electoral processes, the dissemination of illegal material, and the protection of fundamental rights.
X, formerly known as Twitter, did not immediately respond to CNBC’s request for comment.
According to the European Commission, X must provide the requested information on its content moderation resources and generative AI by May 17. Answers to the commission’s remaining questions must be submitted by May 27 at the latest.
The commission said the request for information was a further step in its formal investigation into whether X has breached the EU’s recently introduced Digital Services Act (DSA).
The commission launched formal infringement proceedings against X in December after concerns were raised about its efforts to address illegal content surrounding the Israel-Hamas war.
At the time, the commission said it would examine X’s compliance with its obligations to combat the spread of illegal content in the EU, the effectiveness of the platform’s measures against information manipulation, and its measures to increase transparency.
EU officials said the information request was aimed at building on the evidence gathered so far in the DSA investigation into X, including X’s March transparency report and its responses to previous requests for information about how it is addressing disinformation risks linked to generative AI.
The DSA, which came into force in November 2022, requires large online platforms like X to have strict processes in place to reduce the risk of disinformation and remove hate speech, while balancing those obligations against concerns for freedom of expression.
Companies found to have violated the rules can be fined as much as 6% of their global annual revenue.