European Union Launches In-Depth Investigation into Elon Musk's X Under Digital Services Act
The European Commission, which enforces the DSA on major platforms such as X, has opened formal proceedings over potential breaches related to risk management, content moderation, dark patterns, advertising transparency, and data access for researchers.
In a significant move, the European Union (EU) has opened a formal investigation into Elon Musk's X, formerly Twitter, under the bloc's Digital Services Act (DSA).
The European Commission, responsible for enforcing the DSA on large platforms, announced the opening of "formal proceedings" today. The investigation will scrutinise potential breaches related to risk management, content moderation, dark patterns, advertising transparency, and data access for researchers.
The EU's decision to open a formal DSA investigation into X comes in the wake of a complaint filed against X's advertising technology by the privacy rights group noyb. While the two events may appear connected, the Commission has been actively probing the platform for several months, focusing primarily on the spread of illegal content and disinformation during the Israel-Hamas conflict.
As early as October, the Commission issued an "urgent" formal request to X, seeking information on how the company was addressing information risks arising from the conflict. The EU expressed concerns about X's compliance across various areas, including policies on illegal content notices, complaint handling, risk assessment, and measures to mitigate identified risks. X was given until the end of October to respond to these inquiries.
Based on its preliminary investigation, which included an analysis of X's risk assessment report, transparency report and responses to formal requests for information, the Commission decided to open formal infringement proceedings against X under the Digital Services Act.
The European Union's investigation into X encompasses various critical areas and concerns. Firstly, it centres on examining X's compliance with the obligations outlined in the Digital Services Act (DSA). This scrutiny specifically targets X's efforts in countering the dissemination of illegal content within the EU.
Evaluators will closely analyse the effectiveness of X's risk assessment and mitigation strategies, along with the functioning of its notice-and-action mechanism for removing illegal content.
Additionally, the investigation examines the effectiveness of X's measures against information manipulation, notably the 'Community Notes' crowdsourced fact-checking system, as well as related policies intended to mitigate risks to civic discourse and electoral processes on the platform.
Another significant area of investigation revolves around transparency requirements mandated by the DSA. Authorities are looking into suspected deficiencies in X's provision of access to publicly available data for researchers, as stipulated by Article 40 of the Act.
Furthermore, this inquiry extends to suspected shortcomings within X's ads transparency library, which is expected to provide clear and accessible information about advertisements on the platform.
Lastly, the investigation includes a careful examination of potentially deceptive design elements in X's user interface. Evaluators are specifically scrutinising features such as the checkmarks attached to certain subscription products, commonly known as "blue checks", to determine whether they are misleading or deceptive in their presentation or functionality.
The EU highlighted that if proven, these failures would constitute infringements of specific articles of the DSA. An in-depth investigation will be conducted as a matter of priority, and the Commission will assess potential sanctions, including fines of up to six per cent of X's global annual turnover for confirmed breaches.
X, designated as a very large online platform (VLOP) under the DSA, could face real-world implications for its operations sooner rather than later. The EU has the authority to apply interim measures and seek temporary blocking of infringing services if there is a perceived risk of serious harm to users.
Joe Benarroch, a representative of X, responded to the EU probe, stating that the company remains committed to complying with the Digital Services Act and is cooperating with the regulatory process.
Benarroch emphasised the importance of a process free from political influence and in adherence to the law, stating that X is focused on creating a safe and inclusive environment for all users while protecting freedom of expression.
While X, under Elon Musk's leadership, has deviated from the responsible governance envisioned by the DSA, the company recently introduced a research programme to gather data on systemic risks within the EU. The Commission, however, remains sceptical about X's transparency efforts and questions whether the company has met the DSA's standards.
The investigation raises several intriguing questions, including how the EU views Musk's approach of replacing formal content moderation with crowdsourced assessments. The impact of Musk's restructuring of content moderation teams and processes on users' exposure to harm will also be scrutinised.