Coimisiún na Meán, Ireland's media regulator, has opened an investigation into X to determine whether the social media platform complies with the Digital Services Act (DSA).
Coimisiún na Meán's Investigations Team will examine whether X is complying with Article 20 of the DSA, which requires, among other things, that users be able to appeal decisions made by a platform's moderation team.
Investigators will examine whether people can appeal decisions made by X's moderation team, specifically regarding reported posts that are not removed even though the reporting user believes they violate X's policy.
The investigation marks the first formal DSA probe into X by Irish regulators since the platform came under Elon Musk's ownership and underwent significant changes to its content moderation policies and staffing.
What Specific Violations Is Ireland Investigating
The Investigations Team will determine whether users are properly informed of the outcome of reports they submit and of their right to appeal the moderation team's decision.
Investigators are focusing on two key areas of potential non-compliance with Article 20 requirements.
First, they will assess whether X provides users with clear information about appeal mechanisms when content moderation decisions are made, particularly when reported content remains on the platform despite user complaints.
Second, regulators will examine whether X has an internal complaints handling mechanism that is easily accessible and user-friendly.
John Evans, Digital Services Commissioner at Coimisiún na Meán, stated in a press release: "Following our supervision of X and analysis of information gathered from a variety of sources, there is reason to suspect that X may not be in compliance with their obligations under Article 20. This investigation will assess whether X has properly informed users of their rights to contest decisions it makes after users report content that they believe violates X's terms of service."
Did you know?
X is designated as a Very Large Online Platform (VLOP) under the DSA, with over 45 million monthly active users in the EU. This places it in the same category as Facebook, YouTube, and TikTok, meaning it faces the strictest regulatory scrutiny and must comply with enhanced transparency and accountability requirements.
How Does Article 20 of the DSA Protect User Rights
Article 20 of the Digital Services Act establishes users' fundamental rights to challenge content moderation decisions made by online platforms.
The article requires platforms to provide clear, specific statements of the reasons for their moderation decisions, including whether content was removed, access was restricted, or visibility was limited.
Users must receive this information in a timely manner and in a language they can understand, with explanations of the legal or policy basis for each decision.
The regulation also mandates that platforms maintain internal complaint-handling systems that allow users to contest moderation decisions free of charge.
These systems must be easily accessible, user-friendly, and capable of processing complaints promptly and fairly.
Platforms must inform users about the availability of out-of-court dispute settlement mechanisms and the possibility of judicial redress.
Evans emphasized that online platforms must comply with their obligations under the DSA, adding: "If we suspect that any platform is failing in these obligations, we will not hesitate to intervene and, where appropriate, take enforcement action to protect the safety of users in Ireland and across the European Union."
What Penalties Could X Face If Found in Breach
If found to have breached the DSA, X could face a fine of up to 6 percent of its global annual turnover. For a company of X's scale, this could translate into hundreds of millions of euros in penalties, depending on the platform's actual revenue figures.
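For a sense of scale, the 6 percent cap works out as follows. The turnover figures in this sketch are hypothetical placeholders for illustration only, not X's reported revenue:

```python
# The DSA caps fines at 6% of a platform's global annual turnover.
DSA_MAX_FINE_RATE = 0.06

def max_dsa_fine(global_annual_turnover_eur: float) -> float:
    """Return the maximum possible DSA fine for a given global annual turnover."""
    return global_annual_turnover_eur * DSA_MAX_FINE_RATE

# Purely hypothetical turnover figures, in euros:
for turnover in (2.5e9, 5e9, 10e9):
    fine = max_dsa_fine(turnover)
    print(f"Turnover EUR {turnover / 1e9:.1f}B -> max fine EUR {fine / 1e6:.0f}M")
```

Even at a hypothetical turnover of €10 billion, the ceiling is €600 million; reaching "billions" in fines would require a turnover well above that.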
The substantial potential fine reflects the European Union's commitment to enforcing digital regulations and holding major technology companies accountable for compliance failures.
Beyond financial penalties, X could face additional enforcement measures, including orders to bring practices into compliance within specific timeframes, requirements to implement enhanced monitoring and reporting systems, and potential restrictions on certain features or operations until compliance is achieved.
Repeat violations or failure to comply with remediation orders could result in even more severe sanctions, including daily penalty payments calculated to compel swift compliance.
The investigation also carries reputational risks that could affect X's relationships with advertisers, users, and other stakeholders across the European market.
Why Is Ireland the Lead Regulator for X in Europe
Ireland serves as the lead regulator for X under the DSA because the platform's European headquarters is located in Dublin.
Under EU digital regulations, the member state where a Very Large Online Platform establishes its main presence becomes the coordinating authority responsible for supervising compliance with DSA obligations.
This arrangement creates a single point of regulatory contact for platforms operating across multiple EU countries, streamlining enforcement while ensuring consistent application of rules.
However, this system has faced criticism from consumer advocates and other EU regulators who argue that Ireland has been too lenient with major technology companies headquartered within its borders.
The country hosts European operations for numerous global tech giants, including Meta, Google, Apple, and others, creating concerns about regulatory capture and insufficient enforcement.
Coimisiún na Meán's decision to launch this investigation may signal a more assertive approach to digital platform oversight, particularly as European institutions increase pressure on member states to rigorously enforce DSA provisions.
How Does This Investigation Fit Into Broader EU Tech Oversight
The DSA, adopted by the European Union in 2022, is designed to better protect European consumers and internet users from large and powerful tech companies.
Under the DSA, platforms with more than 45 million monthly active users in the EU are designated Very Large Online Platforms (VLOPs).
They are required to take appropriate steps to combat illegal content, conduct regular risk assessments, and submit to independent audits of their compliance measures.
This investigation into X represents part of a broader pattern of increased regulatory scrutiny facing major technology platforms across Europe.
The European Commission has launched multiple DSA proceedings against platforms, including TikTok for potential failures to protect minors, Meta for concerns about electoral disinformation, and various companies for inadequate content moderation systems.
The regulatory framework reflects the European determination to establish democratic control over digital spaces and ensure that platforms prioritize user safety and rights over engagement metrics and profit maximization.
As enforcement mechanisms mature and regulators gain experience with DSA tools, technology companies can expect continued pressure to demonstrate compliance with transparency, accountability, and user protection requirements across their European operations.