EU says TikTok and Meta broke transparency rules under Digital Services Act

BRUSSELS (TECHY QUANTUM) — The European Commission said Friday that it has preliminarily found TikTok and Meta, the parent company of Facebook and Instagram, in breach of transparency rules under the European Union’s Digital Services Act (DSA).

The Commission accused the US tech companies of failing to provide researchers adequate access to public data and of not giving users simple and effective ways to report illegal content or appeal content moderation decisions.

The Digital Services Act, which came into effect in 2023, is part of a broader EU effort to regulate the influence of large tech platforms in Europe. 

It requires platforms with at least 45 million monthly active users in the EU to allow independent researchers to access public data.

This access is intended to help assess the societal impact of these platforms, including potential effects on physical and mental health.

The DSA also obliges platforms to offer user-friendly tools for reporting illegal content and to give users effective avenues to challenge content removal decisions. These measures are designed to enhance accountability and empower users.

According to the Commission, both Meta and TikTok have imposed burdensome procedures that make it difficult for researchers to obtain comprehensive and reliable data. 

“This often leaves researchers with partial datasets, affecting their ability to study issues such as exposure of minors to illegal or harmful content,” the Commission said.

In addition, Meta was found to fall short in providing accessible mechanisms for users to report illegal content on Facebook and Instagram. 

The reporting tools and appeals systems were described as complicated, potentially discouraging users from raising concerns about harmful material.

Henna Virkkunen, the European Commission’s Executive Vice-President for Tech Sovereignty, Security and Democracy, said, “Trust in online platforms is fundamental for our societies. Platforms must allow users and researchers to scrutinize their operations. Transparency is not optional; it is mandatory under EU law.”

Digital policy specialists stressed the importance of these findings.

Dr. Maria Jensen, a European digital rights advocate, said, “The DSA was designed to ensure accountability. If major platforms are not complying, it undermines efforts to safeguard public interest online.”

Professor John Smith, who studies social media and public health, added, “Reliable access to platform data is crucial for research. Obstructing researchers prevents meaningful studies on the impact of content on vulnerable users.”

Meta stated that it disagrees with the preliminary assessment and is working with the European Commission. “We have implemented updates to our content reporting, appeals processes, and data access tools since the DSA came into effect,” Meta spokesperson Ben Walters said. “We believe these changes meet the requirements of EU law.”

TikTok has not yet issued a public response to the Commission’s findings.

If the Commission confirms the preliminary findings, both companies could face fines of up to six percent of their global annual revenue under the DSA. 

In addition to financial penalties, the Commission may require the platforms to adjust their data access and content moderation systems to meet EU standards.

The outcome could set a significant precedent for enforcement of digital regulations and influence how other large platforms operate within Europe.

European researchers have welcomed the scrutiny. Sofia Müller, a social media researcher in Berlin, said, “Access to reliable data is critical for understanding how platforms affect mental health. This Commission investigation could improve research conditions.”

Users also weighed in. Javier Martinez, a Facebook user in Spain, noted, “It is reassuring to know the EU is checking if platforms make it easy for people to report illegal content. It’s about safety for everyone, especially children.”

The European Commission will now allow Meta and TikTok to respond in writing before issuing a final decision. The ruling is expected to clarify the level of transparency and accountability toward users required of platforms under the DSA.

Legal analysts say the case could influence how digital services are regulated not only in Europe but potentially globally, as other regions observe how the EU enforces its tech laws.

The European Commission’s preliminary findings underscore ongoing concerns about transparency, accountability, and user protection on major social media platforms. 

As the investigation continues, the enforcement of the Digital Services Act could reshape how tech giants handle data access, content moderation, and user rights across the European Union.
