
How the European Commission Took Serious Aim at Social Media Giants for the First Time: What the Claims Against Meta and TikTok Mean and Why They Change the Rules for All of Europe
On October 24, the European Commission published preliminary findings that Meta (Facebook, Instagram) and TikTok violated key transparency obligations under the Digital Services Act (DSA). The platforms failed to provide researchers with adequate access to public data, and Meta also failed to create simple and effective mechanisms for reporting illegal content and appealing moderation decisions. The Commission found: “Facebook, Instagram, and TikTok complicate the process of requesting access to public data for researchers, leaving them with partial or unreliable information.” This is not a bureaucratic formality: researchers help society and state agencies understand how much users, especially minors, are exposed to illegal or harmful content. Providing such access is a mandatory transparency requirement for very large online platforms.
A separate claim concerns Meta’s use of “dark patterns”: deceptive interface design that complicates the process of reporting illegal content. These are interface elements that deliberately mislead, making the ways to report content, particularly child sexual abuse or terrorist material, non-obvious or hard to find. Users have the right to appeal platform decisions on content removal or account blocking, but the actual mechanisms in Facebook and Instagram often do not allow users to submit explanations or supporting documents, which significantly limits this right.
Fines That Could Impact the Business Model
Under the DSA, violations of transparency requirements can draw fines of up to 6% of a company’s global annual turnover.
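For a sense of scale, here is a rough illustrative calculation; the €100 billion turnover figure is hypothetical, not the actual revenue of Meta or TikTok:

\[
\text{maximum fine} = 6\% \times \text{global annual turnover} = 0.06 \times \text{€100 bn} = \text{€6 bn}
\]

Because the cap is tied to turnover rather than profit, a penalty of this order can bite even in a low-margin year.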
The platforms now have the opportunity to review the investigation file and respond in writing to the preliminary findings. No final decision has been made yet, but the situation already sends a signal to the entire industry: even giants must change their approach to transparency and user rights protection in the EU.
As a European Commission representative put it: “Mechanisms for reporting illegal content must be simple and convenient for users, and the appeal process must be transparent and effective.”
Access to Data for Researchers: Why This Is Strategically Important
On October 29, a new delegated act comes into force, giving researchers access to non-public data of very large online platforms and search engines. It is intended to increase the transparency of platforms’ operations and allow independent assessment of how algorithms may spread dangerous content or create risks for democracy and public health.
The European Commission has made it clear: “Providing access to public and, under certain conditions, non-public data is a fundamental transparency requirement, without which it is impossible to assess the real risks to users.”
Meta and TikTok’s Court Victory: The Nuances Do Not Change the Essence
In October, Meta and TikTok won a court case against the European Commission over the methodology for calculating the DSA supervisory fee (capped at 0.05% of a company’s annual worldwide net income).
The General Court in Luxembourg ruled that the procedure should have been adopted via a delegated act, not an implementing decision.
Regulators have been given 12 months to correct the procedures. The amounts paid for 2023 are not currently being refunded.
The European Commission emphasized that this is a procedural correction: it affects neither the concept nor the size of the fee and gives the companies no grounds to avoid their obligations going forward.
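For scale, the fee cap works out as follows; the €100 billion income figure is hypothetical and serves only to illustrate the arithmetic:

\[
\text{annual supervisory fee} \le 0.05\% \times \text{worldwide annual net income} = 0.0005 \times \text{€100 bn} = \text{€50 m}
\]

Set against the 6% fine ceiling, the supervisory fee is roughly two orders of magnitude smaller, which underlines the Commission’s point that this dispute is procedural rather than existential.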
Why the DSA Is More Than Just Regulation
The Digital Services Act (DSA), in effect since November 2022, is the EU’s attempt to establish new rules for the largest platforms: Amazon, Apple, Google, Microsoft, Meta, TikTok, X (Twitter), and others.
It requires:
- algorithmic transparency,
- effective moderation of illegal or harmful content,
- real data access for researchers,
- the possibility to appeal platform decisions.
Violating these principles not only carries financial liability but also jeopardizes a company’s access to the European market.
The claims against Meta and TikTok are not an isolated conflict, but the first precedent of real DSA enforcement, setting new rules for all market players.
For the first time, the European Commission is publicly demanding not just “intentions” regarding transparency, but concrete changes in policies and algorithms.
Every statement in this investigation is based on official documents, court decisions, and open sources.
This is not a temporary initiative, but the beginning of a systemic era in which user rights and public safety must become just as important to platforms as profit.