European Commission Finds Meta and TikTok in Preliminary Breach of Digital Services Act Transparency Rules
The European Commission on October 24, 2025, issued preliminary findings that TikTok and Meta have violated transparency obligations under the Digital Services Act, marking the latest enforcement action under the EU's sweeping regulatory framework for large online platforms. The preliminary assessment targets multiple compliance failures across Meta's Facebook and Instagram properties, while TikTok's alleged breach centers on researcher access to data.
The Commission's preliminary findings identify four specific areas where Meta appears to fall short of DSA obligations: inadequate transparency reporting, insufficient researcher data access, deficient user reporting mechanisms for illegal content, and problematic content moderation appeals processes on both Facebook and Instagram.
Multiple Fronts of Non-Compliance
The transparency violations represent the culmination of formal proceedings that began in spring 2024. The Commission launched its initial formal investigation into Instagram and Facebook on April 30, 2024, examining deceptive advertising practices, political content handling, notice-and-action mechanisms, and researcher data access protocols. A second wave of proceedings opened on May 16, 2024, specifically targeting child protection measures on both platforms.
Meta's researcher data access failures in particular underscore the DSA's emphasis on platform accountability through external oversight. The Act requires very large online platforms to give researchers adequate access to publicly available data for studies of systemic risks. The Commission's preliminary finding suggests Meta's current data sharing arrangements do not meet this threshold, potentially hampering academic and policy research into platform impacts.
The user notification and appeals mechanisms represent operational compliance gaps that affect millions of European users daily. Under DSA requirements, platforms must provide straightforward ways for users to flag illegal content and meaningful processes to challenge moderation decisions. The Commission's assessment indicates these systems on Facebook and Instagram fall short of regulatory standards.
Historical Pattern of Platform Resistance
We have seen this pattern before: major platforms initially push back against substantive regulatory frameworks before eventually adapting their operations. The progression from the General Data Protection Regulation's rollout through current DSA enforcement follows a familiar trajectory: initial resistance, regulatory pressure, and eventual operational adjustment.
The broader regulatory context extends beyond individual platform compliance. The DSA represents the EU's most comprehensive attempt to regulate platform behavior at scale, covering content moderation, algorithmic transparency, and systemic risk assessment for platforms with more than 45 million monthly active users in the EU.
Child Protection Remains Central Focus
The Commission's ongoing scrutiny of age verification mechanisms reflects persistent concerns about underage access across major platforms. Both Meta and Snapchat face questions over their ability to prevent users under 13 from creating accounts, despite platform policies requiring users to meet this minimum age threshold.
Meta's own terms of service establish 13 as the minimum age for Facebook and Instagram accounts, yet the Commission's assessment suggests the company lacks effective measures to prevent younger children from signing up or to identify and remove them after account creation. The regulatory critique extends to risk assessment failures, with the Commission finding inadequate evaluation of age-inappropriate content exposure for users under 13.
Snapchat faces parallel scrutiny, with the Commission questioning whether the platform's age assurance systems adequately verify user age during account creation. Its minimum age requirement of 13 mirrors industry norms, but implementation appears insufficient under the DSA.
Technical Implementation Challenges
The Commission's recent unveiling of an age verification application signals recognition that current platform-level solutions may prove inadequate. The centralized approach could standardize age confirmation across platforms, though implementation details and privacy implications remain unclear.
These enforcement actions arrive as platforms navigate complex technical challenges around content moderation, age verification, and researcher data access. The DSA's requirements for algorithmic auditing, systemic risk assessment, and transparent reporting demand significant operational changes that extend well beyond simple policy updates.
The preliminary nature of these findings means Meta and TikTok retain opportunities to address identified deficiencies before final determinations. However, the breadth of alleged violations across multiple DSA pillars suggests substantial remediation work ahead.
Implications for Platform Operations
For the broader platform ecosystem, these preliminary findings establish precedents for DSA interpretation and enforcement that will likely influence regulatory approaches toward other major platforms. The Commission's willingness to pursue multiple simultaneous proceedings against Meta demonstrates an aggressive enforcement posture that extends beyond individual compliance failures to systemic operational practices.
The researcher data access requirements particularly signal a shift toward treating platforms as quasi-public infrastructure with corresponding transparency obligations. This evolution reflects growing recognition that platform operations generate societal impacts that warrant external oversight and academic scrutiny.
Platform responses to these preliminary findings will likely shape both immediate compliance costs and longer-term regulatory strategies. The DSA's penalty structure allows fines of up to 6% of global annual turnover; for Meta, whose 2024 revenue was roughly $164 billion, a maximum penalty would approach $10 billion, creating a substantial financial incentive for prompt remediation of identified issues.
The timing of these enforcement actions, nearly two years after DSA implementation began, suggests the Commission has moved beyond initial compliance assessments toward detailed operational auditing. This progression indicates that platform operators should expect continued scrutiny of their risk management, content moderation, and user protection systems under the expanding regulatory framework.


