Meta and TikTok Face EU Fines Over Privacy and Safety Rules

The European Commission has preliminarily found that Meta and TikTok violate transparency and safety rules in the Digital Services Act, citing failures in researcher data access, user reporting of illegal content, and appeals processes.

Martin Holloway · Published 2w ago · 5 min read · Based on 7 sources

On October 24, 2025, the European Commission announced that Meta (which owns Facebook and Instagram) and TikTok have broken rules in the Digital Services Act, a major European law that controls how large social media platforms operate. This is not a final ruling yet, but it signals serious compliance problems that both companies will need to fix.

The Commission identified four main areas where Meta has fallen short: its platforms do not share enough information about how they work, they do not give researchers enough data to study their operations, their systems for users to report illegal content are weak, and their appeals processes (for users who want to challenge a removed post) are not working well.

What Meta Got Wrong

The European Commission started looking into Facebook and Instagram in April 2024 and opened a second investigation in May 2024 focused on how well the platforms protect children. The latest findings come after more than a year of formal scrutiny.

The researcher data access issue is particularly important. The Digital Services Act requires large platforms to let academic researchers access publicly available data so they can study whether the platforms cause harm to society. Meta's current system does not give researchers enough access, which means fewer independent studies can check whether Meta is operating fairly.

The user reporting and appeals systems are also failing. When someone sees illegal content on Facebook or Instagram, they should be able to report it easily. When Meta removes a post, users should have a clear way to say Meta made a mistake. According to the Commission, both systems are broken or unclear on Meta's platforms.

A Pattern That Repeats

We have seen this before. When the General Data Protection Regulation (GDPR) — an earlier European privacy law — first came into effect in 2018, major platforms resisted and delayed compliance. They eventually changed their systems. What is happening now with the Digital Services Act follows a similar path: the regulators issue preliminary findings, the companies respond, and operations shift over time.

The Digital Services Act is the EU's biggest attempt yet to control how platforms operate. It covers everything from content moderation to how algorithms work, and it applies to any platform with more than 45 million monthly active users in Europe.

Protecting Children Online

The Commission is also closely focused on age verification: making sure children under 13 cannot create accounts. Both Meta's platforms and Snapchat require users to be 13 or older, but according to the Commission, the platforms are not actually checking ages properly. Younger children are signing up anyway, and the platforms are not removing them once they are discovered.

Meta has not built strong enough systems to verify that new accounts belong to people over 13. Once someone creates an account, Meta is also not assessing whether the content they see is appropriate for their actual age. Snapchat faces the same criticism.

The Commission has started building a shared age verification tool that could work across multiple platforms. The details are still unclear, but the idea is to create a single system that platforms must use rather than each platform building its own.

What Comes Next

The findings are preliminary, so Meta and TikTok have a chance to fix these problems before the Commission issues a final decision. However, the range of issues — spanning transparency, researcher access, user safety, and child protection — suggests both companies have significant work ahead.

The broader picture here is worth understanding. The Commission is moving beyond checking whether platforms have written the right policies. It is now auditing how platforms actually work day-to-day. This marks a shift toward treating large social platforms less like private companies and more like public infrastructure that needs to be transparent and accountable.

The Digital Services Act allows the EU to fine platforms up to 6% of their global annual revenue. For Meta and TikTok, that is billions of dollars. That financial pressure will likely push both companies to prioritize compliance quickly.

These enforcement actions are likely to set a template for how the EU will regulate other major platforms going forward. If you work in social media or tech policy, watch how Meta and TikTok respond — their decisions will probably shape what regulators expect from everyone else.