EU accuses Meta of failing to keep children off Facebook and Instagram

The EU has accused Meta of failing to stop children under 13 from accessing Facebook and Instagram, in violation of the Digital Services Act. A fine of up to 6% of Meta’s global annual revenue is at stake.


The European Commission announced on 29 April that Meta has failed to prevent children under the age of 13 from accessing Facebook and Instagram, in violation of the bloc’s Digital Services Act. The preliminary finding puts Meta at risk of a fine of up to 6% of its global annual revenue.

The Commission said Meta’s platforms allow minors to create accounts by entering false birth dates, with no effective system in place to verify the age of new users or to detect and remove underage accounts after they have been opened.
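That gap is easy to picture. A purely self-declared age gate, sketched below in hypothetical Python (an illustration, not Meta’s actual code), passes or fails on nothing more than the date the user types, so a child who enters a false birth date clears it instantly.

```python
from datetime import date

MINIMUM_AGE = 13  # the platforms' stated minimum signup age

def can_register(birth_date: date, today: date) -> bool:
    """A self-declared age gate: it trusts whatever birth date the user types in."""
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age >= MINIMUM_AGE

# Fixed reference date so the example is deterministic (arbitrary choice).
CHECK_DATE = date(2025, 4, 29)

# A ten-year-old who enters their real birth date is blocked...
print(can_register(date(2015, 6, 1), CHECK_DATE))   # False
# ...but is admitted the moment they type an earlier, false one.
print(can_register(date(2000, 6, 1), CHECK_DATE))   # True
```

The Commission’s objection is precisely that nothing behind a check of this kind verifies the declared date or catches underage accounts afterwards.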

EU regulators also found that Meta was not adequately assessing the risk of children being exposed to age-inappropriate content on both platforms, as reported by NPR.

What the Digital Services Act requires

The Digital Services Act (DSA), which began applying to very large online platforms such as Facebook and Instagram in August 2023, places specific obligations on online services to protect minors.

These include robust age verification at the point of sign-up, systems to detect and remove underage users who have already created accounts, and ongoing risk assessments for content that minors may encounter.

The European Commission’s preliminary finding is that Meta has fallen short on all three counts.

Meta disputed the preliminary findings in a statement, saying the company has measures in place to detect and remove underage accounts and to limit the content accessible to younger users.

Meta now has an opportunity to respond formally to the Commission before a final ruling is issued. If the Commission confirms the findings and proceeds to a penalty, the maximum fine under the DSA is 6% of a company’s worldwide annual revenue.
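For a sense of scale: Meta reported revenue of roughly $135 billion in 2023, so the maximum penalty under that formula would be in the region of $8 billion (6% of $135 billion ≈ $8.1 billion).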

What this means beyond Europe

The Meta case is the latest in a string of regulatory actions worldwide targeting the safety of minors on social media platforms.

Australia passed legislation in 2024 banning children under 16 from social media entirely, and several US states have enacted age-verification requirements.

South Africa does not yet have equivalent legislation, though the Protection of Personal Information Act (POPIA) imposes data-handling obligations on platforms operating in the country, including specific protections for children’s personal data.

For South African parents and schools, the practical implication of the EU finding is that Meta’s own internal safeguards are, in the assessment of one of the world’s most powerful regulators, not working as described.

The DSA process is unfolding in Europe, but the platforms and the age-verification gaps at the centre of it operate identically in South Africa.

The Commission’s final decision will set the threshold for what constitutes DSA compliance, and it will likely shape how other regulators, including those developing child online protection policy in South Africa, measure the adequacy of social media platforms’ own safeguards.