To ban or not to ban: EU countries debate social media age limits

Jul 23, 2025 - 11:01

As online risks for children draw increasing scrutiny across Europe, policymakers are stepping up efforts to introduce stricter age verification rules and protect children online. From EU-level guidelines to national pilot programmes for age assurance technology, the debate over how – and how far – to go is accelerating.

The most far-reaching proposal pushed by several member states, including France and Spain, is to ban children under a certain age from accessing social media altogether. 

Proponents say it’s necessary to protect children under the age of 15, or 16 in Spain’s case, from the harmful effects of social media. They point to studies linking social media to anxiety, depression, and low self-esteem, as well as to cyberbullying and online predators.

Most social media platforms, including Facebook, Instagram, and TikTok, set the minimum age for creating an account at 13. But in practice, it’s easy for even younger children to bypass those rules by simply lying about their age – and many do.

However, a European Commission spokesman made it clear last month that the Commission has no intention of introducing an EU-level age ban, though it is leaving the door open for national governments to bring in their own legislation if they so wish.

Age verification is coming to five EU countries

That doesn’t mean the EU is twiddling its thumbs. 

The Commission announced earlier this month that it is testing a prototype of an age verification app, which will initially be rolled out in Denmark, France, Spain, Greece and Italy.

Verification should make it possible to anonymously check the age of users without storing personal data such as their name or date of birth.
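
How such a privacy-preserving check might work can be sketched in a few lines of code. The example below is purely illustrative: the function names and the use of a shared HMAC key (a stand-in for a real digital signature) are assumptions, not the Commission’s actual design. The idea is that the trusted verifier sees the date of birth but issues only a signed “over the threshold” claim, while the platform checks that claim without ever receiving any personal data.

import hashlib
import hmac
import json
import secrets
from datetime import date

# Hypothetical stand-in for the verifier's signing key; a real deployment
# would use asymmetric signatures so the platform never holds the secret.
SIGNING_KEY = secrets.token_bytes(32)

def issue_attestation(date_of_birth: date, threshold: int) -> dict:
    # The trusted verifier checks the birth date locally and returns only a
    # signed yes/no claim; name and date of birth never leave this step.
    today = date.today()
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    claim = {
        "over_threshold": age >= threshold,
        "threshold": threshold,
        "nonce": secrets.token_hex(8),  # fresh nonce so tokens are not linkable
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["sig"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return claim

def platform_accepts(attestation: dict) -> bool:
    # The platform only verifies the signature and reads the boolean claim;
    # it never learns the user's identity or exact age.
    sig = attestation.pop("sig")
    payload = json.dumps(attestation, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and attestation["over_threshold"]

token = issue_attestation(date(2012, 5, 1), threshold=15)
print(platform_accepts(token))  # False: the holder is under 15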

The long-term plan is to integrate the technology into the digital EU ID card (eID) – a type of official online proof of identity that will be available from the end of 2026.

Romania is also looking at getting stricter on age verification and children’s access to social media. It is currently debating a bill aimed at protecting minors from harmful content on Very Large Online Platforms (VLOPs) – platforms or search engines with more than 45 million monthly users in the EU.

The proposed legislation would require platforms to implement strict age verification, enable parental controls, issue monthly activity reports to guardians, and swiftly respond to authorities’ alerts about harmful content. It targets material inciting violence, promoting eating disorders or self-harm, and exposing minors to nudity or illegal behaviour. Platforms would also be banned from monetising live content featuring minors without consent. Non-compliance could result in fines of up to 3 percent of global turnover. A parliamentary report is due on September 3.

In France, there has been a crackdown on minors accessing pornography. The country’s highest administrative court ruled last week that major pornographic websites like Pornhub and Youporn must implement age verification to block underage access, overturning a previous suspension.

The government has pushed for enforcement under a 2024 law, citing figures showing over half of 12-year-old boys access such sites monthly. Platforms argue the rules breach EU law and raise privacy concerns, proposing tech companies like Apple or Google handle verification. France’s regulator instead backs a “double-blind” third-party system to protect user anonymity.

Denmark wants to lead the way

Denmark, which took over the rotating EU Council Presidency this month, has pledged to prioritise online child protection during its six-month term.

“It’s hard to imagine a world where kids can enter a store to buy alcohol, to go to a nightclub by simply stating that they are old enough, no bouncers, no ID checks, just a simple ‘Yes, I am over the age of 18’,” Danish Digital Minister Caroline Stage Olsen said.  

“Children deserve a safe digital childhood. This is one of the main priorities for me during the Danish Presidency. Without proper age verification, we fail to protect children online,” she added.

The EU also published recommendations for online platforms under the Digital Services Act (DSA), aimed at ensuring children’s safety and preventing their exposure to dangerous behaviour.

These include removing “addictive” features such as “read receipts”, which tell users when someone has seen their message, making it easier for minors to block or mute other users, and preventing accounts from downloading or taking screenshots of content.

The EU also recommended that platforms turn off notifications by default, especially during sleeping hours, limit apps’ access to photos, and turn off the camera by default.

Another element targets online grooming: platforms should set minors’ accounts to private by default – i.e. not visible to users who are not on their friends list – to minimise the risk of them being contacted by strangers.

Still, some national leaders say more binding measures are needed. Belgian Minister for Digitalisation Vanessa Matz called the guidelines a step in the right direction, but believes the EU should dare to go even further.

“The guidelines only impose strict age verification for platforms offering alcohol, gambling, or pornographic content. For other platforms – even those with a minimum age of 13 or 16 – the Commission restricts itself to recommending age verification, without making it mandatory. However, the Commission is opening the door to true age verification on social media through national legislation,” she said. “I encourage Belgium to seize this opportunity. This framework will form the basis for the parliamentary debate after the summer, to develop legislation geared to digital challenges.” 

Eurochild, a network of organisations promoting children’s rights across Europe, argues that age assurance shouldn’t be about blocking kids from social media, but about identifying when a child is using a platform and tailoring protections to their needs, as part of a wider toolbox to protect children online.

“So for us, in general, bans don’t work. In general from the practical side because the technology, the age assurance technology is not there yet, but also it goes against children’s rights overall,” Fabiola Bas Palomares, Lead Policy and Advocacy Officer for Online Safety at Eurochild, said last month, before the EU’s latest guidelines were announced.

“Right now the debate is really focused on safety and these bans come from a place where policymakers are a little bit tired of online platforms not complying and not really providing that safety that they are supposed to do by law.”

She argues that children have a right to access information and to play, and that the focus should be on identifying and addressing the harms children face online by making platforms comply with the DSA, rather than on an outright ban.

Phone bans in school are also on the table

Several member states are also targeting children’s screen time at school. 

At the beginning of July, the Slovenian National Assembly passed changes to the Primary School Act restricting the use of electronic devices during teaching time in schools. Mobile devices will be permitted only when educationally essential.

Approved without a vote against, the changes also make IT a compulsory subject. The new IT and digital technologies subject for year 7 pupils – Slovenian primary school runs a nine-year programme – aims to provide foundational digital literacy.

In Bulgaria, the Ministry of Education is pushing to ban phones in schools outright, citing the impact on learning, attention spans, and children’s cognitive and emotional development. The proposed legislation, which would allow devices only for educational or medical use, is pending parliamentary approval.

As European countries experiment with different approaches, one thing is clear: the race to protect children online is accelerating and the outcomes may reshape digital childhoods across the continent.

Fact-check: Children under 14 won’t be fined for using a phone

A TikTok video in German claims that smartphones could be banned by law for all children and adolescents under the age of 14 from April 2025. The video claims they will be fined 500 euros if they are caught using one in public or at school.

The fact-checking team at dpa has debunked the claim: the German Ministry of Justice confirmed there is no such law, and in any case children under 14 cannot be fined in Germany.

Read the full fact-check here.

This article is published twice a week. The content is based on news from agencies participating in the enr.