Protection of minors online

2025/2060(INI)

The European Parliament adopted by 483 votes to 82, with 86 abstentions, a resolution on the protection of minors online.

Parliament noted that 97% of young people use the internet daily and that 78% of 13- to 17-year-olds report checking their devices at least once an hour, while 46% do so almost constantly. One in four children and young people exhibits ‘problematic’ or ‘dysfunctional’ smartphone use, i.e. behavioural patterns that mirror addiction.

Implementation and enforcement of existing legislation

Parliament underlined that the Digital Services Act (DSA) requires online platforms accessible to minors to ensure a high level of privacy, security and safety. It welcomed the European Commission's guidelines, which support default protection settings, reaffirm the responsibility of application providers and advocate tools such as online safety-by-design codes and child rights impact assessments. However, it noted that these guidelines are not legally binding and should be improved, particularly as regards protecting minors from addictive design and from profiling- and engagement-based recommendations.

The resolution underlined the importance of swiftly implementing and effectively enforcing the Digital Services Act and other relevant EU legislation. It welcomed the Commission's ongoing investigations into the protection of minors online and urged it to conclude them promptly and to take all necessary measures, including the imposition of fines and effective corrective measures.

Parliament expressed concern about the recruitment of minors by criminal networks on online platforms and about the lack of more ambitious mitigation measures to protect minors, in particular as regards notice and action, hyper-personalised and engagement-based recommendation algorithms that lead to addictive behaviour, and dark patterns. It was also alarmed by the recent trend of some major online platforms relaxing their content moderation practices.

Parliament considered that the risk assessments concerning the online safety of minors carried out by very large online platforms and search engines are often inadequate, and called on the Commission to make full use of the tools available under the Digital Services Act to address this problem. Members expressed concern about the continued failure of large digital platforms to adequately protect minors on their services and about the significant delays in appointing digital services coordinators in several Member States. The Commission is urged to ensure a harmonised approach to the enforcement measures taken by Member States, including through harmonised operational procedures for digital services coordinators.

Parliament encouraged the Commission to strengthen the protection of minors online through the future Digital Fairness Regulation. It considered that persuasive technologies used by online actors, such as targeted advertising, influencer advertising, addictive design, loot boxes, in-app currencies and dark patterns, should fall within the scope of that regulation in order to close legal gaps and better protect minors.

Age verification and parental controls

Parliament noted that the disparities in current age assurance measures (verification, estimation or self-declaration) are fragmenting the internal market. It called on the Commission to present, where necessary, appropriate legislative measures to ensure legal certainty and guarantee a harmonised approach to secure and reliable age assurance mechanisms.

Given concerns about risks to children's fundamental rights, Members insisted that any legislation in this area avoid enshrining surveillance practices and prioritise the most effective and least invasive measures. They reiterated that it is the provider's primary responsibility to ensure age assurance mechanisms for minors accessing its services.

Parliament called for a harmonised European digital age limit of 16 as the default threshold below which access to online social media platforms should not be allowed without parental or guardian consent; the same age limit should apply to video-sharing platforms and to AI companions that pose risks to minors. It further called for a harmonised European digital age limit of 13, below which no minor may access social media platforms, even with parental consent. The Commission is invited to consider introducing personal liability for senior management in cases of serious and persistent non-compliance with the provisions on the protection of minors set out in the Digital Services Act.

The resolution noted that parental control tools, even when used, are not always easy to find or manage and can easily be circumvented by minors. Such tools should be user-friendly, intuitive and easy to find and understand for all parents and guardians, including those with disabilities. Platforms must take greater responsibility for promoting parental control systems and improving their effectiveness.

In order to address the shortcomings of current European Union legislation, Parliament recommended, among other things:

- the prohibition of the most dangerous addictive practices and the default deactivation of other addictive features for minors (including ‘infinite scrolling’, ‘autoplay’, ‘pull-to-refresh’, disappearing stories, reward loops and harmful gamification practices);

- the clarification and strengthening of existing legislation governing deceptive interfaces (dark patterns), which are used by 97% of the most popular websites and applications accessed by consumers in the EU;

- the guarantee of a high level of protection for minors who play video games, in particular by prohibiting loot boxes, wheels of fortune, random prize wheels and card games in exchange for real money in games likely to be accessible to minors, as well as measures to address the risks associated with in-app currencies, microtransactions and ‘pay-to-progress’ and ‘pay-to-win’ mechanisms;

- the strict application of the Toy Safety Regulation and the Artificial Intelligence Act (AI Act);

- the protection of minors against commercial exploitation, in particular by prohibiting platforms from offering financial incentives for ‘kidfluencing’ (minors acting as influencers);

- urgent action to address the ethical and legal challenges posed by generative AI tools, including deepfakes, companionship chatbots, AI agents and AI-powered nudity apps capable of generating manipulated, non-consensual images.

Lastly, Parliament recognised the importance of media and digital literacy in empowering minors to navigate online environments safely and responsibly and to apply critical thinking. It emphasised the need to provide parents and guardians with adequate training and guidance to help them support their children’s digital experience.