The protection of children’s rights in the digital space is no longer a narrow compliance issue. For digital service providers, it is now a fundamental legal and organizational matter governed by a web of interlocking EU laws. If an online platform can be accessed by children, compliance does not stop at a privacy policy or general terms and conditions.
In such cases, the platform must prepare several interconnected documents and implement appropriate measures to ensure the protection of children. These documents and measures include:
- Privacy notices that are understandable to children (or specifically addressed to them).
- Documentation of parental consent and the appropriate process for obtaining such consent.
- Systemic risk assessments and/or data protection impact assessments that prioritize the protection of children’s rights.
- Appropriate age-verification mechanisms.
- General precautionary measures to be integrated into the operation of the service.
The question, therefore, is not whether a platform “has a privacy policy,” but whether its entire operational logic is aligned with the protection of minors. This is particularly important for services such as TikTok, Instagram, YouTube, Roblox, Snapchat, and any platform (e.g., online gaming platforms, social media sites, educational platforms, forums, chatbots) that has minor users.
DSA (Digital Services Act)
Article 28 of the Digital Services Act states that online platforms accessible to minors must implement appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors. In other words, the platform’s actual operation must reflect and address the risks posed to minors.
Furthermore, the article stipulates that advertising based on profiling may not be presented to a user whom the platform knows with reasonable certainty to be a minor. In addition, very large online platforms must specifically examine systemic risks affecting children’s rights in the risk assessment required under Article 34.
In practice, this means it is not enough for a platform to use an age-gate or a general “18+” label. The design, content display, advertising logic, and moderation mechanisms must also be defensible from a child protection perspective.
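To make the advertising rule concrete, below is a minimal sketch of how an ad-serving layer might gate profiling-based advertising. The UserSignals model and the signals it combines are hypothetical, and which signals amount to “reasonable certainty” is ultimately a legal judgement, not a technical one.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UserSignals:
    """Signals a platform may hold about a user's age (hypothetical model)."""
    declared_age: Optional[int]  # self-declared at registration, if any
    verified_minor: bool         # e.g. outcome of an age-assurance check
    inferred_minor: bool         # e.g. internal age-estimation flag

def is_minor_with_reasonable_certainty(user: UserSignals) -> bool:
    # DSA Article 28(2) applies when the platform is "aware with reasonable
    # certainty" that the user is a minor, so this predicate deliberately
    # errs on the side of treating ambiguous users as minors.
    if user.verified_minor or user.inferred_minor:
        return True
    return user.declared_age is not None and user.declared_age < 18

def select_ad_strategy(user: UserSignals) -> str:
    # Profiling-based ads are switched off for known minors; contextual
    # (non-profiled) advertising remains available.
    if is_minor_with_reasonable_certainty(user):
        return "contextual_only"
    return "profiling_based"
```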
Data Protection and GDPR (General Data Protection Regulation) Compliance
Article 8 of the GDPR sets specific rules for children’s consent. Where an information society service is offered directly to a child, processing based on consent is lawful only if the child has reached a specified age (16 by default, which Member States may lower to no less than 13); below that age, consent must be given or authorized by the holder of parental responsibility. The provider must also make reasonable efforts to verify that consent was properly obtained. Such services include, for example, educational apps for children, gaming platforms, and online social services. This provision is especially critical for any platform involving registration, age declaration, profiling, tracking, or targeted content delivery.
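As an illustration, the consent rule can be reduced to a threshold lookup per jurisdiction. The national thresholds below reflect our understanding of current rules (e.g. 16 in Hungary, which kept the GDPR default), but they are examples only and must be verified before being relied on.

```python
# Illustrative national ages of digital consent under GDPR Article 8
# (default 16; Member States may lower it to no less than 13).
# Verify the current national threshold before relying on these values.
AGE_OF_DIGITAL_CONSENT = {
    "HU": 16,  # Hungary kept the GDPR default
    "FR": 15,
    "ES": 14,
    "BE": 13,
}
GDPR_DEFAULT_AGE = 16

def consent_path(country_code: str, age: int) -> str:
    threshold = AGE_OF_DIGITAL_CONSENT.get(country_code, GDPR_DEFAULT_AGE)
    if age >= threshold:
        return "own_consent"  # the child may consent independently
    # Below the threshold, consent must be given or authorized by the
    # holder of parental responsibility, and the provider must make
    # reasonable efforts to verify it (Article 8(2)).
    return "parental_consent_required"
```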
Article 22 of the GDPR further restricts a provider’s options regarding automated decision-making and profiling. This is significant because recommendation systems, behavioural advertising, and automated moderation can directly influence a child’s behaviour and development. Consequently, these practices are generally prohibited for children and may be used only in very narrow cases, exclusively in the child’s best interest.
Under the GDPR, appointing a Data Protection Officer (DPO) is not automatically mandatory for platforms accessible to children, but only where the conditions under Article 37 are met (e.g., large-scale, regular monitoring). Nevertheless, such a service provider is typically required to conduct a data protection impact assessment, particularly where the processing of children’s personal data involves profiling, automated decision-making, or marketing, or where information society services are offered directly to children.
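The triggers just listed can be thought of as a simple checklist. The sketch below is illustrative only: whether a DPIA is actually required under Article 35 is a legal assessment that weighs context and scale, not a boolean test.

```python
# Hypothetical DPIA trigger checklist for children's data, loosely based
# on GDPR Article 35 and supervisory-authority guidance. The real
# assessment weighs context and scale, not just these flags.
def dpia_indicated_for_childrens_data(
    profiling: bool,
    automated_decision_making: bool,
    marketing: bool,
    offered_directly_to_children: bool,
) -> bool:
    return any([
        profiling,
        automated_decision_making,
        marketing,
        offered_directly_to_children,
    ])
```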
It is also important to note that data protection supervisory authorities, such as the National Authority for Data Protection and Freedom of Information (NAIH) in Hungary, generally expect greater care when processing children’s personal data. This includes, for example, providing children with information that is as understandable as possible given their age and circumstances, and giving increased weight to the child’s best interests.
AI Act
The AI Act mandates a risk-based approach and fundamental-rights protection for artificial intelligence systems. If a platform uses AI-based recommendation systems, chatbots, moderation tools, or age estimation, each of these functions must be assessed and classified under the Act’s risk categories. Notably, Article 5 prohibits AI systems that exploit the vulnerabilities of a person due to their age.
The Commission’s 2025 guidelines on the protection of minors also emphasize that platforms must implement safer default settings, effective age verification, and solutions that avoid harmful patterns. This is particularly vital for AI functions, as the technical solution itself can be a risk factor if it affects vulnerable user groups.
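By way of example, “safer default settings” typically means that accounts identified as belonging to minors start from a restrictive configuration. The settings and names below are hypothetical; the right defaults depend on the platform’s features and its own risk assessment.

```python
# Hypothetical "safe by default" profile for accounts identified as
# belonging to minors; every key below is illustrative, not prescribed
# by the guidelines themselves.
MINOR_SAFE_DEFAULTS = {
    "account_visibility": "private",        # not discoverable via search
    "direct_messages": "contacts_only",     # no unsolicited contact
    "personalised_recommendations": False,  # no profiling-driven feed
    "geolocation_sharing": False,
    "autoplay": False,                      # avoids engagement-maximising loops
}

def apply_minor_defaults(settings: dict, is_minor: bool) -> dict:
    """Overlay the restrictive defaults onto the account's settings;
    relaxing them later should itself be a controlled, logged action."""
    if not is_minor:
        return settings
    return {**settings, **MINOR_SAFE_DEFAULTS}
```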
Compliance with Consumer Protection Requirements
The Unfair Commercial Practices Directive (2005/29/EC) and the Hungarian legislation transposing it are essential here, because children, as consumers, are in a position requiring special protection. The directive prohibits misleading and aggressive commercial practices. In practice, this extends to manipulative interface design (dark patterns), artificial urgency in messaging, and marketing that exploits children’s vulnerability. This is especially important where the platform’s revenue model is based on virtual currencies, micropayments, reward mechanisms, or behaviour-inducing design.
It is evident that child protection does not stop at the level of data protection or platform regulation: consumer protection rules are equally part of compliance.
Systemic Compliance
For platforms accessible to children, compliance is determined on three levels:
- Documentation: Clear rules and notices tailored specifically to this target group.
- Operation: The platform should be designed to reduce risks rather than encourage harmful behaviour, and must not use solutions that manipulate children or have a detrimental effect on their development and behaviour.
- Organizational Compliance: Where necessary, there should be a DPO, risk assessments, impact assessments, and internal rules specifying who is responsible for adhering to data protection, platform, and consumer protection requirements, and who is accountable for their proper implementation.
This should be treated not as a retrospective correction, but as part of the platform’s entire legal and technical architecture.
When to Contact Us?
At SimpLEGAL, we help translate these obligations arising from the legislative maze into a functional operational framework. This includes:
- Reviewing child-oriented functions.
- Assessing whether current consent and age-verification practices comply with Article 8 of the GDPR.
- Analysing automated decision-making based on Article 22 of the GDPR.
- Aligning platform risk assessments with Articles 28 and 34 of the DSA.
Furthermore, we provide advice on AI management, risks of unfair commercial practices, internal policies, and documentation for regulatory and supervisory authorities.
Key Takeaways
The digital protection of children no longer rests on a single law, but on the coordinated application of multiple layers of EU and Hungarian legislation. Anyone operating a platform accessible to children must not only process data lawfully but also manage the risks outlined above.