IEEE SA, with UNICEF and the Greek Ministry of Digital Governance, hosted a UN WSIS+20 High-Level Meeting Side Event on Designing Responsibly, focusing on age-appropriate standards in the Digital Era.
If you’ve ever watched a child fall into the endless loop of autoplay → scroll → recommendation → repeat, you already know the conversation about children online can’t be reduced to “just use parental controls.”
This workshop made a clear case: protecting children in digital spaces requires shared responsibility, measurable interventions, and systems designed for well-being, not maximum engagement.
The workshop provided a practical tour through three complementary lenses: national policy, child-centered AI and data governance, and standards as the bridge from principles to practice.
Greece’s “Digital Vulnerability Triangle” (and Why It Matters)
Marinos Tsokas, speaking on behalf of Greece’s Ministry of Digital Governance, opened with a candid premise: digital addiction doesn’t happen by accident. He framed the risk as a “triangle of digital vulnerability,” driven by:
- Algorithmic profiling: Platforms track children in real time and feed hyper-personalized content.
- Addictive design: Features like infinite scroll, autoplay, and gamification are engineered to prevent disengagement.
- Digital asymmetry: A mismatch between “tech giants with unlimited resources” and families without equivalent tools, time, or clarity—made worse by complex privacy policies.

The result is not only lost time; it’s a growing set of mental-health and developmental concerns—from anxiety and depression to sleep disruption—paired with a digital environment that prioritizes engagement as the main success metric.
Marinos noted that we need to redefine success: not "more engagement," but "healthy disengagement." To do that, he introduced Greece's Three-Pillar Strategy: Regulation, Tools, and Education.
Greece’s response is designed as a holistic national strategy—not a fragmented “good luck, parents” approach. Their framework rests on three pillars that must work together:
- Legislation & regulation (mandatory red lines)
- Technological tools (scalable protections)
- Education & awareness (fast-moving adaptation)
Some of the Greek policy proposals discussed included:
- Moving beyond ineffective “I’m 18” pop-ups toward cross-platform age assurance with robust verification mechanisms
- Advocating for a digital age of consent at 15 across Europe
- Targeting “architectures of addiction” by restricting features like infinite scroll and autoplay, and adding time/content boundaries
- Requiring child-comprehensible language in registration and policies
- Establishing accessible reporting mechanisms when things go wrong
- Introducing a “right to reset” so minors can wipe behavioral history and reduce profiling-based recommendations
- Flagging AI-generated content and bots so children know when they’re interacting with synthetic media or systems
Importantly, Tsokas emphasized that the strategy is not static: Greece conducts annual impact assessments and recalibrates the national strategy based on surveys and ongoing monitoring.
A major highlight was Greece's effort to pair policy with public infrastructure, using technology to counter the harms technology creates. Tsokas described a state-backed application ("Kids Wallet") created to help parents:
- Control access and regulate screen time
- Use government-backed identity verification pathways (reducing the need to upload IDs to private platforms)
- Store a form of age proof that can be presented without exposing a full personal ID
On the education side, Greece also created a central resource hub called PARCO ("recreational space" in Greek): a one-stop site where parents and children can access guidance without having to hunt through scattered sources. Tsokas also mentioned a dedicated school day focused on living in the digital world, aimed at boosting digital literacy.
UNICEF’s Core Point: You Can’t Talk About AI Without Talking About Data
Jasmina Byrne of UNICEF widened the lens: the internet wasn’t built with children in mind, and even as child-targeted services proliferate, children’s rights and needs still too often remain peripheral.
Jasmina identified three critical gaps:
- Rights gap: Children’s rights aren’t consistently prioritized in tech design and governance.
- Participation gap: Children rarely have meaningful input into the tools shaping their lives.
- Knowledge gap: We’re still catching up on how digital systems affect children’s development, opportunity, and well-being over time.
UNICEF’s approach centers on the UN Convention on the Rights of the Child and frames AI governance around three critical rights:
- Protection (safety and safeguarding)
- Provision (using AI for children’s benefit—education, health, support)
- Participation (children having a voice in policy, design, and deployment)

From there, Jasmina emphasized a key structural reality: the next phase of AI will be defined by data infrastructure, cross-border data flows, and computational capacity—all of which have real consequences for children because so much of the data fueling these systems is children’s data (from education platforms, health tools, and everyday apps).
One example presented came from a cited study showing that a 16-year-old's mobile app data was transmitted to 114 different actors, many of them unknown to the child, often for advertising and economic gain.
The underlying message: children’s data is not just “smaller adult data.” Children are more vulnerable, less able to understand long-term consequences, and “consent” alone is an inadequate safeguard when systems are opaque and incentives are misaligned.
IEEE’s Role: Standards as the Bridge Between Values and Implementation
Moira Patterson grounded the discussion in a practical question: once we agree on principles, how do we implement them consistently? That’s where standards come in.
Standards are one of the most effective tools for turning broad commitments—privacy, safety, well-being—into repeatable processes, shared definitions, and market-ready practices. Moira noted that standardization can be supported by additional mechanisms like certification, conformity assessment, registries, open-source solutions, and training.

Moira also placed this work in a global context: policy momentum is accelerating worldwide (Age-Appropriate Design Codes, the EU Digital Services Act, US policies, African Union initiatives, and newer national efforts). But policy often lags innovation—meaning standards can help fill the “in-between” space by creating implementable frameworks while laws evolve.
A central theme emerged: if we designed digital services with children in mind from the beginning, we wouldn’t need to rely on regulatory interventions and exclusions later.

Panel Discussion Key Takeaway: Implementation Is the Hard Part—and Cooperation Is Non-Negotiable
During the panel Q&A, the speakers converged on two recurring implementation challenges:
- Coordination across pillars and stakeholders (regulation, tools, and education must reinforce each other)
- Speed mismatch (technology evolves faster than policymaking)
Education and awareness were repeatedly positioned as the “fastest moving” lever—essential not only for parents and children, but also for teachers, under-resourced regulators, and startups that may not even realize the risks they introduce.
One panelist summarized responsibility with an old saying: “It takes a village to raise a child.” In the AI era, that village includes families, educators, policymakers, regulators, platforms, developers, standards bodies, and international organizations—and the village needs shared rules of the road.
The discussion also touched on high-profile policy actions like Australia’s social media ban, noting both the enforcement challenges (kids find loopholes) and a key intent: shifting responsibility from children to platform companies to do what’s reasonably within their power to prevent harm.
The Key Outcome: Child Safety Alone Is Not Enough; We Also Need Child Digital Rights and Well-Being
Across all presentations, one message recurred: we need to move from child safety to child digital well-being. The ambition is a healthy relationship with technology that respects children's rights in every interaction.
That means:
- Designing platforms where disengagement is possible and supported
- Limiting manipulative architectures of addiction
- Giving children understandable interfaces, real choices, and meaningful protections
- Governing children’s data as a special category—not a marketing resource
- Creating standards and tools that make “responsibility by design” feasible at scale
- Staying flexible and continuously recalibrating as technology changes
The workshop concluded that this is "the start of the journey," and the path will keep changing.
The work ahead is less about finding one perfect fix and more about building a system that can keep learning, adapting, and protecting children as the digital world evolves.
Participants on the panel were Jasmina Byrne, Chief of Foresight and Policy, UNICEF; Steven Wosloo, Digital Policy Expert, UNICEF; Moira Patterson, Global Market Affairs & Standards Partnership Director, IEEE SA; Karen Mulberry, Senior Manager, Technology Policy, IEEE SA; and Marinos Tsokas, Advisor to the Secretary General of Telecommunications and Post, Ministry of Digital Governance, for Electronic Communications and Digital Policy Issues.