Indonesia’s age-restriction policy for social media access needs more than just a decree

By Muhamad Erza Aminanto and Dyah Pitaloka

To ensure the ban works, Indonesia needs to follow Australia's example and establish an independent, statutory eSafety Commissioner to keep watch over social media platforms. 

Regulating children’s access to social media is not enough without independent oversight, robust infrastructure, and accountability for tech platforms. Image: Canva

The recent enactment of Government Regulation (PP) No. 17 of 2025 in Indonesia, popularly known as PP TUNAS (Tunggu Anak Siap), which restricts children under 16 from accessing “high-risk” social media platforms from March 28, 2026, marks a significant step in Indonesia’s digital sovereignty.  


This regulation mirrors Australia’s earlier approach through the Online Safety Amendment (Social Media Minimum Age) Act 2024. However, we must sound a note of caution.  


While the policy is commendable, a law is only as effective as the technical and institutional infrastructure that enforces it.  


Our biggest question is: Why does the burden of “fixing” social media fall on children and parents instead of on the companies designing these systems?  


Without a robust, independent oversight body, Indonesia risks creating a "paper tiger" that offers a false sense of security while leaving 80 million young Indonesians vulnerable. 

The scale of the challenge 


The urgency is backed by sobering data.


According to the 2025 APJII (Indonesian Internet Service Providers Association) survey, internet penetration among Generation Alpha (born 2010–2024) has reached 79.73 per cent, while Generation Z stands at 87.8 per cent.  


More alarmingly, UNICEF Indonesia (2023) reports that 42 per cent of children have felt uncomfortable or frightened by online experiences, and over 48 per cent have experienced cyberbullying. 


This issue cannot be viewed in isolation; it must be placed in its macro context, against the larger structures and forces that together shape children’s interactions with social media.  


Children who are doomscrolling past bedtime are not making a choice.


Social media platforms are designed to make stopping almost impossible, and children are simply responding to that design. Features such as infinite scroll, autoplay videos, and algorithmic feeds are engineered to maximise engagement.  


We are not just regulating a hobby; we are regulating the primary environment where Indonesia’s youth now live.

Learning from the ‘gold standard’ 


Indonesia's move emulates Australia's lead, but the latter’s success isn't just in the ban. It’s in the eSafety Commissioner.  


This independent statutory office is a blueprint for functional oversight.  


Muhamad Erza Aminanto and Dyah Pitaloka emphasised the importance of an independent committee tasked with overseeing the implementation of regulations and acting as a neutral arbiter between citizens and Big Tech. Image: Monash University, Indonesia

In its 2023-2024 performance report, the Commissioner successfully removed over 80 per cent of reported cyberbullying and image-based abuse content, with 70 per cent of complaints triaged within just three hours. 


The Australian model works because it has a dedicated team of digital forensic investigators and legal powers to issue formal removal notices. They started with “understanding the problem”.  


In its “Safety by Design” toolkit, the eSafety Commissioner identified the different ways platforms and services can be exploited to facilitate child exploitation and abuse online, either directly or in conjunction with other platforms.  


This type of thorough assessment should be the first step the Indonesian government takes in implementing a robust safety-by-design approach.  


For Indonesia to succeed, we must move beyond "regulatory FOMO" and build the functional architecture behind the law. 

The missing pillars: Infrastructure and independence 


Currently, Indonesia lacks three critical pillars required to turn PP TUNAS into a reality by March 2026:


  1. An independent oversight committee: Protection cannot be an "extra task" for existing ministries. We need a body, similar to the eSafety Commissioner, that is free from political volatility to act as a neutral arbiter between citizens and Big Tech. 
  2. Centralised reporting infrastructure: Where does a parent go when a platform fails to verify an age? Current mechanisms are fragmented. We need a high-throughput digital forensics pipeline that can handle millions of reports transparently. 
  3. Audit and remediation: We must have the technical capacity to audit the Age Verification (AV) technologies used by platforms. Without independent auditing, platforms might opt for "checkbox" solutions that either fail to protect children or, conversely, collect excessive biometric data, creating new privacy risks. 

The risk of selective enforcement 


The most critical challenge in the context of regulation is the perception of selective enforcement. In the realm of cybersecurity, a firewall that only filters some traffic is not a firewall; it is a vulnerability.


If PP TUNAS is enforced inconsistently, penalising certain platforms while ignoring others based on political or economic clout, it will lose all credibility.  


As the Minister of Communication and Digital Affairs, Meutya Hafid, recently stated, "No digital economy is worth sacrificing child safety."  


This principle must be applied universally.  


From global giants like Meta and TikTok to local providers, the rules for age verification and data protection must be ironclad and impartial. 


Comprehensive regulation is essential considering the pervasiveness of algorithmic systems. This includes ensuring transparency and auditability, effective reporting and redress mechanisms, children’s rights risk assessments, independent audits, and restrictions on targeted advertising. 

A call for digital realism 


The Indonesian government must shift its focus from the "what" (the law) to the "how" (the technology). We need to invest in:


  • Privacy-preserving age verification: Ensuring that verifying a child's age doesn't lead to even more intrusive data collection. 
  • Technical capacity building: Training investigators specifically for the e-safety and digital forensics domain. 
  • Public-private accountability: Establishing clear Service Level Agreements (SLAs) for social media companies regarding child safety responses. 

If the commitment is to protect young people online, the “how” must include the government’s willingness to regulate social media as the powerful media industry it is. 


Banning children’s access to social media would shift the responsibility for safety from the platforms that create the environment to the children who navigate it, their parents, and educators.  


It is imperative that regulation should require platforms to prevent and mitigate risks to children’s rights by design and by default and hold platforms accountable for failures. 


PP TUNAS is a necessary foundation, but it is not the finished building. Emulating Australia’s age limits without emulating its eSafety infrastructure is like installing a high-tech lock on a door with no walls.  


For the sake of Indonesia’s “Golden Generation”, we must ensure we don’t just have a new regulation; we must build a functioning, fair, and formidable digital ecosystem. 



Muhamad Erza Aminanto is Assistant Professor of Cyber Security at Monash University, Indonesia. His current research interests include information security, artificial intelligence, anomaly detection, intrusion detection, cybersecurity, digital transformation, and smart cities. 


Dyah Pitaloka is Associate Professor of Digital Communications and Marketing at Monash University, Indonesia. Her research explores the social, cultural, and policy dynamics of digital technology and how these influence the wellbeing of individuals and groups, especially marginalised communities.