Worldwide Shift in Social Media Access: New Rules Block Kids from Social Networks

Several countries have introduced or proposed rules to limit children’s access to social media. These measures aim to address concerns over mental health, cyberbullying, and excessive screen time. Platforms must now verify user ages, often through ID checks or biometrics. Australia implemented the first nationwide ban in late 2025. Others, including France, Malaysia, Denmark, and Norway, have followed with plans or partial steps. As of early 2026, only Australia officially enforces a full restriction.

Australia’s law took effect on December 10, 2025. It prohibits users under 16 from holding accounts on platforms such as Facebook, Instagram, TikTok, Snapchat, YouTube, X, Reddit, Twitch, and Kick. Gaming apps, messaging services, and sites for health or education remain exempt. Companies must use reasonable methods, such as ID scans or video selfies with facial recognition, to block minors. Companies that fail to comply face fines of up to AUD $49.5 million (about USD $33 million). Parents and children face no penalties. Initial enforcement removed some accounts, though users report workarounds such as VPNs or false age entries.

France advanced a similar policy in January. The National Assembly passed a bill last week to bar children under 15 from social media. The measure voids contracts between platforms and minors, aligning with the European Union’s Digital Services Act. Verification starts for new accounts in September 2026 and extends to all users by the end of that year. This affects adults as well, since platforms must check all users’ ages. Data shows 90% of French youth aged 12 to 17 use social media daily, spending two to five hours a day on smartphones. Concerns about reduced self-esteem and exposure to harmful content fuel the effort. President Macron supports accelerating the timeline. The proposal now goes to the Senate and may require review by the European Court of Justice.

Malaysia approved a 16-year minimum age in late 2025 under its Online Safety Act. The original plan targeted January 1, 2026, with eKYC verification using government IDs. Platforms would enforce blocks for younger users. Delays pushed testing into early 2026, with full rollout now expected around July. No ban is in effect yet.

Denmark agreed in November 2025 to restrict access for those under 15, potentially allowing exemptions for 13- and 14-year-olds. The policy aims for implementation in 2026, but no legislation had passed or taken effect as of February.

Norway has introduced a bill requiring a 15-year minimum age, with parental consent as an option. It remains under discussion without approval or a start date.

Few other nations match these efforts. China applies screen-time caps through minor modes. Italy mandates parental consent for users under 14, and Germany for users under 16. The EU is considering a bloc-wide 16-year limit, with parental consent possible for 13- to 15-year-olds. Britain prioritizes content controls over age bans.

Social media companies now navigate complex compliance. Age verification systems demand investment in technology and raise privacy questions. Inconsistent rules across borders create operational challenges. Fines threaten revenue, particularly for platforms reliant on young demographics for growth and advertising. Large operators develop shared tools, while smaller ones risk falling behind. Early data from Australia suggests partial success but persistent evasion tactics.

Legal experts predict more countries will adopt similar frameworks as evidence on youth harms grows. Platforms prepare for expanded verification, which could alter user acquisition strategies. Enforcement effectiveness will shape future policies. 
