World Governments calling for tighter regulations on minors’ access to social media
———————————————-
Governments around the world are moving to tighten regulations on children’s and adolescents’ use of social media, citing growing concerns over digital addiction, exposure to harmful content, and the impact of excessive screen time on mental health and academic performance.

In recent years, educators and parents have reported that many children spend long hours on digital applications, affecting concentration in classrooms and contributing to fatigue and mental distraction. These concerns have prompted legislative and policy responses aimed at limiting minors’ access to social media platforms.

Several parliaments have approved or debated laws banning social media use for children under 15. France has introduced legislation prohibiting mobile phone use in secondary schools, while Australia moved in December 2025 to ban social media access for those under 16. Malaysia has announced plans to impose a similar restriction from 2026.

Across Europe, countries including Germany, Italy, Spain, and Greece have begun enforcing stricter legal limits on minors’ use of smartphones and digital applications. The United Kingdom has also said it is considering restrictions on younger teenagers’ access to social media as part of broader child-protection efforts.

According to a 2025 report by UNESCO on children’s well-being in a digital world, 56% of children now view the internet as essential for staying in touch with friends, up from 50% in 2023. However, the report found that 24% of at-risk children reported distressing online experiences, compared with 10% a year earlier, while the share of children who feel safe online fell from 81% to 77%. UNESCO linked intensive screen use to higher levels of anxiety, depression, and reduced concentration among children and adolescents.

UNICEF said it is working with governments to strengthen legal protections and with technology companies to improve digital safety standards for children.

Major platforms such as TikTok, Facebook, and Snapchat set a minimum user age of 13, but child-rights advocates argue these measures are insufficient, noting that many younger children maintain active accounts.

While European countries increasingly favor binding legislation, approaches in the Arab world remain varied, relying largely on awareness campaigns and parental responsibility rather than strict legal frameworks. Experts say a comprehensive response combining regulation, digital literacy, and community engagement is needed to balance the benefits of technology with the protection of children.
