Denmark to Ban Social Media for Children Under 15

Denmark is leading the way in Europe with a new agreement to ban social media access for children under 15, aiming to shield young users from the risks of harmful content and commercial exploitation online. This marks a significant escalation of efforts to regulate the tech industry and protect vulnerable populations in an increasingly digital world.

Addressing Concerns and Intensifying Pressure on Tech Platforms

While many tech companies already set age restrictions on their platforms, these are often circumvented, rendering them largely ineffective. The Danish initiative responds to growing concerns about the pervasive influence of social media on children, particularly regarding exposure to violence, self-harm, and manipulative algorithms. Officials and experts believe stronger action is needed to address these issues.

The Scope of the Legislation and Enforcement Challenges

The legislation is not absolute; parents can grant access to their 13- and 14-year-olds following an assessment. However, significant questions remain regarding enforcement. A swift rollout is unlikely, as lawmakers are expected to spend months crafting legislation that minimizes loopholes for tech companies. The Danish government acknowledges the immense pressure exerted by these companies’ business models.

Denmark’s Approach: Leveraging Technology and EU Regulations

Caroline Stage, Denmark’s Minister for Digital Affairs, highlighted the stark reality: 94% of Danish children under 13 and over half of those under 10 already maintain social media profiles, spending considerable time exposed to potentially harmful content.

“The amount of time they spend online — the amount of violence, self-harm that they are exposed to online — is simply too great a risk for our children,” Stage stated.

Denmark intends to leverage its national electronic ID system (nearly all citizens over 13 have one) and is developing an age-verification app. Though Denmark cannot force tech companies to use its app, it plans to enforce age verification through the European Union, which can impose fines of up to 6% of a company's global annual turnover for non-compliance.

Global Trends in Youth Digital Safety

Denmark’s move follows a similar initiative in Australia, where a ban on social media for children under 16 was enacted in December. Platforms like TikTok, Facebook, Snapchat, Reddit, X, and Instagram face fines of up to $33 million for failing to prevent underage users.

Globally, governments are seeking ways to mitigate the negative impacts of online technologies without stifling their benefits. China, for instance, has imposed restrictions on online game and smartphone time for children. France is currently investigating TikTok over allegations of promoting content related to suicide and potentially encouraging vulnerable young people to take their own lives.

The Broader Context of European Regulations

The European Union’s Digital Services Act, which came into effect two years ago, already prohibits children under 13 from creating accounts on platforms like TikTok, Instagram, YouTube, Twitch, Reddit, and Discord, as well as on AI companion services. Major social media platforms have long stated their services are intended for users 13 and older. TikTok and Meta (parent company of Instagram and Facebook) employ age-verification methods, including selfie analysis and AI-powered systems.

Despite these efforts, Stage believes stronger action is needed. “We’ve given the tech giants so many chances to stand up and to do something about what is happening on their platforms. They haven’t done it,” Stage said. “So now we will take over the steering wheel and make sure that our children’s futures are safe.”

The push to protect children online reflects a growing awareness of the risks and a determination to hold tech companies accountable for creating safer digital environments. Denmark’s bold move signals a new era of youth digital safety, potentially influencing policies worldwide.