The European Union has called on major digital platforms like Snapchat and YouTube to clarify the measures they have in place to safeguard children from online risks, with nearly all member states signaling willingness to consider restricting social media access for minors.
The EU has established strict regulations governing the digital realm, specifically concerning children’s exposure online. However, there is a growing consensus that more action is necessary.
In light of Australia’s decision to ban social media for individuals under 16, the EU is contemplating the implementation of region-wide restrictions on minors’ access to online platforms. On Friday, 25 out of 27 EU countries expressed support for exploring such measures.
The primary tool at the EU’s disposal to ensure platforms address illegal content and protect children online is the Digital Services Act. This legislation has drawn criticism from the US tech industry, with President Donald Trump even threatening retaliation.
As part of its enforcement efforts under the DSA, the European Commission has reached out to Snapchat for information on how it prevents children under 13 from accessing the platform. Similarly, Apple’s App Store and Google Play have been asked to disclose what they do to prevent minors from downloading inappropriate or harmful applications, such as those related to gambling or explicit content.
Specifically, the EU is seeking details on how Apple and Google prevent minors from accessing apps that facilitate the creation of non-consensual sexual content, known as “nudify apps,” and how they enforce age restrictions on apps.
Henna Virkkunen, the EU tech chief, emphasized the importance of ensuring privacy, security, and safety online, highlighting the need for stricter enforcement of regulations.
While the information requests do not imply wrongdoing or punitive action, they can lead to further investigations and potential fines. Snapchat has also been asked about its efforts to prevent drug and vape sales on its platform; in response, the company affirmed its commitment to safety and pledged to cooperate.
YouTube, owned by Google’s parent company Alphabet, has been asked to disclose details about its recommendation system following concerns that harmful content is reaching minors. Google pointed to its existing parental controls and safety measures for younger users and pledged to strengthen its safeguards.
Additionally, the EU is examining Meta’s Facebook and Instagram, as well as TikTok, for their efforts in addressing the addictive nature of their platforms for children.
In a parallel initiative focused on child protection, EU telecoms ministers discussed strategies for age verification on social media and enhancing online safety for minors. European Commission President Ursula von der Leyen has shown personal support for these measures, leading to the formation of an expert panel to evaluate potential actions at the EU level.
A majority of EU countries, along with Norway and Iceland, endorsed von der Leyen’s proposal to explore a unified digital age threshold and emphasized the urgent need to protect minors online. Belgium and Estonia refrained from signing the declaration, with Belgium expressing commitment to child protection while remaining open to various approaches. Estonia emphasized digital education and critical thinking over access restrictions.
Denmark is preparing to implement a social media ban for individuals under 15, a move France is also pursuing.
