EU targets TikTok and Instagram over addictive design harming children
The EU is targeting TikTok and Instagram's 'addictive design' features under the Digital Services Act to protect children's mental health
The European Union is targeting social media giants over 'addictive designs' believed to harm children's mental health. This new child protection initiative leverages the Digital Services Act (DSA) to hold platforms accountable for risks to minors.
TikTok is under formal investigation by the European Commission, with President Ursula von der Leyen calling out its 'addictive design'. The probe focuses on features like endless scrolling and autoplay.
Preliminary findings suggest TikTok's design may breach the DSA by failing to mitigate risks to users' mental and physical well-being.
Meta's popular platforms are also under scrutiny. The EU is investigating whether Instagram and Facebook are doing enough to prevent underage users from accessing their services. Von der Leyen stated the platforms are "failing to enforce their own minimum age of 13," a key part of online child protection.
The EU's focus also extends to recommendation algorithms that send young people down 'rabbit holes' of harmful content. The investigation treats this as a risk to minors' mental health, citing videos that promote eating disorders or self-harm.
To combat underage access, the EU is introducing its own age verification app linked to member states' digital wallets, intended to give platforms a robust way to enforce age limits.
"No more excuses - the technology for age-verification is available," the EU chief declared, putting the onus on tech companies.
This push reflects a global trend, with countries like Australia and France setting social media age limits. Brussels is also planning a future Digital Fairness Act to further target harmful 'attention capture' techniques used by social media to keep users engaged.