The UK's data regulator is gathering information about Snapchat's efforts to remove underage users from its platform. The inquiry follows a report that Snap Inc., the company behind Snapchat, had removed only a handful of underage users in the UK, even though estimates put the number of under-13s on the platform in the thousands.
UK law requires parental consent before processing the data of children under 13. Snapchat requires users to be at least 13 but has not disclosed what measures it takes to keep younger children off the platform. The Information Commissioner's Office (ICO) has received complaints and is assessing whether Snap breached data protection rules; if a breach is found, Snap could be fined up to 4% of its annual global turnover.
Other social media platforms have faced similar pressure: TikTok, for example, was fined by the ICO for mishandling children's data. Snapchat blocks sign-ups from users who enter an age under 13, but other platforms take more proactive steps to detect and remove underage accounts.
Why does it matter?
As social media platforms increasingly become spaces for cyberbullying, inappropriate content, and other risks that can have lasting psychological and emotional effects on young people, governments are weighing measures to protect them. Platforms have recently had to navigate a complex landscape of US state laws that demand age verification from users and give parents greater control over children's accounts. The focus is on both keeping users safe and holding social media companies to responsible practices, particularly where children's welfare is concerned.