Social media companies need to be ready to ramp up safeguards for children - Euan Duncan

Euan Duncan says social media platforms hosting children need to reassure parents by improving safety features

Excessive social media use has long been a worry for parents, but concerns have become even more urgent following Ofcom’s recent finding that “many children are spending six, seven, eight hours a day on social media, often more”.

As the media landscape evolves, with AI and tools like ChatGPT leading the way, we are witnessing an exciting time for digital exploration. However, this shift raises a crucial question: what steps are social media giants like Instagram and TikTok taking to protect children online?


In a recent update, Meta addressed these online anxieties by introducing 'Teen Accounts' for Instagram users under 18, featuring enhanced safety tools like default private settings and parental controls, including daily time limits.

Social media businesses face an ongoing challenge: how to balance the drive to maximise screen time for profit while ensuring young users’ safety (Picture: stock.adobe.com)

These changes, aimed at protecting children from online risks, will roll out in the UK, US, Australia and Canada, with plans to expand to the EU later this year. While the measures primarily target users aged 15 and under, 16- and 17-year-olds can opt out of these features without parental consent.

Meta’s moves have been welcomed, but many argue the changes haven’t gone far enough. Growing scepticism around self-regulation by social media companies led to the UK's Online Safety Act 2023, which focuses on protecting children from harmful content. Set to take effect in early 2025, the Act will clarify platforms' responsibilities, with Ofcom issuing relevant codes of practice.

Meta’s announcement followed Australia’s plans to introduce age limits for social media use, likely between 14 and 16. While Meta denies any link, it appears platforms are bracing for global regulations focused on child protection.


UK Technology Secretary Peter Kyle is closely monitoring Australia, stressing the importance of enforceable age bans. This comes as age verification technology faces scrutiny over potential workarounds, such as VPNs, and data protection concerns related to collecting children’s ID. Another proposal, the Safer Phones Bill, seeks to raise the “internet adulthood” age from 13 to 16, requiring parental consent for younger users. This could reduce data collection and limit exposure to addictive content.


It’s important to recognise that government action on child safety online extends beyond legislation. In October, 14 US states sued TikTok, alleging it fosters addictive behaviour in children, potentially leading to serious psychological and physiological harms, such as anxiety, depression, and body dysmorphia.

This underscores the ongoing challenge for these businesses: balancing the drive to maximise screen time for profit against the need to keep young users safe. Meta’s introduction of parental controls, including screen time limits, can be viewed as a response to increasing legal pressure.

In the same month, a UK school became the first to ban smartphones during the school day, and Meta introduced its Teen Accounts. Both the UK and Australian governments are closely monitoring these developments, with Australia planning to introduce age restrictions for younger teens.


Technology companies must recognise the complex challenge of safeguarding children online. The public demands stronger protections, and Meta’s Teen Accounts are part of its mission to empower parents and protect young users. Platforms hosting child users should reassure parents by improving safety features.

However, companies should also expect tighter regulation as governments face pressure to act. Ofcom’s Chief Executive has said that, under the Online Safety Act 2023, social media platforms are responsible for keeping children safe, and that the regulator is ready to intervene.

If the UK follows Australia’s lead on age restrictions, social media platforms will need to ensure compliance with age verification regulations.

With so much attention focused on this issue, one thing is clear: social media companies must be ready to improve protections for children.

Euan Duncan is a Partner, MFMac
