19-09-2024
LONDON/ NEW YORK: Instagram is overhauling the way it works for teenagers, promising more “built-in protections” for young people and added controls and reassurance for parents.
The new “teen accounts” are being introduced from Tuesday in the UK, US, Canada and Australia.
They will turn many privacy settings on by default for all under-18s, making their content un-viewable to people who don’t follow them and requiring them to actively approve all new followers. Children aged 13 to 15 will only be able to adjust these settings by adding a parent or guardian to their account.
Social media companies are under pressure worldwide to make their platforms safer, with concerns that not enough is being done to shield young people from harmful content.
UK children’s charity the NSPCC said Instagram’s announcement was a “step in the right direction” but it added that account settings can “put the emphasis on children and parents needing to keep themselves safe.”
Rani Govender, the NSPCC’s online child safety policy manager, said the changes “must be backed up by proactive measures that prevent harmful content and sexual abuse from proliferating on Instagram in the first place”.
Meta describes the changes as a “new experience for teens, guided by parents”.
It says they will “better support parents, and give them peace of mind that their teens are safe with the right protections in place.”
Ian Russell, whose daughter Molly viewed content about self-harm and suicide on Instagram before taking her life aged 14, told media it was important to wait and see how the new policy was implemented.
“Whether it works or not we’ll only find out when the measures come into place,” he said.
“Meta is very good at drumming up PR and making these big announcements, but what they also have to be good at is being transparent and sharing how well their measures are working.”
Teen accounts will mostly change the way Instagram works for users between the ages of 13 and 15, with a number of settings turned on by default.
These include strict controls on sensitive content to prevent recommendations of potentially harmful material, and muted notifications overnight.
Accounts will also be set to private rather than public, meaning teenagers will have to actively accept new followers and their content cannot be viewed by people who don’t follow them.
Parents who choose to supervise their child’s account will be able to see who they message and the topics they have said they are interested in, though they will not be able to view the content of messages.
However, media regulator Ofcom raised concerns in April over parents’ willingness to intervene to keep their children safe online.
In a talk last week, senior Meta executive Sir Nick Clegg said: “One of the things we do find… is that even when we build these controls, parents don’t use them.”
Age identification
The system will rely primarily on users being honest about their age, though Instagram already uses tools to verify the age of users suspected of lying.
From January, in the US, it will use artificial intelligence (AI) tools to proactively detect teens using adult accounts and move them back into teen accounts.
The UK’s Online Safety Act, passed last year, requires online platforms to take action to keep children safe or face large fines. (Int’l Monitoring Desk)