Meta has rolled out new safeguards on Instagram and expanded its Teen Accounts to Facebook and Messenger.
Under-16s will be barred from using Instagram’s Live feature unless they have parental permission.
They will also require parental permission to turn off a feature that blurs images containing suspected nudity in their direct messages.
Tara Hopkins, global director of public policy at Instagram, said these safeguards were developed in response to feedback from parents and academics.
Facebook and Messenger Teen Accounts will be rolled out initially in the US, UK, Australia and Canada.
The social media giant said these protections, which include screen time nudges, message limits and content controls, will create a safer, more age-appropriate experience for teens and bring parents greater peace of mind.
Meta said Teen Accounts are now used by more than 54 million teens globally.
According to preliminary results, 97% of teens aged 13-15 have kept the default protections in place.
A recent survey, conducted by Ipsos, also found that 90% of parents believe the protections are beneficial and make it easier for their teens to have positive experiences on Instagram.
A year ago, Meta rolled out Teen Accounts for Instagram, with built-in protections that limit who can contact teenagers and the content they see.
Users under 18 were automatically placed into Teen Accounts, and users under 16 needed a parent’s permission to change any of the settings.