eSafety has sent a series of questions to Google’s YouTube, Meta’s Facebook and Instagram, TikTok, Snap, Reddit, Discord and Twitch, using expanded transparency powers under the Government’s recently updated Basic Online Safety Expectations Determination.
The recently amended determination broadens the areas eSafety can ask industry to report back on and establishes expectations that companies will take steps to keep users safe online, including being transparent when asked questions by the regulator.
The eight companies will have 30 days to provide responses to the eSafety Commissioner and appropriate findings will be summarised in a public report.
eSafety Commissioner Julie Inman Grant said the regulator knows that when it comes to keeping children safe online, a multi-pronged approach is needed.
“Imposing age limits is on the table but we also need better information to understand what will be effective, what the unintended consequences could be and we must absolutely support children in building their digital resilience and critical reasoning skills,” she said.
“We are having a really important conversation in this country right now about the potentially damaging effects social media might be having on our children. Our research shows that almost two-thirds of 14-17 year-olds have viewed potentially harmful content in the past year, including drug use, self-harm and violent images. But we also know that teens get many benefits from social media.”
Inman Grant said a key aspect of that conversation is having solid data on just how many children are on these platforms today, with the range of their ages a key focus of the requests. eSafety also wants to assess age assurance readiness: how these platforms accurately determine age to prevent children under the permitted age from gaining access, and how they ensure appropriate protections for those who are allowed on the services.
“Most of these platforms have their own age limits in place, commonly 13, but we also want to know how they are detecting and removing underage users and how effective this age enforcement is,” she said.
“eSafety research also shows that almost a quarter of 8-10 year olds said they used social media weekly or more often, while close to half of 11-13 year olds said they used social media at the same rate.
“To ensure the safety of young Australians, we need to provide them – and their parents, carers and educators – with effective education and prevention strategies.”