The proposed South Australian bill to implement a social media ban for children under 14 will cause unintended confusion, according to digital platform Meta.
And such a move would be "challenging to operationalise for both industry and the proposed regulator" and would be a burden to South Australian parents and young people.
The proposal by the South Australian government laid the groundwork for the federal government to also introduce legislation enforcing a minimum age for access to social media.
In a response posted online overnight, the technology company said it shared the intent and goals of the South Australian government in providing a safe and age-appropriate online experience for teens.
However, it believes the bill can be amended to best achieve its goals and be a "world-leading example of age assurance legislation".
Meta said understanding a user’s real age is key, but because the bill adopts an "app by app approach" to age verification, every provider will have to work out how to comply. This places a burden on parents and young people, who must prove their age and parental relationship with each of the dozens of apps they use.
"Teens move fluidly from one app or service to the next, and any regulation in this space needs to reflect how parents and teens actually use apps. It also needs to apply to the ever-evolving nature of these technologies, so that companies can implement requirements for the long term," the company said.
"By placing the obligation to verify age on each individual app, this increases the time required for South Australian families to follow the age verification processes for each app and also potentially increases the privacy risk to them, because they will need to share some form of personally identifiable information with each app to confirm the age of the young person and the parental relationship on an app by app basis.
"A more effective and simple approach for the Bill would be to adopt a ‘whole-of-ecosystem’ approach that requires app store/OS level age verification and app stores to get a parent’s approval before their child downloads an app, allowing parents to oversee and approve their teen’s online activity in one place."
Meta said that teens and parents already provide companies like Apple and Google with this information and these companies have already built systems for parental notification, review, and approval into their app stores.
"Legislation should provide an overarching framework to this existing practice, require app stores to verify age and then provide apps and developers with this information, which can then be used by app providers as part of their individual age assurance tools," the company said.
Meta pointed to its investment in User Age Group APIs in the Meta Quest Store, which are designed to help developers understand how old their users are, as an example of how this can be achieved in a privacy-preserving way.
When someone launches an app on the Meta Quest platform, these APIs allow Meta to share whether the app is used by a preteen, teen or adult account. The app is then able to use this information to tailor a more age-appropriate experience and to properly protect young people’s data.
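The pattern Meta describes can be sketched in a few lines: the platform verifies age once and shares only a coarse age group with each app, which then tailors its own experience. The sketch below is purely illustrative — the function and account names are hypothetical, not the real Meta Quest User Age Group API.

```python
# Hypothetical sketch of an app-store/OS-level age signal. The platform
# shares an age *group* (not a birth date), and the app uses it to set
# age-appropriate defaults. All names here are illustrative assumptions.

from enum import Enum


class AgeGroup(Enum):
    PRETEEN = "preteen"
    TEEN = "teen"
    ADULT = "adult"


def platform_age_group(account_id: str) -> AgeGroup:
    """Stand-in for a platform call returning the age group verified
    once at the app-store level (not a real API)."""
    verified = {"acct-1": AgeGroup.TEEN, "acct-2": AgeGroup.ADULT}
    # Unknown accounts default to the most protective setting.
    return verified.get(account_id, AgeGroup.PRETEEN)


def configure_experience(account_id: str) -> dict:
    """Tailor app defaults from the shared age group, so the app never
    needs to collect personally identifiable age data itself."""
    group = platform_age_group(account_id)
    return {
        "age_group": group.value,
        "dm_from_strangers": group is AgeGroup.ADULT,
        "personalised_ads": group is AgeGroup.ADULT,
    }


print(configure_experience("acct-1"))
```

The key privacy property is data minimisation: the app receives only one of three coarse labels, while the verification itself happens once, at the store or OS level.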
"An app store/OS-level solution would not exempt Meta and other app providers from implementing their own age assurance tools. Rather, it would be an important complement to these efforts, which recognises the technical limitations of age assurance technology, the practical realities of how young people and parents use apps, and preserves privacy by minimising data collection," Meta said.
"We want to be clear that we are not recommending this approach in order to divest Meta of our responsibility to ensure safe and age appropriate experiences for teens across our services - a narrative that has gained momentum in some circles but is very much misguided. We make this recommendation based on our long experience in building online safety into our products and services."
Meta recently rolled out Teen Accounts for Instagram, with built-in protections that limit who can contact teenagers and the content they see.
The NSW government is hosting sessions today, and the government of South Australia will host sessions tomorrow, to address the increasing harm online platforms are causing to children and young people.