Parliamentary inquiry finds social media age ban is not the answer

By Makayla Muscat | 19 November 2024
 


An age ban will not make social media platforms safer without a multi-pronged approach to tackling toxic tech, according to a federal parliamentary inquiry.

The final report from the Joint Select Committee on Social Media and Australian Society recommends a Digital Affairs Ministry with responsibility for the coordination of regulation to address the challenges and risks presented by digital platforms. 

The committee also recommended a statutory duty of care for digital platforms, support for education and digital competency, stronger protections for personal information, independent research, data gathering and reporting, and giving users more control over what they see on social media.

Inquiry chair Sharon Claydon said the committee examined how Meta's decision not to renew deals under the News Media Bargaining Code could influence the provision and consumption of public interest journalism in Australia. 

“The rise in the use of social media across the world has forever changed the way we communicate with each other, socialise, and gain access to news and information,” she said.

“In Australia alone, approximately 81% of the total population were active users of social media in 2023… and these companies are doing all they can to stop users from logging off.

“This report addresses both the need for immediate action, and the need for a sustained digital reform agenda.”

The committee heard evidence from a range of experts, including researchers, academics, mental health professionals, community organisations, advocacy groups and young people.

Research by the University of Sydney involving 1200 young people found that more than 75% had used YouTube or Instagram and nearly 70% had used TikTok or Snapchat. 

“The typical age at which young people began using social media was late primary school, either with or without their parents' permission,” according to the final report. 

“This aligned with evidence from Orygen and headspace National Youth Mental Health Foundation, which stated that young people aged 12-13 years use an average of three social media platforms, while those aged 14-17 years use four or five platforms.

“Evidence provided to the committee suggested that social media use also varies by gender, with young women aged 14–24 years spending more time on social media than young men of the same age.” 

While Baby Boomers and Gen Xers believe social media negatively impacts young people's mental health, Gen Z and Millennials report benefits such as feeling more connected to their peers. 

Parents told the committee of their regrets over not having placed more restrictive controls on their children's social media use. 

One mother said her “tech savvy” son was able to circumvent the controls his parents had in place. 

Meanwhile, Ali Halkic, who lost his son to suicide, urged policymakers to help address the damage done by social media and allow children to be children again. 

“In 2018, 450 young kids took their life. That's a school that disappears out of this country every year due to suicide,” he said. 

“How can we expect children under the age of 16 to cope with presence, status, structure? It is not necessary for them to do that. 

“We have taken away so much of their youth. We're so busy in our lives that we put iPads in front of them and we buy them phones. 

“I have the guilt and shame that I contributed to my own son's death. I paid for his phone. I provided the internet. I gave him that computer. I had no idea how dangerous this is.” 

According to Elizabeth O'Shea from Digital Rights Watch, platforms’ ability to “micro target” content presents dangers for many Australians, with “excessive data collection” enabling targeting by predatory industries such as “gambling, alcohol, diet and junk food products, essentially allowing for the exploitation of vulnerability for profit”. 

Other witnesses told the committee that social media can exacerbate the intensity of bullying and other forms of targeting, which can leave children feeling overwhelmed and embarrassed. 

The federal government is currently trialling a number of age assurance technologies to assess their effectiveness and their performance in protecting users' data. 

The eSafety Commissioner told the committee that it is “difficult but not insurmountable” for platforms to verify the age of their users. 

Meta's global head of safety Antigone Davis said identifying a user's age helps tailor age-appropriate experiences and determine appropriate privacy defaults and content restrictions. 

Meanwhile, Snapchat said its platform is not intended for people under the age of 13 and that users under 18 are subject to more restrictive content and privacy settings.

However, it remains unclear how TikTok enforces its age restrictions, beyond a specialist team within its moderation and safety network suspending accounts suspected of belonging to users under 13. 

Inquiry deputy chair Sarah Hanson-Young said the recommendations are intended to empower and educate young people, not punish them. 

She said privacy reforms are also long overdue. 

“If the government wants to protect the safety of young people, they must ban platforms harvesting young people's data and targeting them with toxic algorithms and advertising to make massive profits,” Hanson-Young said. 

“All users must have the ability to switch off or turn down the algorithms that push unwanted content into their feed. 

“Recent research found that platforms like Facebook identify people who are at risk of harm from alcohol and gambling, and target them with advertising, while alcohol and gambling companies share data on those who are at risk to fuel targeted advertising and increase their profits.

“This is a toxic, predatory system, which vulnerable people have no way of escaping without better regulation.”

Hanson-Young said Australia is lagging behind other countries when it comes to regulating offshore digital giants.

“Comprehensive reform will be needed to bring global giants under Australian jurisdiction and enforce their responsibility to make their platforms safe,” she said. 

“While this is a complex problem to resolve, we've seen effective laws in the EU and the United Kingdom that not only make platforms safer for young people, but for all of us. 

“This includes not only regulating tech giants but implementing a tax to ensure these big corporations are reinvesting the money they make off Australians back into our communities.”
