More government regulation is needed to protect Australians from online harm, says a new report from the federal parliament's Select Committee on Social Media and Online Safety.
The committee today released its report into online harms on social media platforms, finding that the safety of people online is being threatened by individuals who engage in harmful behaviour.
The harms experienced by victims of online abuse leave “a long trail of trauma and suffering”, according to witnesses giving evidence to the committee.
The report calls for a three-part response: social media platforms must focus on user safety and enforce their own policies; the government must regulate and monitor the sector; and users must understand that while respectful dissent and disagreement are part of online discourse, abuse is not and should not be tolerated.
“For too long social media platforms have been able to ‘set the rules’, enabling the proliferation of online abuse,” says committee chair Liberal MP Lucy Wicks.
“The balance of responsibility for the safety of users online, which until recently has been primarily on users, must be ‘flipped’ to ensure that social media platforms bear more of the burden of providing safety for their users,” she says.
“To protect Australians, social media companies have to take responsibility to enforce their terms of service, prevent recidivism of bad actors, prevent pile-ons or volumetric attacks, prevent harms across multiple platforms and be more transparent about their use of algorithms.”
The inquiry also looked at how to address individuals’ actions and behaviour online, building on the eSafety Commissioner’s existing education programs and government awareness campaigns to give Australians more information about how to engage safely in online discourse.
Recommendations:
- The establishment of a Digital Safety Review to examine all online safety legislation and government programs, with a view to simplifying regulations into a single framework and making recommendations to the Australian Government on potential proposals for mandating platform transparency
- Requesting that the eSafety Commissioner examine the extent to which social media companies enforce their policies for users experiencing harm, and requiring those companies to report to the government on how they are reducing harm caused by their algorithms
- Addressing technology-facilitated abuse in the context of family and domestic violence, including the recommendation of significant additional Australian Government funding for support services
- Mandating that all social media companies set the highest privacy settings as the default for users under the age of 18
- Increasing the reach of educational programs aimed at both adults and young people regarding online harms, with a focus on the eSafety Commissioner’s powers to remove harmful content and the mechanisms through which victims can report harmful content and online abuse.
The committee held 11 public hearings over more than three months, and received more than 100 submissions from individuals, organisations and government bodies.