Meta protects Women's World Cup players from online abuse

By Ruby Derrick | 26 July 2023
 
Credit: Meta Newsroom.

Meta has launched new policies and features across its platforms to protect women's FIFA World Cup players from harmful online content.  

The tech company has introduced new policies and safety features on its social networking platforms to protect women from abuse, while encouraging fans to show kindness and support when interacting with players on its apps. 

Meta’s Newsroom report ‘Protecting Footballers and Fans on Our Apps During the FIFA World Cup’, published in November last year, has just been updated for the FIFA Women’s World Cup as fans from around the globe connect with the tournament and its players. 

The post states: "Our goal is for everyone — including women footballers — to feel confident and in control when they open their direct messages."

To that end, Meta has begun testing new features on Instagram to protect women against unwanted images and videos in DMs.  

Under the change, someone who doesn’t follow a user must first send an invite and gain permission to connect before they can message them, and they can’t send anything further until the recipient accepts the invitation.  

The report also outlines the 'Hidden Words' feature, which automatically sends DM requests, including story replies, containing offensive language, phrases and emojis to a hidden folder so players don’t have to see them. It also hides comments with these terms under people’s posts.

Since Hidden Words launched last year, more than one in five people with over 10,000 followers have turned it on, according to the report.  

Meta is testing turning this feature on by default for people with creator accounts, which includes many footballers playing in the World Cup. 

Limits, another new feature from Meta, hides comments and DM requests from people who aren’t followers of creators and players, or who have only just begun following them.  

The feature is particularly useful for players who receive a sudden surge of comments and messages after a match, for instance.  

Meta’s research shows that most negativity towards these profiles comes from non-followers or recent followers. When Instagram detects that someone may be experiencing a rush of comments or DM requests, it will prompt them to turn on Limits, according to the report.  

Alongside encouraging more supportive behaviour, preventing abusive content is a high priority for Meta.  

Meta already uses AI to detect when someone is trying to post a potentially offensive comment and warns them that it would breach Meta’s guidelines.  

The Newsroom report states that in a given week, people edit or delete their comment 50% of the time after seeing these warnings.

Recently introduced nudges encourage people to pause and rethink before replying to a potentially offensive comment. These nudges are now live for people whose apps are set to English, Portuguese, Spanish, French, Chinese or Arabic. 

Meta stated that it regularly speaks to football players, teams and associations around the world - including FIFA - to make sure they know about its latest safety policies and features, while listening to their feedback.  

It is working closely with teams competing in the World Cup to help players turn on these safety tools, and is reminding footballers in the tournament to check them via a prompt at the top of their Instagram feed. 

Have something to say on this? Share your views in the comments section below. Or if you have a news story or tip-off, drop us a line at adnews@yaffa.com.au

Sign up to the AdNews newsletter, like us on Facebook or follow us on Twitter for breaking stories and campaigns throughout the day.