Nine Entertainment takes AI on ‘trust’

By AdNews | 18 July 2024
Nine Entertainment has told staff the use of AI in its business is about trust and responsibility to audiences.

"Like all businesses we’re deeply engaged with the rapid development of artificial intelligence," the media group said in a staff memo.

"There are many ways AI can help make our business more efficient, and our content easier to produce and distribute, but we recognise AI also presents risks.

"In particular, the journalism we produce across publishing, television, radio and digital platforms is required to be accurate and fair.

"It has always, and will continue to be, the role of our people to take responsibility for that accuracy and fairness.

"As we work through the myriad ways AI can be integrated into our operations, all our decisions will be guided by our values. The technology may be changing, at great speed, but our values remain the same.

"Audiences trust us because they trust our people. These principles will continue to evolve as the technology and business models associated with it continue to evolve." 

Nine's principles for AI use:

  1. We start and end with humans. Our people take responsibility for their work, including the journalism and content we produce.
  2. Our checks. AI models can be biased and they can hallucinate (make things up). Acknowledging this, we critically examine AI-generated output and automated decision-making for accuracy and fairness.
  3. Our transparency. We are transparent with consumers about the use of data for AI, and provide reasonable declarations when AI has been used to reformat content.
  4. Our content, data & models are ours. We build, train and tune models in a closed Nine environment, ensuring the models and data are protected, secured and confidential. Where this involves use of third party platforms, we will ensure their environments provide the requisite protection. We may in the future enter into commercial agreements with Large Language Model platform owners and other software vendors to licence the use of our content.
  5. Our processes. When we are developing or implementing an AI tool or model for internal use, the people who will use it will be involved in the testing, training and trialling of the technology.
