This article first appeared in AdNews in print.
As humans, we like to think we're in control; that somehow we're in charge of where we go, what we see, what we do and who we meet. This is complete bollocks, because our lives are now run by something called ‘the algorithm’.
It was designed to serve up appropriate advertising and reduce marketing wastage, but as Moore's Law doubles computing power every two years and AI approaches self-awareness, its ability to process raw data and predict our behaviour could now mean it will anoint the next US president. Take my hand as we go down the rabbit hole, because this shit's getting real.
Let's go back five years. The King's Speech has won Best Film and Borders has filed for bankruptcy in the US. Like most great bookshops, it was a celebration of the written word in its myriad forms. Its vast range meant your purchases were often beautifully random. But that doesn't help the marketing machine, so it had to be replaced by something more quantifiable. Amazon soon taught us that browsing was a waste of our precious time and that choices should be based on what we liked yesterday and the day before, not on a connection of synapses, memory and emotion.
Facebook holds well over 147 markers that make you, you. This digital DNA is constantly growing. It knows the bands you like, the politics you hate. It reads your messages. And if you've ever wondered why your suggested friends seem to be the strangers you met at that function, it's because our algorithms are talking to each other.
Even friends are categorised and prioritised until some just fade away.
When you take away the dystopian horror of this, what it does is kind of cool. We go to work with the algorithm’s take on what we want to hear on Spotify. Our news feeds have stories that just seem, well, interesting. At night the algorithm serves up quality entertainment on Netflix. We don't have to search. We won't be bothered by movies we might not want to see or by opinions we may not agree with.
We live in a comfortable world of our own making, fed constant entertainment and news mathematically designed to mirror our own world view, no matter how extreme that might be. Psychologists would describe this as a potentially dangerous mix of confirmation bias and positive reinforcement, where one’s opinions are constantly validated.
In our hunger to reduce marketing wastage and serve up quality targeted advertising, we've created personalised digital bubbles in which alternative viewpoints are unwelcome. The algorithm is now our teacher, our friend, our bodyguard.
The implications of this are unknown, but one example is the seemingly impossible rise of Donald Trump. Is he a true reflection of the dark underbelly of US society, or is it just that his supporters can't see an opposing world view? If you're fed the constant and unrelenting message that America is broken, maybe that's why you feel the need to make it great again.
Do tech companies have a responsibility to show all sides of a political debate, or will it take a xenophobic madman becoming commander in chief of the world's most powerful military before it becomes a necessity? Trump hasn't ruled out tactical nuclear weapons, but if he does smash the button, the algorithm will be fine, buried deep underground in servers capable of surviving a direct hit.
Andy Flemming is creative director at M&C Saatchi.