For an area that most marketers know little about, big data is receiving a nauseating amount of attention. The marketing opportunity from effective use of data is substantial, but incompetent analytics work can lead to poor decisions and severely reduce marketing effectiveness. It’s time to move on if your data analytics company does any of the following:
1. Promises to sharply increase business returns by shifting money between media channels
Channel mix optimisation should always be done at the level of audiences, not expenditure. Whilst this may imply that budget needs to be shifted between channels, it is fundamentally different from optimising the spend mix. The flaw with optimising spend is that the price of media properties fluctuates constantly with supply and demand, and is inextricably linked to lead times. Moving money between media channels or vendors therefore almost never translates into a commensurate shift in the weight of media activity (and the audiences reached), as the simple sketch below illustrates. Spend optimisation is a bit like throwing a dart at a moving target blindfolded and hoping it hits the bullseye.
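To make this concrete, here is a minimal Python sketch using invented CPM and budget figures (not real campaign data). It shows how a spend-level reallocation, planned against last quarter’s prices, can deliver far less audience weight than expected once channel prices move:

```python
# Rough sketch with hypothetical numbers: shifting spend between channels
# does not shift audience weight proportionally once media prices move.

def impressions(spend: float, cpm: float) -> float:
    """Impressions delivered for a given spend at a given CPM (cost per 1,000)."""
    return spend / cpm * 1000

# Plan built on last quarter's prices (invented CPMs, in dollars).
planned_cpm = {"tv": 25.0, "online_video": 12.0}
budget = {"tv": 400_000, "online_video": 100_000}

# A spend-level "optimisation" moves $150k from TV into online video.
reallocated = {"tv": 250_000, "online_video": 250_000}

# By the time the plan runs, demand has pushed online video CPMs up 40%
# and softened TV CPMs slightly (again, invented figures).
actual_cpm = {"tv": 23.0, "online_video": 16.8}

for plan, label in [(budget, "original"), (reallocated, "reallocated")]:
    expected = sum(impressions(plan[ch], planned_cpm[ch]) for ch in plan)
    delivered = sum(impressions(plan[ch], actual_cpm[ch]) for ch in plan)
    print(f"{label:12s} expected {expected:,.0f} impressions, "
          f"delivered {delivered:,.0f} ({delivered / expected - 1:+.1%})")
```

In this made-up scenario the reallocated plan under-delivers by far more than the original, because the money moved into the channel whose price rose. Optimising against audiences and delivery, rather than spend, avoids this trap.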
2. Delivers results that defy common sense
Amazingly, this happens! One of the most ridiculous examples I am familiar with involved an analytics company that advised an airline to spend 90% of its TV budget in regional markets because ROI was allegedly better there, even though the airline served only capital cities. Fortunately, common sense prevailed and the budget recommendation was rejected.
Another example involved a supposedly optimal monthly flighting schedule for a sports retailer that did not prioritise any marketing activity in the lead-up to a major quadrennial sporting event. The analytics company was given the boot shortly thereafter.
3. Blinds with science
Analysts tend to hide behind science when their real-world understanding is poor and they are unable to defend results in open discussion. Even when sophisticated data mining techniques are used to quantify a useful relationship, there is almost always a way to visualise the opportunity so that non-technical people can understand and interrogate it. It’s only through interrogation that root causes are identified and genuine insights unearthed. This trap tends to go hand in hand with over-reliance on a single data source (most commonly sales response rates), rather than a broad set of metrics that show the impact of marketing throughout the consumer journey. For example, it’s better to discover that marketing activity drives online search interest and site visits but not sales, and then make targeted business adjustments to improve conversion, than to walk away entirely, like a blinkered horse, on the strength of a low ROI figure. The short sketch below shows one simple way to take that broader view.
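As a minimal illustration (with made-up funnel figures, not client data), the sketch below compares stage-to-stage conversion across the consumer journey, so the weak link is visible instead of being hidden inside a single sales-ROI number:

```python
# Minimal sketch with hypothetical funnel figures: compare stage-to-stage
# conversion rather than judging a campaign on final-sale ROI alone.

funnel = {
    "branded_search": 120_000,  # searches during the campaign period (invented)
    "site_visits":     48_000,
    "add_to_cart":      1_400,
    "purchases":          490,
}

stages = list(funnel.items())
for (name_a, count_a), (name_b, count_b) in zip(stages, stages[1:]):
    rate = count_b / count_a
    print(f"{name_a:>15s} -> {name_b:<12s} {rate:6.1%}")

# The step with the sharpest drop-off is where to focus the fix (here, the
# visit-to-cart step), rather than writing the media activity off as low-ROI.
```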
4. Makes it difficult to sense check anything
If it’s difficult to validate or sense check results, it usually reflects an attempt by the analyst or analytics company to avoid being held accountable. This is typically done by reporting indexed figures for response rates and leaving out anything that can be challenged or scrutinised in a non-technical manner. A further giveaway is the delivery of results in an entirely matter-of-fact style, such as ‘radio works better than press’, without proper interrogation of root causes. Marketing effectiveness is shaped by many interconnected factors, and it’s almost never the case that a simplistic result such as ‘use radio over press’ will hold true across all possible configurations of scheduling, creative messaging, pricing, competitive context, etc.
5. Exhibits a poor understanding of real-world marketing, media and the digital landscape
Financial analysts usually focus on a specific sector of the economy because good equity analysis requires an in-depth understanding of the industry being scrutinised. Similarly, smart people who are well trained in data analytics still need a solid understanding of marketing to deliver effective analytics solutions in this area. It is usually easy to spot analysts or researchers with a poor real-world understanding of the communications landscape, because they tend to hide behind science and deliver results that defy common sense!
If your marketing analytics company is waving any of these red flags, you should ask: “What is the ROI from our investment in marketing analytics?” If the only evidence of a positive impact is that your analytics company claims to be improving returns, it’s time to look elsewhere.
Rob Pardini
APAC Head of Data Planning & Analytics
Maxus