A version of this post was published by AdExchanger.
If there’s a case to be made for Nielsen, it’s that it got every agency and TV advertiser aligned on how to measure a TV campaign. Agencies could not find an angle or advantage by fiddling with measurement. In the good old days, Nielsen ensured there was order and harmony.
As it relates to streaming TV, it’s, err, different. Nielsen got dethroned before it could even re-establish itself in the world of streaming TV. And whilst the new players (e.g. Samba TV, VideoAmp, Comscore) likely provide better measurement, there are too many of them to drive industry consensus and uniformity. The net result: streaming ad providers operate under very few guidelines and even less oversight when it comes to measuring the impact of streaming ad spend.
This is hurting brands. Whilst the dialogue has mostly focused on fraud and brand safety, there’s an even more common cheat, hidden in plain sight. Amongst CTV ad providers, there is tremendous temptation to make the ROI on CTV look better than it really is (and definitely better than linear). After all, when the numbers look good, there’s a higher chance of a happy advertiser.
Our goal in this article is to unveil some of the (questionable) tactics deployed by programmatic TV ad platforms. Since it will feel like magic, there is no better way to explain it than through Christopher Nolan’s movie, The Prestige, and its three acts of magic.
Act I: the pledge for inflating streaming TV performance
It’s actually super easy to overstate the impact of a streaming TV campaign. All that is needed is to lump other channels (e.g. digital display) into the CTV mix. Let’s compare two seemingly similar CTV campaigns (but different providers).
Streaming TV ad platform A allocates 100% of media spend towards streaming ads. Streaming TV ad platform B, on the other hand, allocates 25% of media spend towards display ads across other devices. Keep in mind, display ads are significantly cheaper than streaming ads. Here’s what the campaign metrics might look like:
On the surface, Ad Platform B presents the better picture. After all, for the same budget ($100,000), the advertiser enjoys a lower CPM and, as a result, many more impressions.
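The arithmetic behind that comparison can be sketched in a few lines. The CPM figures below are illustrative assumptions, not numbers from any actual campaign:

```python
def blended_metrics(budget, allocations):
    """Return total impressions and blended CPM for a media mix.

    allocations: list of (share_of_budget, cpm_dollars) pairs.
    """
    impressions = sum(budget * share / cpm * 1000 for share, cpm in allocations)
    blended_cpm = budget / impressions * 1000
    return impressions, blended_cpm

budget = 100_000
# Platform A: 100% of spend on streaming at an assumed $35 CPM
a_impr, a_cpm = blended_metrics(budget, [(1.00, 35.0)])
# Platform B: 75% streaming at $35, 25% cheap display at an assumed $3 CPM
b_impr, b_cpm = blended_metrics(budget, [(0.75, 35.0), (0.25, 3.0)])
# Mixing in display pushes the blended CPM down and impressions up,
# even though the actual streaming delivery shrank by 25%.
```

With these assumed CPMs, Platform B reports roughly 10.5 million impressions at a blended CPM near $9.50, versus 2.9 million at $35 for Platform A, despite delivering less streaming inventory.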
Magic Act II: the turn
By comparing the two campaigns above on a view-through measurement basis, Ad Platform B can also claim much higher performance (e.g. ROAS). Let’s unpack this by first explaining what view-through is.
View-through is a measurement methodology that matches the IP address of an impression with the IP address of a response within a specified time window. Under this methodology, a brand’s streaming TV campaign receives credit whenever an impression was delivered to the responding IP, regardless of whether that user was actually influenced by any other paid media (hint: display).
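As a deliberately simplified sketch, IP-based view-through matching amounts to the rule below. The IP addresses, timestamps, and 30-day lookback window are assumptions for illustration only:

```python
from datetime import datetime, timedelta

def view_through_conversions(impressions, conversions, window_days=30):
    """Credit a conversion to the campaign if *any* impression was served
    to the same IP address within the lookback window (the view-through
    rule). Note there is no check that the ad caused the purchase."""
    window = timedelta(days=window_days)
    return [
        (ip, when) for ip, when in conversions
        if any(imp_ip == ip and timedelta(0) <= when - imp_when <= window
               for imp_ip, imp_when in impressions)
    ]

impressions = [("203.0.113.7", datetime(2024, 3, 1))]    # one ad served
conversions = [
    ("203.0.113.7", datetime(2024, 3, 10)),   # same household: credited
    ("198.51.100.2", datetime(2024, 3, 10)),  # never saw the ad: not credited
]
credited = view_through_conversions(impressions, conversions)
```

The point of the sketch: the rule is a pure IP-and-time match, so any impression reaching a converting household collects the credit, whatever actually drove the sale.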
And this is how the magic cheat happens. On average, a household has more than 20 devices. As cheap display ads get shotgunned across these devices, they capture credit for sales that are most likely being generated by other marketing channels (e.g. search, social, yes ... even word of mouth). Since these display impressions were delivered as if they were CTV impressions, they make the overall “CTV campaign” by Ad Provider B look really, really good. Phrased differently, those households would likely have purchased from the advertiser anyway; serving a barrage of display ads ensures that the attribution gets siphoned off to a (phony) CTV campaign.
This practice is widespread. When advertisers log into ad platforms, they’ll often see options like Audience Extension, Multi-Touch, and Performance Max. While each of these capabilities sounds appealing, they all represent ways in which ad platforms mix different ad placements in the name of driving stronger performance. What they don’t do is drive real business outcomes.
And now, for the Prestige
There’s nothing stopping players in the CTV landscape from conflating streaming numbers in the ways described above. At this stage in the industry’s evolution, providers are largely able to determine how they want to track and present results to advertisers. In this sense, providers have two choices: wow advertisers with the biggest numbers possible and hope no one asks questions, or dig in and break things up in the most granular way possible to help advertisers build long-term success through greater understanding.
The Prestige is incrementality measurement. Unlike view-through, incrementality only credits a marketing channel if it actually caused an action, such as a sale. At Tatari, incrementality measurement for CTV is baked into the platform. And there are other excellent companies whose sole business is to undress the magician (e.g. LiftLab or Measured). Through carefully constructed experiments, they will quickly expose this cheating practice.
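Those experiments typically boil down to comparing exposed households against a randomized holdout that was withheld from the ads. A minimal sketch with hypothetical counts (real designs control for far more factors):

```python
def incremental_lift(exposed_convs, exposed_n, holdout_convs, holdout_n):
    """Estimate incremental conversions and lift from a randomized
    holdout test: exposed households saw the ads, holdout households
    were deliberately withheld from them."""
    exposed_rate = exposed_convs / exposed_n
    baseline_rate = holdout_convs / holdout_n        # what happens anyway
    incremental_rate = exposed_rate - baseline_rate  # credit only the delta
    conversions = incremental_rate * exposed_n
    lift = incremental_rate / baseline_rate if baseline_rate else float("inf")
    return conversions, lift

# Hypothetical outcome: 500 conversions among 100k exposed households,
# 400 among 100k withheld -- only ~100 conversions are truly incremental.
incr_convs, lift = incremental_lift(500, 100_000, 400, 100_000)
```

Under these made-up numbers, view-through would happily credit all 500 conversions to the campaign, while the holdout comparison reveals that roughly 400 of them would have happened anyway.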
If your CTV provider (company B?) reports numbers that seem almost too good to be true, consider any of the above. Our numbers may not look as good, but at least they are not veiled in clever magic.
Streaming performance can easily be manipulated with a bit of dark magic. Don’t get scared and run for the hills. Talk to Tatari about this issue and how to assess real performance.