Havas Edge

[Image: a tablet with a "how did you hear about us?" survey on the screen]

Marketing Attribution Challenges by Channel (and a Solution)

The ability to optimize campaigns relies heavily on knowing how your media is performing. This, of course, relies on proper attribution. In this article, we’ll address common marketing attribution challenges experienced on different channels, as well as a tried-and-true solution to capturing more accurate feedback.

The Challenge With Attribution

Despite direct response advertisements often having clear and unique calls to action (CTAs), it can be challenging to measure the impact of your ads accurately and comprehensively.

Adherence to the CTA tracking mechanism (visiting a vanity URL, calling a specific phone number, using a specific promo code, etc.) is considered a direct signal and is usually easy to tie back to the originating media. However, there is usually a large subset of consumers for whom demand or interest was generated by the ad but who engage with the brand through other methods, circumventing the CTA and making attribution harder to track.

Activity that is not captured through direct tracking is referred to as attribution breakage. Without accounting for attribution breakage (which can be a significant proportion of activity), an advertiser won't have an accurate or complete picture of what the media is driving and how best to optimize, especially across different channels.

Therefore, attribution breakage should be accounted for to effectively optimize your campaigns.

A Background on Direct Attribution

Before exploring ways to address marketing attribution challenges, it's important to level-set on terminology and understand the different components of attribution.

Direct Tracking: Measuring demand through adherence to a specific call to action (CTA).

Within each ad or influencer integration should be a call to action. A CTA is the prompt for immediate response and encourages viewers to act in a specific way, such as visiting a website.

To help tie the response action back to the media exposure, vanity URLs should be communicated within the CTA.

Vanity URLs: Custom web addresses that individuals or businesses create to direct users to specific pages on their website.

These URLs should be specific to the media they run on; more granular tracking allows for better optimization.

Although direct tracking does not capture all the activity that is generated by our ad campaigns, it’s very helpful to see how individual placements or tactics are performing compared to one another. It allows for more granular performance breakouts that are often not available through indirect methods.

Examples of Unique Direct Tracking

There are several factors that impact the strength of a direct signal, but the biggest driver is typically a promotional offer or something to incentivize a viewer to follow the CTA. Without a compelling offer, advertisers must more heavily rely on indirect signals to gauge channel performance.

Indirect Tracking: Measuring overall demand generated from the ad campaign.

As mentioned before, direct tracking will not capture all the activity generated by ad campaigns, so indirect methods can be used to supplement direct tracking and help gauge overall demand and activity.

Probably the most common indirect tracking method is something called lift over baseline. This means observing the baseline activity levels that existed prior to the media campaign(s) launching, and then when the campaigns launch, attributing any incremental activity to the campaigns.

Often, control or holdout markets (with no exposure to the media) are put in place to account for any organic fluctuations within the business (either increases or decreases in activity that would have occurred regardless of media presence). In these instances, changes over baseline in test markets would be compared to changes over baseline in the control markets, and the difference between test and control would be attributed to the media.

Lift over baseline is an effective way to determine holistic impact of media campaigns but can be challenging when there are multiple media channels being run. Also, when the existing baseline is high (or not stable), it can be difficult to get a read on performance. Thoughtful test design is required to help ensure that there are actual findings from the campaign.
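As a rough illustration, the test-vs-control arithmetic behind lift over baseline can be sketched in Python. All figures below are hypothetical weekly conversion counts, not data from any real campaign:

```python
# Hypothetical weekly conversion counts for a lift-over-baseline read
test_baseline = 1000      # test market, pre-campaign
test_during = 1400        # test market, while the campaign is live
control_baseline = 900    # control (holdout) market, pre-campaign
control_during = 990      # control market, same campaign period

# Organic change observed in the control market (no media exposure)
organic_lift_pct = (control_during - control_baseline) / control_baseline  # 0.10

# Expected test-market activity absent media = baseline grown organically
expected_test = test_baseline * (1 + organic_lift_pct)  # 1100

# Incremental activity attributed to the media
media_attributed = test_during - expected_test  # 300

print(f"Media-attributed conversions: {media_attributed:.0f}")
```

The control market absorbs seasonality and other organic swings, so only the lift beyond the control's trajectory is credited to media.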

If both direct and indirect tracking mechanisms are set up for an ad campaign, then the direct and indirect signals can be compared to determine the level of attribution breakage (the amount of activity that is not being captured via direct tracking).
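For example, the comparison of the two signals might look like the following sketch, where all numbers are illustrative:

```python
# Hypothetical figures for one campaign period:
# total incremental activity measured indirectly (e.g., lift over baseline)
# vs. activity captured via the direct CTA (vanity URL, promo code, etc.)
indirect_total = 500   # total campaign-driven conversions (indirect estimate)
direct_tracked = 320   # conversions captured through the CTA

breakage = indirect_total - direct_tracked   # conversions missed by direct tracking
breakage_rate = breakage / indirect_total    # share of activity not directly tracked

print(f"Attribution breakage: {breakage} conversions ({breakage_rate:.0%})")
```

In this illustration, more than a third of campaign-driven activity would be invisible to direct tracking alone.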

Marketing Attribution Challenges by Channel

Direct tracking is straightforward for certain channels like TV or radio. Spike, spot time, or pixel attribution is typically used to measure attribution breakage and supplement the direct signal. Here’s a quick overview of each:

Spike Attribution: Identifying the cause or source of a sudden, significant increase (spike) in website traffic or engagement.

Spot Time Attribution: Attributing specific events or marketing initiatives to changes in website traffic or user behavior during a particular period (spot time).

Pixel Attribution: Embedding a transparent image (or pixel) into a web page, email, or advertisement to track user interactions with online advertisements or web content.

These methods of attribution don’t work equally well across all forms of media. We’ll explain why through the lens of each channel.

Radio Attribution

Radio is often consumed while commuting, so even if demand was generated from a radio ad, often the consumer cannot respond immediately. This makes spot time attribution less effective because response times are less correlated with when the ad ran, and response spikes are flatter. Also, since more time usually passes between when the ad ran and the response, the likelihood of remembering the CTA is lower, so there will likely be more attribution breakage.

Television

Spot time attribution is much more effective with television advertising because the consumer is more likely to be in a position where they can respond immediately. This results in larger activity spikes occurring shortly after the ad runs. This immediacy, as well as the fact that the CTA is visually displayed on screen, also makes it more likely for the consumer to remember the CTA. This strengthens the direct signal and reduces attribution breakage.

Podcast Attribution

Spike attribution is not effective to capture attribution breakage for podcast advertising due to the on-demand consumption of podcast media. Consumers download or stream podcasts on their own timelines and advertisers do not have insight into when the media is being consumed (or if it is consumed at all). Therefore, there are typically not concentrated activity spikes that can be tied back to podcast ads. The direct tracking signal is still viable, but advertisers need alternative ways to measure the attribution breakage.

Originally, pixel solutions (identifying consumers that were exposed to the media and later converted on the advertiser's site) could not be leveraged for podcast attribution due to limitations in how the media was delivered. However, over the last couple of years, pixel solutions have been developed to tie the download of a podcast back to activity on the advertiser's website, which (in theory) can account for attribution breakage.

Two main marketing attribution challenges related to pixels are that:

  • They do not track ad exposure for podcasts: Pixels can determine whether a podcast was downloaded, not whether the consumer was exposed to the ad. This is important because many podcasts are auto-downloaded and never listened to, which can produce a high rate of false-positive matches and/or over-attribution.
  • It's hard to determine the appropriate lookback window: A lookback window is the amount of time from media exposure (or in this case, podcast download) to activity on site. For many digital channels, such as online video and display, the lookback window is set to 7-14 days from media exposure. For podcasts, the window begins at podcast download rather than ad exposure, so a longer window of 30 days is typically used to account for latency between download and listening. As you can imagine, results will vary dramatically based on the lookback window used; oftentimes the volume attributed, and therefore the perceived success of a campaign, depends on it.

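To see how sensitive results are to this choice, here is a minimal Python sketch that counts attributed conversions under different candidate windows. The download and conversion timestamps are hypothetical:

```python
from datetime import datetime, timedelta

# Hypothetical matched events: (podcast download time, on-site conversion time)
events = [
    (datetime(2024, 1, 1), datetime(2024, 1, 4)),    # 3 days later
    (datetime(2024, 1, 1), datetime(2024, 1, 12)),   # 11 days later
    (datetime(2024, 1, 2), datetime(2024, 1, 27)),   # 25 days later
    (datetime(2024, 1, 3), datetime(2024, 2, 20)),   # 48 days later
]

def attributed(events, lookback_days):
    """Count conversions that occur within the lookback window of the download."""
    window = timedelta(days=lookback_days)
    return sum(1 for dl, conv in events if timedelta(0) <= conv - dl <= window)

for days in (7, 14, 30):
    print(f"{days:>2}-day lookback: {attributed(events, days)} attributed conversions")
```

Even in this tiny example, moving from a 7-day to a 30-day window triples the attributed volume, which is why the window choice can make or break a campaign read.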
Influencer Marketing

For influencer marketing, attribution challenges are often lower, and the direct signal is stronger given the consumption behavior; users are typically engaged and consuming through devices such as phones, where the action step is just a click away.

The on-demand consumption makes spike attribution challenging. Additionally, social media platforms will not allow pixels on in-content ads. Lift over baseline is a viable option, but the signal becomes muddied if there are multiple channels running.

Digital Advertising

Traditional digital media also faces marketing attribution challenges. Pixel attribution is very common for digital advertising, but attribution breakage can still occur if media exposure happens on one device and conversion happens on another. Attribution partners try to account for this by using device graphs, but these do not always capture everything. Also, evolving privacy regulations may hamper the effectiveness of this type of attribution in the future.

On the flip side, pixel attribution may over-attribute to digital channels. Within a 7-day period, a consumer is likely exposed to multiple ads on different platforms, and all of those platforms will take credit for the activity. This duplication of conversions leads to inaccurate optimization, so there is the need for additional performance signals.
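A small illustration of this duplication, using hypothetical conversion IDs that each platform's pixel claims credit for:

```python
# Hypothetical: conversion IDs each platform claims via its pixel
# within the same 7-day period
claims = {
    "social":  {"c1", "c2", "c3"},
    "display": {"c2", "c3", "c4"},
    "video":   {"c3", "c5"},
}

total_claimed = sum(len(ids) for ids in claims.values())  # each platform self-reports
unique_conversions = set().union(*claims.values())        # actual distinct conversions

duplicated_credit = total_claimed - len(unique_conversions)
print(f"Claimed: {total_claimed}, unique: {len(unique_conversions)}, "
      f"duplicated credit: {duplicated_credit}")
```

Here the platforms collectively report 8 conversions when only 5 occurred, so naively summing platform-reported results would overstate performance.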

The Attribution Solution

A post-purchase “How Did You Hear About Us?” survey can be a simple yet powerful tool that accounts for attribution breakage, duplicative data, and data needed to optimize campaigns. It can assist with offline attribution but also validate digital advertising efforts. These surveys are also future-proof in the sense that they are not impacted by changing privacy legislation.

This type of survey is a qualitative way of determining how a consumer heard about the brand or product. It can be used to attribute activity back to an originating media and is a fair way to help evaluate the multiple, disparate performance signals across different channels and attribution methodologies.

At the same time, these surveys don’t have to be the only performance or attribution signal. If possible, it should be used in conjunction with and compared to other methodologies.

Earlier we touched on the difficulty of choosing the appropriate lookback window for pixel attribution. User responses to “how did you hear about us?” can be leveraged to calibrate lookback windows. Rather than simply going off of industry standard lookbacks, survey results can be compared to pixel data to determine more appropriate lookbacks by channel. 
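One simple calibration approach (sketched here with hypothetical monthly counts) is to compare pixel-attributed volume under several candidate windows against the survey-attributed volume for the same channel, and pick the closest match:

```python
# Hypothetical monthly counts for one channel
survey_attributed = 210  # conversions whose survey answer named this channel

# Pixel-attributed conversion counts under different lookback windows (days)
pixel_by_window = {7: 140, 14: 190, 30: 260, 60: 340}

# Choose the window whose pixel count best matches the survey signal
best_window = min(pixel_by_window,
                  key=lambda d: abs(pixel_by_window[d] - survey_attributed))

print(f"Calibrated lookback window: {best_window} days")
```

In this illustration, the 14-day window tracks the survey signal most closely, suggesting the industry-standard 30-day podcast window would over-attribute for this channel.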

Surveys can also be stood up relatively quickly, making them an attainable solution for marketing attribution challenges.

There are several ways this survey can be delivered, but it's typically served post-purchase within the sales funnel. Common alternatives are serving it on the company's website or through email after a purchase. Here are some challenges with these alternate approaches:

  • Site visit: Can disrupt user flow and experience, and it is sometimes challenging to track the response through to an actual conversion.
  • Post-purchase via email or text: Response rates are typically very low, especially if the respondent is not incentivized to answer. Also, more time has passed from media exposure to conversion, so it may be harder for the respondent to remember accurately.

Post-purchase, within the sales funnel (at checkout), is preferred because it's the least disruptive to the consumer experience, you know the respondent actually made a transaction, and response rates are typically higher: 40-60% for post-purchase surveys at checkout vs. single-digit response rates for email surveys.

This qualitative data is self-reported, so there is some skepticism regarding its accuracy. But based on extensive historical analysis, if set up correctly, "how did you hear about us?" surveys are reflective of the impact that media is having on the business.

Conclusion

There is no perfect solution for marketing attribution challenges. Each medium deserves its own unique and thoughtfully planned out attribution model to help advertisers best gauge results of active campaigns.

At the same time, “how did you hear about us?” surveys are an often-overlooked tool that, if used properly, can work in conjunction with other methodologies to corroborate performance signals. Not only are they helpful in cross-channel measurements, but they’re directly representative of the voice of the consumer.

About the Author 

Brian Kim is the Director of Data Science and Modeling at Havas Edge. With over 14 years of working in the advertising industry, Brian has developed a mastery of data analytics and attribution. His personal mantra (or as he likes to say, "battle cry") is to "manage by the numbers." Brian's analytic, numbers-driven approach to partnerships has branded him as a source of truth for many of the businesses he works with.
