The Benefits of Experimental Advertising
Most CMOs understand the benefits of A/B testing: companies everywhere will happily run comparisons on email subject lines, landing pages, website colors, and CTAs. When it comes to digital advertising, however, that experimental instinct goes largely unused.
According to a 2018 joint study between Northwestern University and Facebook, only about 12.7 percent of the nearly 7,000 companies surveyed had run an experiment with a randomized controlled trial (RCT) in the past year. By contrast, nearly 80 percent of companies run A/B tests on their websites, and just under 60 percent have run A/B tests in their emails.
The same Facebook and Northwestern study indicated that companies that experiment with their advertising see a revenue bump from their ad campaigns of roughly two percent per experiment run. E-commerce firms that ran 15 experiments in a year saw their ROI increase by 45 percent over the same period.
This divide can’t be explained by a lack of efficacy. With the necessary tools within reach and proven results behind them, why are brands failing to improve their testing practices?
Resistance to New Information
People tend to be unreceptive to new information that contradicts their existing conceptions. Take basketball, for example — the three-point line was added in 1979, but as recently as Michael Jordan’s final season with the Bulls, teams were only averaging 12.7 three-point attempts per game. Fast forward to 2019, and the Houston Rockets attempted a staggering 44.8 three-point shots per game.
What changed? The data supported shooting from beyond the arc — teams only have to be 2/3 as accurate to score the same number of points — but it took a data-minded coach like Daryl Morey to fully embrace the implications of the advanced analytics.
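The arithmetic behind that two-thirds claim is worth making concrete. A quick sketch, using an illustrative 50 percent two-point shooter (the numbers are hypothetical, not from the study):

```python
# Expected points per shot = point value x make probability.
two_pt_accuracy = 0.50                       # hypothetical mid-range shooter

# A three-point shooter scores the same expected points
# at only two-thirds of that accuracy.
break_even_three = two_pt_accuracy * 2 / 3   # ~0.333

expected_two = 2 * two_pt_accuracy           # 1.0 point per attempt
expected_three = 3 * break_even_three        # 1.0 point per attempt

print(expected_two, expected_three)
```

Any accuracy above that break-even point makes the three-pointer the more valuable shot, which is exactly what the analytics showed.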
Executives and marketers are reluctant to try a new strategy when their current strategy seems to be “working,” even when the data indicates that the new strategy is more lucrative or efficient.
To make real change, you’ll need to establish a culture of following where the data leads, even if it takes you into uncharted territory.
Aversion to a Control Group
In order to conduct an RCT, you need a control group. In the case of the Facebook study, researchers served 75 percent of their target audience with an ad while the remaining 25 percent were served the ad from the next highest bidder.
This methodology is necessary to determine the effectiveness of your ads, but it also means forgoing 25 percent of your target market for the duration of the test. The potential gains from a successful campaign exceed what’s lost to the holdout, but many executives are hesitant to sign off on running only three-quarters of a campaign.
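The core mechanic of an RCT like the one above is just a random holdout. A minimal sketch of the idea, with made-up audience sizes and conversion rates (in a real experiment these come from observed behavior on the ad platform):

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

def assign_groups(audience, control_share=0.25):
    """Randomly hold out a control group that won't see your ad."""
    control, treatment = [], []
    for user in audience:
        (control if random.random() < control_share else treatment).append(user)
    return treatment, control

audience = list(range(10_000))
treatment, control = assign_groups(audience)

def simulate_conversions(group, rate):
    """Hypothetical stand-in for observed conversions in each group."""
    return sum(1 for _ in group if random.random() < rate)

# Assumed rates: 3.1% conversion with the ad, 2.5% without.
lift = (simulate_conversions(treatment, 0.031) / len(treatment)
        - simulate_conversions(control, 0.025) / len(control))
print(f"estimated lift from the ad: {lift:.4f}")
```

Because assignment is random, any difference between the two groups beyond sampling noise can be attributed to the ad itself, which is what makes the 25 percent holdout worth its cost.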
Overestimating the Difficulty
A significant reason marketers hesitate to conduct advertising experiments is that they overestimate the complexity, cost, and difficulty of doing so. In reality, today’s digital advertising tools make experimentation easier than ever.
Facebook’s measurement tools let advertisers see demographics and other detailed analytics, trace which ads are contributing most to a campaign, and A/B test copy and imagery to find out which resonates most with customers. Most social media ad platforms offer similar tools, and many email clients include A/B testing features that send multiple versions of the same email automatically.
How to Get Started
If you want to transition your advertising strategy to include experimentation, you’ll want to consider the following:
Get Approval From the Top
Any major shift in your advertising strategy will need approval from the executive level. The data supports experimental advertising, so make your case with it: position the value in terms of hard numbers. C-suite executives are busy people and tend not to be concerned with the reasoning behind your strategies as long as they’re working.
Create a Data-Centric Culture
Earlier in this article, we mentioned that the Houston Rockets have had tremendous success leaning on the three-point shot — but their strategy would have failed if not for the top-to-bottom embrace of an analytics-based approach by the management and coaching staff.
If your advertising experiments are to succeed, you’ll need the same top-to-bottom commitment. When the results tell you to change your approach, the decision-makers in your marketing department need to act on them rather than second-guess them. Marketers and advertisers often make emotional decisions, building copy and targeting audiences based on what they “feel” will work; for experimental advertising to deliver results, it must be grounded purely in data.
Be Patient
Advertising experiments require time; you may need to allow several months between launching a campaign and seeing meaningful results. Waiting for data to accumulate while a timely marketing decision looms can be challenging, but patience is essential for meaningful insights. The cost of rushing results may outweigh the cost of the experiment itself.
The Bottom Line
Advertising experiments are a significant departure from the typical playbook of many advertisers and marketers, but when executed correctly they produce valuable insights and drive business growth. If you need experienced marketing experts to help develop an experimentation strategy, talk to Madison Taylor Marketing. We can help you create a strategy that takes advantage of modern marketing techniques to generate the best return for your marketing dollars.