The process of A/B testing uses statistical analysis to measure ad performance and consumer preference. Originally used in email marketing to test subject lines, colours, and design, A/B testing has since grown in scope and is now an important part of most advertising campaigns.
While A/B testing can’t provide a quick fix for engagement problems, it sheds light on what is and isn’t working. Pouring effort into the same old marketing campaign is fruitless if there’s no return and no conversion, and A/B testing helps you understand where to turn your focus.
Conversion Rate Optimisation
To optimise and customise ads, tech marketers should focus on conversion rates when evaluating A/B campaigns. For example, did the ad with the illustration gather more traffic and click-throughs than the ad with the stock image? Take note of that variation, along with the medium, and file it away for future implementation.
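As a minimal sketch of the comparison above (the ad variants and counts here are hypothetical), comparing conversion rates between two ads is straightforward arithmetic:

```python
# Hypothetical impression and click counts for two ad variants
illustration_ad = {"impressions": 5000, "clicks": 240}
stock_photo_ad = {"impressions": 5000, "clicks": 185}

def conversion_rate(ad):
    """Click-through (conversion) rate as a percentage."""
    return 100 * ad["clicks"] / ad["impressions"]

for name, ad in [("illustration", illustration_ad), ("stock photo", stock_photo_ad)]:
    print(f"{name}: {conversion_rate(ad):.2f}%")  # 4.80% vs 3.70%
```

Logging results in this structured way makes it easy to file the variance away, per variant and medium, for future campaigns.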
Before you can effectively A/B test and gather conversion numbers, however, you have to evaluate and determine the following:
- On Site Or Off Site? To begin A/B testing, first decide whether you’ll be testing on site or off site, based on your goals. On-site testing covers sales tools, documents, and web pages, while off-site tests usually involve ads published through external hosts. You can also combine the two: first test ad copy and creative on an off-site platform, then test the page you’re sending leads to on site.
- Variables. Once you choose your testing location, create a list of all the variables you’re hoping to test. If you’re testing the call to action, for example, you might adjust the location of the call to action, the copy, and the button color.
- Sample Size. After you map out the variables to test, choose your sample size. Many marketers do this by evaluating daily site traffic and targeting segments of that traffic with A/B tests. The larger the sample size, the smaller the margin of error and the more confident you can be that the results reflect the behaviour of that demographic.
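The sample-size point above can be sketched with the standard proportion formula, n = z² · p(1 − p) / e². This is a simplified model (real calculators account for more factors), and the numbers are purely illustrative:

```python
import math

def sample_size(margin_of_error, z=1.96, p=0.5):
    """Minimum sample size per variant for a given margin of error,
    using n = z^2 * p * (1 - p) / e^2 with the most conservative
    assumption p = 0.5 (z = 1.96 corresponds to 95% confidence)."""
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

print(sample_size(0.05))   # +/-5% margin -> 385 people per variant
print(sample_size(0.025))  # halving the margin roughly quadruples it: 1537
```

Note the inverse-square relationship: each halving of the margin of error requires roughly four times the traffic, which is why low-traffic sites need to run tests for longer.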
Gathering significant data on these variables to guide future ad creation may take weeks or longer, depending on how quickly results come in. It’s a long game of waiting and evaluating, but it can make marketing efforts far more effective at reaching tech users.
“It’s important to understand the scientific method applied to conversion optimisation (or CRO) as well as the statistical analysis necessary to make decisions on your test,” says Jessica Matthew, director of marketing at Experts Exchange. “Without a clear understanding of these two things, you could potentially be wasting your company thousands of dollars by testing things you shouldn’t be testing, as well as making the wrong calls on winners and losers.”
User Flow Effectiveness
To create ads built for high conversion, you have to focus on making the ad as strong as it can be.
Native ad optimisation lies in good design and copy. Fail to capture attention with the design or highlight value within the copy, and the ad will be overlooked. Native ads face the challenge of naturally blending into the page, so standing out and making an immediate impact is paramount. Work with your team to develop solid design and copy for an unforgettable, eye-catching ad, then A/B test headlines, images, colours, and copy to learn what your audience is responding to.
Optimising display ads hinges on design and placement. Busy ads with too much copy or an uncompelling image likely won’t convert. Display ad trends currently lean toward clean lines, simple images, and minimal copy. Which type of imagery works best, illustration or photo, depends largely on the platform. Facebook, for example, recommends limited copy and an illustration-style design for higher ad conversion rates. A/B test this theory by running two Facebook ads, one with an illustration and one with a stock photo. Then test placement. A strong ad running in the rail of a web page may perform differently than the same ad running above the fold. Test placement as well as design style to determine where your audience is looking and where you need to reach them.
The final ingredient of effective online advertising is to test and capitalise on landing page design. Landing pages are an integral part of the inbound marketing process and a key ingredient in both native and display ad optimisation; their purpose is to gather contact information from potential leads. If the CTA link in a native or display ad doesn’t take users to a product-specific landing page, the ad is already set to underperform on conversion. Likewise, if the landing page’s sign-up form is buried below the fold, or the value of the gated download isn’t immediately apparent, the page won’t effectively capture possible conversions. Best practices for landing pages include clear branding, value in the headline, simple design, and minimal copy.
Once you’ve designed ads and run A/B tests, the data will show whether your hypothesis held and whether you reached your goals. What was your hoped-for result? You’ll be able to see, side by side, what each ad achieved.
Most tech marketers rely on significance calculators to see whether the data they’re analysing has reached genuine statistical significance, in other words, whether the results are worth acting on. Confidence in test results is expressed as a percentage, and the appropriate method depends on what you measured (a sample of a population, a user group, or an effect size). Dig into the significance calculators your company uses and understand what they’re telling you before you implement changes to ads and campaigns.
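Under the hood, many of these calculators run something like a two-proportion z-test. Here is a minimal sketch (the click and impression counts are hypothetical, and 1.96 is the standard threshold for 95% confidence):

```python
import math

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """z statistic for the difference in conversion rate between
    variants A and B, using the pooled-proportion standard error."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Variant A: 240 clicks from 5,000 impressions; variant B: 185 from 5,000
z = two_proportion_z(240, 5000, 185, 5000)
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
```

A |z| above 1.96 means the gap between the two ads is unlikely to be chance at the 95% level; commercial calculators wrap exactly this kind of result in a friendlier report.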
Visual Website Optimizer (VWO), for example, is a popular testing tool that helps marketers tweak and personalise both web and landing pages for targeted A/B testing. The tool provides analytics and reporting to decipher the results of those customisations, and its visual editor makes it easy for non-developers to make quick changes to their site for testing purposes.
Another tool, Optimizely, is touted as an “experimentation platform” and focuses on personalising online experiences from first impression through engagement. It claims to minimise the margin of error in statistical reporting by gathering data over longer periods of time.
Leverage tools like these to gain a deeper understanding of whether high conversion rates reflect real statistical effects rather than chance. The statistics behind A/B tests will help you dissect the behaviours and preferences of the tech user demographic, and that mathematical insight will help you reach users where it counts, with ads that deliver conversions.