
What I Learnt About A/B Testing by A/B Testing my Landing Page

by Donald NgJanuary 3rd, 2020

Too Long; Didn't Read

The landing page of a SaaS is one of its most important aspects, if not more important than the application itself. An A/B test is an experiment that presents two or more versions of a given test subject, each shown to a portion of your total audience. Without a proper tool, you have to do everything manually. In this post, I share my experience with and without A/B testing on the Howuku landing page, and how adopting A/B testing improved my landing page conversion rate by 60%.


In this post, I will share my experience with and without A/B testing on the Howuku landing page, and how adopting A/B testing improved my landing page conversion rate by 60%.

We can all agree that the landing page of a SaaS is one of its most important aspects, if not more important than the application itself.
A landing page is essentially the storefront of your SaaS, and if you get it right, it pays dividends by acting as a self-promoting machine for your audience.

Therefore, it is really important to keep your landing page content appealing to your audience and to sell them on the idea of your SaaS.

If you’ve ever run a heatmap report on your landing page, you would know that most visitors do not scroll past the headline of a website. That means it is crucial to pick your words carefully, giving your audience a reason to scroll down and find out more. Otherwise, your headline might be the last words a visitor ever reads on your site.

(Spoiler alert: this is really bad for conversion)

I believe everyone who runs a website has been through the phase of constantly refining website copy, hoping to find the version that best resonates with your audience’s pain points and gives them a reason to stay a little longer and eventually convert into a paying customer.

Without a proper A/B testing tool, you have to do everything manually: updating the experiment from time to time, keeping track of conversion rates, recording the results somewhere else (probably in an Excel sheet), and analyzing what you have collected.

But what exactly is an A/B test?

At its core, an A/B test is an experiment that presents two or more versions of a given test subject, each shown to a portion of your total audience.

  • Control: This is the original copy of your landing page.
  • Variation: Since you cannot assume that the control version is optimal, you need to create new versions that contain a distinct variation to find out if these variations could outperform the original.

A/B testing determines which of your test variations performs best based on each one’s conversion rate.

Ideally, you should test one simple component at a time (e.g. the headline, the main CTA button, one piece of marketing copy), as this eliminates the problem of too many interfering factors and makes it easier to identify which variation is actually effective.
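The mechanics above boil down to two small pieces: randomly splitting visitors between a control and its variations, and comparing each bucket's conversion rate. Here is a minimal sketch in Python (the numbers are illustrative, and the function names are my own, not from any particular tool):

```python
import random

VARIANTS = ["control", "variant_1"]

def assign_variant(rng=random):
    # Each incoming visitor has an equal chance of seeing either version.
    return rng.choice(VARIANTS)

def conversion_rate(conversions, visitors):
    # Conversion rate is simply conversions divided by visitors.
    return conversions / visitors if visitors else 0.0

# Example: compare the buckets after collecting some results.
results = {
    "control":   {"visitors": 1000, "conversions": 50},
    "variant_1": {"visitors": 1000, "conversions": 120},
}
for name, r in results.items():
    print(name, conversion_rate(r["conversions"], r["visitors"]))
```

The key point is that both buckets run at the same time, so they draw from the same pool of traffic.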

Manual A/B testing

Before I was introduced to A/B testing, I would go around gathering inspiration for marketing content and copy, then test it out on my website to see if it actually worked.

By testing I mean I would keep track of each tested variation’s conversion rate in an Excel file. These tests were usually a simple change to the headline of my landing page.

The process usually goes like this…

  • Control: My original headline, after running for a month or so, was seeing a modest conversion rate of 5%. To further optimize my conversion rate, I created a new challenger, Variant 1.
  • Variant 1: I spun up a headline with a value-oriented statement; after running it for two weeks, I was getting a conversion rate of 12%, which is pretty good compared with the original copy. Ideally, you retire the underperformer and keep refining the test to achieve higher conversion rates. So I added a new challenger, Variant 2.
  • Variant 2: I quickly came up with a new headline and let it run for another two weeks. Did it perform any better? No; it actually performed the worst of all three, with a mere 3% conversion rate!

My initial thought was that this was no big deal, since I could always go back to the best-performing copy, right?

Plot twist: my website conversion rate did NOT improve, and I always ended up reverting to Variant 1!

I quickly came to the realization that more than just a headline change was happening during each time frame; there were other variables I did not account for, such as the quality of traffic, the conversion sample size, and other unexpected events.

Problems with Manual Testing

During different periods of my tests, I might also have been experimenting with other growth methods, such as:

  • Experimenting with Google Ads campaigns by changing my targeted audience or ad copy
  • Promotion in startup communities such as Indie Hackers, BetaList, or Product Hunt
  • Receiving traction from SaaS groups, Reddit, etc.
  • Other alternative growth methods, such as experimenting with Bing’s free ads credit

I realized each of those tests ran in a different time period, so the final conversion rate varied according to whatever unexpected events were happening at that particular time.

The hypothesis is that tests should run in parallel so that results come from the very same targeted group of audiences; this way, we eliminate the variables of different events, traffic sources, and visitor quality.

The Pain Points of A/B Testing

I was planning to build one for myself because, as a cheapskate, I thought:

“I am a developer, why not build one myself. How hard could it be?”

Also, it would be a great complement to my current stack in Howuku, offering a complete CRO solution.

So I talked to a few CRO experts to learn their pain points and see whether I could provide value over and above the existing A/B testing tools on the market.

I came to learn that A/B testing tools are relatively costly.

One CRO manager told me:

“Cost is a limiting factor. I use Optimizely but I’ve heard it’s quite expensive”.

I looked around the internet and found Optimizely to be the gold standard of enterprise-grade A/B testing tools.

My curiosity brought me to the Optimizely website to find out their pricing, but they were not very transparent about it.

So I dug deeper and read other people’s reviews, and to my horror, I found the minimum entry price would cost a whopping $50,000 per year!

“Wait a minute, I can do that!”, I said to myself.

So I spent a month working on it and came out with a working MVP to test my hypothesis on the Howuku landing page.

Running an A/B Test

I set up my first test to find out how much changing a single headline would affect my conversion rate (sign-ups). I ran an A/B/C test with the following setup:

  • Control: A better way to understand user behaviors and better UX
  • Variant 1: Create a high performing user experience for website
  • Variant 2: The easiest way to understand your website visitors

All incoming visitors to my landing page have a 33% pseudo-randomized chance to see each of the headlines stated above.
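A sticky pseudo-random split like this can be sketched by hashing a visitor ID into one of three buckets, so each visitor always sees the same headline on repeat visits. This is only an illustration of the idea; Howuku's actual implementation is not described here:

```python
import hashlib

HEADLINES = [
    "A better way to understand user behaviors and better UX",  # Control
    "Create a high performing user experience for website",     # Variant 1
    "The easiest way to understand your website visitors",      # Variant 2
]

def headline_for(visitor_id: str) -> str:
    # Hashing the visitor ID gives roughly a 33% chance per bucket,
    # while keeping the assignment stable for any given visitor.
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(HEADLINES)
    return HEADLINES[bucket]

# The same visitor gets the same headline on every visit.
assert headline_for("visitor-42") == headline_for("visitor-42")
```

Stickiness matters: if a returning visitor saw a different headline each time, conversions could not be cleanly attributed to one variant.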

After a month of experimenting with A/B testing, Variant 1 is winning the competition convincingly, and I am really amazed to see such results.

If I were to make some assumptions as to why the result would turn out this way, it would be that Variant 1 is a more value-oriented statement.

Visitors were able to see the end goal or benefit of using my service, which is to create a website with a really good user experience that converts more users into customers.

On the flip side, the value proposition of Variant 2 is not clear, and it does not explain why visitors should care.

The Control sits somewhere between the two, but the statement itself feels long and is not as straightforward as Variant 1. Hence it got a mixed result: neither bad nor optimal.


The A/B testing result has been a great success for my landing page so far. It is almost magical to see how the result plays out in a statistically significant fashion.
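One common way to check whether a winner really is statistically significant is a two-proportion z-test on the control and challenger conversion rates. A minimal sketch with made-up numbers (not my actual experiment data):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic comparing two conversion rates (pooled standard error)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical: control converts at 5%, challenger at 12%,
# 1,000 visitors in each bucket.
z = two_proportion_z(50, 1000, 120, 1000)
# |z| > 1.96 means the difference is significant at the 95% level.
print(round(z, 2), abs(z) > 1.96)
```

This is why sample size matters: with only a handful of visitors per bucket, even a large gap in conversion rates would not clear the 1.96 threshold.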

Anyway, I shall let the experiment run for perhaps another month to see if I can get anything new from it.

I would definitely suggest that anyone with a landing page try A/B testing and see if they can improve on their current conversion rates.

Try adding a new variant of your website and experiment to see whether a slight variation performs even better.

At Howuku, we have recently launched a free A/B testing tool that comes with a visual editor to help you quickly spin up a new variation of your website.

Howuku Analytics is a business analytics platform that tracks user interaction via on-site feedback, heatmaps, recordings, and landing page A/B testing.

Register a new account and get a 14-day free trial on any of our plans today!

(Disclaimer: The author is the Founder at Howuku Analytics)

Originally published here