What to Do If A/B Testing Fails to Improve Conversions?

How the end-user experience is affected by the responsiveness and performance of the pages

A/B and multivariate testing are often used to improve the conversion funnel. These tools randomly serve alternative images, text, or other design elements and gather statistics about how each variation affects site visitors. Companies have had great success with such solutions, but sometimes multiple rounds of testing still produce inconclusive data: changing the color or an image on a page has no significant impact on the overall conversion rate.
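
To make this concrete, here is a minimal sketch of what an A/B testing tool does under the hood: it deterministically buckets each visitor into a variant and compares conversion rates afterwards. The variant names and the 50/50 split are my own illustrative assumptions, not any particular product's API.

```python
import hashlib

# Hypothetical variants for illustration; a real tool manages these for you.
VARIANTS = ["control", "alternative_image"]

def assign_variant(visitor_id: str) -> str:
    # Hash the visitor ID so the assignment is stable across page loads
    # and roughly uniform across variants.
    bucket = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16) % len(VARIANTS)
    return VARIANTS[bucket]

def conversion_rate(conversions: int, visitors: int) -> float:
    return conversions / visitors if visitors else 0.0

print(assign_variant("visitor-42"))   # same visitor always gets the same variant
print(conversion_rate(31, 1000))      # 0.031, i.e., 3.1%
```

An inconclusive test is simply the case where the conversion rates of the variants end up statistically indistinguishable, no matter how many images you swap in.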

Does this mean there is no way for that business to improve the conversion funnel? In these cases, marketing and other business stakeholders often think, "We have chosen the wrong images. Let's try some more!" and a new test cycle is started. But I think we should consider other factors, specifically how the end-user experience is affected by the responsiveness and performance of these pages. I will explain how to do this in four steps:

Step 1: See the conversion funnel
First, we need to select the pages we are interested in and chart how many visitors reach each of them, as shown in the screenshots below.
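
As a rough sketch of what sits behind such a chart, the snippet below counts distinct visitors per funnel page from a hypothetical list of (visitor, page) page-view records; the page names are assumptions based on the example funnel used in this article.

```python
# Hypothetical funnel pages, matching the example used in this article.
FUNNEL = ["search", "login", "booking"]

def funnel_counts(page_views):
    # Count distinct visitors per page so repeat views are not double-counted.
    seen = {page: set() for page in FUNNEL}
    for visitor_id, page in page_views:
        if page in seen:
            seen[page].add(visitor_id)
    return {page: len(visitors) for page, visitors in seen.items()}

views = [("v1", "search"), ("v2", "search"), ("v1", "login"), ("v1", "booking")]
print(funnel_counts(views))  # {'search': 2, 'login': 1, 'booking': 1}
```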

Step 2: Measure the abandonment rate
As the following screenshot shows, the most visited page is the 'search page,' which is expected: customers run more searches than logins or bookings. But does that alone explain the drop between the number of searches and the number of logins, for example? Or are some visitors abandoning the site? Next, we need to create a chart showing how many customers abandoned the site at each step, as shown by the "Exit page count" below.
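
The arithmetic behind that chart is straightforward: the abandonment rate of a step is its exit count divided by its visit count. A small sketch, using made-up numbers in place of the ones from the screenshot:

```python
# page: (visit_count, exit_page_count) - illustrative numbers only
funnel = {
    "search":  (10000, 2500),
    "login":   (4000, 1200),
    "booking": (2000, 900),
}

for page, (visits, exits) in funnel.items():
    print(f"{page:8s} abandonment: {exits / visits:6.1%}")
```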

Step 3: See if end-user experience impacts conversions
Now we know that visitors are leaving on certain pages, but we are still guessing why. This is where A/B testing solutions usually come into play, and we assume the images have something to do with it. But could it be that poor user experience and a lack of page responsiveness are the real issue? APDEX is a standard used to quantify exactly this aspect of user experience: a normalized value between 0 and 1, where 1 reflects a superior experience. Let's test the hypothesis with a User Experience Management solution that captures APDEX scores and see whether poor user experience explains the low conversion rate. The screenshot below shows the APDEX scores for each step of the funnel. Now we know the poor user experience on the last two steps has an impact on the abandonment.
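
The APDEX calculation itself is simple enough to sketch. Per the Apdex standard, responses up to a threshold T count as satisfied, responses up to 4T as tolerating, and everything slower as frustrated; the score is (satisfied + tolerating / 2) / total. The threshold below is an assumption you would tune to your users' expectations.

```python
T = 2.0  # seconds; assumed threshold, tune to your application

def apdex(response_times_s):
    satisfied = sum(1 for t in response_times_s if t <= T)
    tolerating = sum(1 for t in response_times_s if T < t <= 4 * T)
    return (satisfied + tolerating / 2) / len(response_times_s)

# A mostly fast page with a few slow outliers still scores poorly.
print(round(apdex([0.8, 1.2, 1.9, 3.5, 6.0, 9.5]), 2))  # 0.67
```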

Step 4: Find the root cause of the user experience issue
Now we know that changing an image will not solve our users' problem. We should find out what is really making the experience so bad: is it a slow response time, or do we have errors on these pages?

Because we have a User Experience Management solution in place, we can distinguish whether the problem is an error, as on the last step of the conversion funnel in this screenshot, or a slow response, as on the penultimate step.
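
A rough triage along those lines can even be automated. The sketch below assumes hypothetical per-visit records with an error flag and a response time; the field names and the 5% thresholds are made up for illustration.

```python
T = 2.0  # same assumed APDEX threshold as above

def diagnose(visits):
    error_share = sum(1 for v in visits if v["js_error"]) / len(visits)
    slow_share = sum(1 for v in visits if v["response_s"] > 4 * T) / len(visits)
    if error_share > 0.05:
        return f"errors on {error_share:.0%} of visits -> fix the errors first"
    if slow_share > 0.05:
        return f"{slow_share:.0%} of visits are frustratingly slow -> tune performance"
    return "neither errors nor slowness stands out"

visits = [{"js_error": True, "response_s": 1.0},
          {"js_error": False, "response_s": 9.0},
          {"js_error": False, "response_s": 1.5}]
print(diagnose(visits))  # errors on 33% of visits -> fix the errors first
```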

To fix the problem with our conversion rate, we need to consult the developers and give them the data they need. Because we have wisely chosen a solution that captures deep-dive information, including the full visit, JavaScript errors, click paths, browser version, and so on, the developer has everything needed to fix the problem the first time, without having to re-create the issue or wait for it to happen again.
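
To give an idea of what "deep-dive information" means in practice, here is the kind of record such a solution might hand to a developer; every field name and value below is an illustrative assumption, not a specific product's data model.

```python
visit = {
    "visitor_id": "v1",
    "browser": "Firefox 24 / Windows 7",
    "click_path": ["/search", "/login", "/booking"],
    "js_errors": [{"page": "/booking", "message": "TypeError: x is undefined"}],
    "response_times_s": {"/search": 0.9, "/login": 1.4, "/booking": 8.7},
}

# With the full click path and the exact error in hand, the developer can
# reproduce the failing step directly instead of waiting for it to recur.
print(visit["js_errors"][0]["message"])
```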

If you're looking for more information about actionable data provided by monitoring solutions, please have a look at my previous blog "Fact Finders: Sorting out the truth in Real User Monitoring."

More Stories By Klaus Enzenhofer

Klaus Enzenhofer has several years of experience and expertise in the field of Web Performance Optimization and User Experience Management. He works as a Technical Strategist in the Center of Excellence Team at dynaTrace Software. In this role he influences the development of the dynaTrace Application Performance Management Solution and the Web Performance Optimization Tool dynaTrace AJAX Edition. He gathered most of his web and performance experience by developing and running large-scale web portals at Tiscover GmbH.
