October 24, 2024
Analytics are critical to understanding the story of your customers' on-site experiences. Digital experience intelligence (DXI) tools like Fullstory or Contentsquare are great at answering questions about those experiences and much, much more. Here at CXperts, we like to use insights from these tools as much as we can to generate meaningful site updates and test ideas.
However, some companies may not have these tools installed or the budget to purchase them. Low- or no-cost DXI tools like Hotjar or Microsoft Clarity do exist and can provide good insights, but they can be cumbersome to set up, especially if all you need is a few quick measurements of specific user behavior.
But these DXI tools aren't your only option for answering questions like "what percentage of users used site search?" or "are users clicking the static or the sticky add-to-cart button more?"
One really quick and easy way to generate specific analytics data is to run an A/A test.
What’s an A/A test, you ask? It’s a type of experiment where two identical versions of a webpage or element are tested against each other to verify if the testing platform is working correctly.
A/A tests are usually run only when initially setting up a testing platform. But there's a secondary use case: you can also set up goals in an A/A test and use the data collected to understand user behavior.
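To make the verification part concrete, here's a minimal sketch of the check a testing platform effectively performs on an A/A test. The traffic and conversion numbers below are hypothetical; the point is that since both arms serve the identical page, a two-proportion z-test on their conversion counts should show no significant difference.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how large is the gap in conversion rate
    between two variants, relative to normal sampling noise?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical A/A results: both arms served the identical page.
z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=465, n_b=10_000)

# |z| < 1.96 means no significant difference at the 95% level,
# which is exactly what a healthy A/A test should show.
print(f"z = {z:.2f}, platform looks healthy: {abs(z) < 1.96}")
```

If the two identical arms did show a significant difference, that would point to a problem with traffic splitting or goal tracking in the platform itself.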
Many A/B testing platforms have some user-friendly, visual way of setting up interaction goals that doesn't require any tagging or code changes. For example, tracking interactions on a button is as simple as clicking on the element in the Visual Editor in the AB Tasty platform:
What you can track out of the box will depend on the A/B testing platform you're using. Some may only allow you to track clicks, while other tools offer more sophisticated setups. For example, AB Tasty has a plugin that can track whether an on-page element becomes visible to users, such as when it scrolls into view. This can be useful if you're trying to figure out what percentage of users see an element farther down the page.
Also, depending on your A/B testing platform, you might be able to see how users who performed a certain action on the site go on to perform downstream actions, like adding to cart or even purchasing.
Going back to AB Tasty, it's possible to filter results for users who performed (or didn't perform) a certain action in the A/A test's report readout. In the screenshots below, you can see the conversion rate for users who clicked on the global search bar:
In the example above, we can see that users who clicked on the search bar converted at a higher rate than the overall average. Knowing that search bar engagement correlates with a higher conversion rate (CVR) suggests follow-up tests to increase engagement with the search bar, such as changing its style treatment or location, or even testing how the search itself functions.
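If your platform doesn't expose this kind of segment filtering, the underlying math is simple to reproduce yourself from exported event data. Here's a minimal sketch, assuming a hypothetical session log where each session carries `clicked_search` and `purchased` flags (the field names and data are illustrative, not from any particular platform's export format):

```python
# Hypothetical session log: one record per session, flagging whether
# the user clicked the search bar and whether they purchased.
sessions = [
    {"clicked_search": True,  "purchased": True},
    {"clicked_search": True,  "purchased": False},
    {"clicked_search": False, "purchased": False},
    {"clicked_search": False, "purchased": True},
    {"clicked_search": False, "purchased": False},
]

def conversion_rate(records):
    """Share of sessions that ended in a purchase."""
    if not records:
        return 0.0
    return sum(r["purchased"] for r in records) / len(records)

# Compare the search-engaged segment against all sessions.
searchers = [s for s in sessions if s["clicked_search"]]
overall_cvr = conversion_rate(sessions)
search_cvr = conversion_rate(searchers)
print(f"overall CVR: {overall_cvr:.1%}, search-engaged CVR: {search_cvr:.1%}")
```

The same pattern extends to any interaction goal you tracked in the A/A test: filter sessions by the behavior, then compare the segment's conversion rate to the overall rate.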
To conclude, you're not SOL if you don't have a DXI platform; there are ways to get the data you need to validate test ideas and hypotheses. That said, an A/A test is not a permanent substitute for generating insights and test ideas. The one major upside of analytics and DXI tools is that they let you explore data without a specific goal in mind.
Many DXI tools have session recordings where you can watch real, anonymized recordings of user sessions to identify testing opportunities. For example, our clients who leverage our CRO services (CX Optimization) and analytics/user research services (CX Insights) are consistently able to use the insights generated from CX Insights to inform test ideas.
If you’re interested in learning more about how to use A/A tests or need help in gaining more user behavior insights to drive your optimization efforts, feel free to reach out to us!