• User Research

23rd Mar 2016

7 min

Visitor replay tools have been around for a while and they have proven their value in the conversion industry. In this article, I will examine some of the things you should be doing with these tools before revealing the elusive HotJar integration solutions that tag recordings with your split test variations for two of our partner testing tools, VWO and Optimizely. Being able to easily identify and analyse your variation recordings can not only help you spot issues with your test variants, but can also help with test analysis and ideation for future CRO tests.

 


What are visitor recording tools?

Visitor recording and replay tools allow you to record real users as they use your site. These tools first became popular through usability and quality assurance (QA) exercises, and later gained traction amongst conversion optimisers as an insight generator for testing hypotheses. Visitor recordings also add a further qualitative layer to your analytics data and can support other forms of user experience research (such as moderated sessions). Their true value is the ability to provide near real-time insight from onsite users, often leading to a leaner testing process.

Though such tools have value, I would recommend using them on an as-needed basis, because mining the level of data they produce can be very labour-intensive (and costly). One of the best ways to use them is to examine areas of high exit identified in your analytics solution, using the recordings to provide insight into why this could be happening.

There are many companies out there offering whole suites of tools for conversion optimisers, some better and more commonly used than others.

I personally like HotJar: alongside screen recording it also offers voice-of-the-customer surveys, heatmaps, form analytics and recruitment for what I believe to be a very reasonable price. That is the reason I have put the time into curating and developing the solutions below.

How should you use them?

QA

Using these tools for additional and leaner QA can be extremely helpful. Despite everyone’s best efforts, quality issues on a website can always pop up. They may be sporadic in nature and found only in niche user journeys, but that doesn’t mean they should be ignored. Visitor recordings can help you identify errors in processes and fix them. Ensuring your website is bug-free can have a surprising effect on your conversion rate.

Usability testing

Recordings can support your current usability testing. Examining recordings for repetitive or confused behaviour can highlight usability issues on your site that lead to user frustration. Usability is a key component of user experience, and any improvement made to it is likely to have a positive impact, not just on conversion rates, but on the overall perception of your online experience (and, by association, your brand).

Hypothesis generation

Analytics can provide quantitative data and flag targets for investigation. Visitor recordings can then be used alongside both remote and moderated user research to provide the qualitative information that helps you rationalise test hypotheses that truly make a difference.

Taking it to the next level

 


 

It’s easy to see the value of visitor recording tools as a hypothesis generator, a facility for further QA and as part of your usability testing on live sites. I’d go a step further and say that a tool such as HotJar is now as essential to your optimisation toolbox as your testing tool or analytics solution. Unfortunately, most people are not exploiting the tool’s full potential to validate ideas and learn about customers.

This tool can and should be used to maximise the potential learning gained from tests. Implementing variation tagging can help you understand why you achieved your 10% uplift, or why your test failed.

When should you use it?

I recommend experimenting with some recordings when a test begins. Studying these early users can help ensure that nothing has been missed in your QA process that could impact your results.

Collecting some additional recordings throughout the test can also be very helpful. Imagine, for example, that you have adjusted the prominence, location or style of some onsite functionality. Analytics can inform you about changes in interaction, but recordings let you see why and how people from your user base are using it, in near real time. This information can be used to help you better understand your users, not only for new tests but also for iterating on previous ones.

This is potentially most helpful on more innovative tests, where perceptions or behaviour are likely to be altered or challenged, but it can still help to identify sporadic QA issues or misconceptions on more iterative tests.

How do you do it (with HotJar)?

To gain this further insight, the tagging feature is essential. We need to use HotJar’s JavaScript tagging to label user recordings with the variant they have seen; we can then use HotJar’s filtering tool to find recordings with these labels during our analysis. Below I will show the recommended solutions for tagging Optimizely variations, and the solution for VWO that I have been working on.
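At its core, the tagging feature is a single JavaScript call: any labels passed to it are attached to the current visitor’s recording and can then be selected in HotJar’s recording filters. A minimal sketch, assuming HotJar’s tracking snippet (and therefore the global hj function) is already on the page, with a placeholder tag name:

```javascript
// Attach one or more labels to the current visitor's recording.
// The labels later appear as filter options against HotJar's recordings.
hj('tagRecording', ['MyTest27 Variation-1']);
```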

Optimizely

Below is HotJar’s published method for tagging in Optimizely:

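A minimal sketch of that per-variation snippet, assuming the hj function is available by the time the variation code runs (the test and variation names here are placeholders):

```javascript
// Paste into the code of one specific variation and change the label to match.
// The guard simply skips tagging if the HotJar snippet has not loaded.
if (window.hj) {
  window.hj('tagRecording', ['Homepage Test Variation 1']);
}
```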

This needs to be placed in the code for each variation and the variation name text replaced to suit your needs.

Optimizely have released a more automated, reusable and scalable solution:

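The exact published snippet may differ, but the idea can be sketched against the classic Optimizely client-side data object; activeExperiments, variationNamesMap and data.experiments are assumptions based on that classic API, so check them in your own console before relying on this:

```javascript
// Experiment JavaScript: tag the HotJar recording with every active
// Optimizely experiment and the variation this visitor was bucketed into.
var optly = window.optimizely || {};
var active = optly.activeExperiments || [];
var tags = [];

for (var i = 0; i < active.length; i++) {
  var expId = active[i];
  var expName = (optly.data && optly.data.experiments && optly.data.experiments[expId])
    ? optly.data.experiments[expId].name
    : expId;
  var variation = optly.variationNamesMap ? optly.variationNamesMap[expId] : '';
  tags.push(expName + ' ' + variation);
}

// Experiment JavaScript runs early, so the HotJar snippet may not have loaded
// yet; retry briefly rather than losing the tags.
var attempts = 0;
(function tagWhenReady() {
  if (window.hj && tags.length) {
    window.hj('tagRecording', tags);
  } else if (attempts++ < 20) {
    setTimeout(tagWhenReady, 500);
  }
})();
```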

This should go in the Experiment JavaScript. More details of this solution can be found here.

VWO

Both snippets below were written to be placed in the post-variation JavaScript; this allows them to collect the necessary test and variation names for the variants and the control. They will get the name of the test and of the variation (as given to them by you in VWO) and join them together into a tag (e.g. “MyTest27 Control”).

I have produced this solution for VWO; it should be used when only one experiment is running on the page:

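A sketch of the single-experiment version, assuming VWO’s classic client-side globals (_vwo_exp_ids listing the running campaign IDs, and _vwo_exp[id] exposing name, comb_n and combination_chosen); these property names are assumptions worth verifying in your own console first:

```javascript
// Post-variation JavaScript: tag the recording as "<test name> <variation name>",
// e.g. "MyTest27 Control". Assumes only one VWO experiment runs on this page.
var expId = window._vwo_exp_ids && window._vwo_exp_ids[0];
var exp = (expId && window._vwo_exp) ? window._vwo_exp[expId] : null;

if (exp && exp.combination_chosen && window.hj) {
  var variationName = exp.comb_n[exp.combination_chosen];
  window.hj('tagRecording', [exp.name + ' ' + variationName]);
}
```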

Here is a more complex solution for pages that may have multiple experiments on them, for example when testing on a responsive page that also has a test for mobile devices running:

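The multi-experiment version simply loops over every running campaign ID and only tags the campaigns the visitor has actually been bucketed into (same assumed _vwo_exp globals as above):

```javascript
// Post-variation JavaScript: build one tag per running VWO experiment the
// visitor is part of, then send them all to HotJar in a single call.
var ids = window._vwo_exp_ids || [];
var tags = [];

for (var i = 0; i < ids.length; i++) {
  var exp = window._vwo_exp ? window._vwo_exp[ids[i]] : null;
  // combination_chosen is only set once the visitor has been bucketed.
  if (exp && exp.combination_chosen) {
    tags.push(exp.name + ' ' + exp.comb_n[exp.combination_chosen]);
  }
}

if (window.hj && tags.length) {
  window.hj('tagRecording', tags);
}
```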

This will include tags for all running tests on that page that the user is currently included in.

If you are using another recording tool, these solutions should be relatively easy to alter. If the service offers a JavaScript tagging API, then the code above should be easy to amend to suit your technical needs.
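One way to make that swap painless is to route every tag through a single wrapper function, so only one line changes if you move away from HotJar (tagVisitorRecording is a hypothetical helper, not part of any tool’s API):

```javascript
// Hypothetical wrapper: have the testing-tool snippets call this instead of
// hj() directly, so switching recording tools means editing one function.
function tagVisitorRecording(tags) {
  if (window.hj) {
    window.hj('tagRecording', tags); // HotJar
  }
  // Swap in another tool's JavaScript tagging call here if needed.
}
```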

Conclusion

It can be difficult to solve problems that we don’t know about or don’t understand. Using visitor recordings, we can not only discover issues but also understand why they occur. As optimisers, we can then create hypotheses, solutions and tests to evaluate this learning. With the solutions outlined here, we can create a feedback loop that fuels continued optimisation efforts and customer learning.

If you have any questions about the JavaScript tagging code for VWO (or any others relating to visitor recording tools), leave them in the comments or find me on Twitter @nicksadla01