
Why you should combine visitor recordings and user testing to understand user behaviour



A vital part of my research into how websites convert (or don’t) comes from watching visitor recordings (also known as session replays) and running user tests.

In the past I used only user testing to understand both usability and how motivated users were. Then affordable visitor recordings came along, and I would spend long stretches watching real users, who were clearly genuinely motivated because they had found the site themselves.

I began to think that visitor recordings could tell me everything I needed to know about user behaviour, and gave enough clues to usability issues, that I considered dropping the user tests. The recordings were missing the users speaking aloud, but after years of working in UX I thought I had watched enough to know what they were likely to be thinking.

However, I've continued to do both, and a recent project nicely highlighted the different information the two research methods can provide and the different roles they fulfil.

A detailed checkout step

I did some work for an ecommerce website where users needed to customise a technical product and specify several add-on options on the first page of the checkout flow. The research on this page threw up several useful findings.

For my research I watched two batches of visitor recordings: one of users who successfully got past this customisation page and one of users who got no further. I also ran remote user testing with 10 users (five on desktop and five on mobile) who were tasked with getting to the end of the checkout.

If I had only used one of these methods of evidence-gathering I would have missed an important piece of the behaviour puzzle.

The findings

The recordings of users who abandoned the flow showed that only about 25% typed in or clicked any of the fields on the page; most just looked around and left. Meanwhile, the recordings of users who completed the page showed that almost all of them filled in the fields with very few problems.

On the other hand, all of my user testers interacted with the fields, because they had been asked to complete the task, but they ran into several difficulties along the way: they missed fields, overlooked error messages, and found too little information about the add-ons they could choose from.

If I had only watched the two sets of visitor recordings, I might have concluded that users who weren't ready to buy simply left, and that when they were ready they breezed through. However, the many usability issues the testing identified suggested that what I was really seeing was a form of survivorship bias: the recordings of completers showed only those who made it, hiding most of the failed attempts.

Further watching showed that half of those who continued through the flow had accounts and had purchased from the site before. These users were far from the typical new users the site wanted to convert, so there was little to learn from them.

If I had only run user testing, I would have missed something else: I might have assumed that users work through a page until they hit an actual usability problem, but negative first impressions meant real users never gave the page that chance. The reality is that if a page just *looks* too daunting, most users will simply leave rather than attempt anything.

The recordings also showed that about 50% of users who abandoned dwelt on the coupon code field (which in this case was pre-filled with an offer code). In the user testing, only a couple of users acknowledged it at all.

The visitor recordings showed that price (and getting a discount) was a huge factor in how interested real users were in completing the form, something that naturally matters less to testers who have been tasked with completing it and aren't spending their own money. If you only run user tests, it is easy to underestimate how often price overrides everything else.

Capture reality and find improvements

Visitor recordings are great for understanding the reality of what users do on your website, and they show that most users won't have the patience of user testers. They also give clues to true motivations, such as price.

Yet as people's speed and confidence on the web increase, they will often either rattle through a page or simply leave if they don't fancy it. This means recordings can miss the details of why certain things are putting users off.

This is where user testing steps in to help you understand the nuance. User testing can also be carried out with a broader group than your existing users, letting you see how the site works for new users you want to attract, which is essential for any site's growth.

Last updated on 17 July 2019

