Eight tips for creating great remote user tests on usertesting.com



I've used usertesting.com a lot as a platform for my user testing (funnily enough) over the last few years, particularly when I'm working on improving the UX of websites. I love how quick it is to get results and how you get honest feedback from users in their real environment. Because it's so quick, it can be tempting to rush through setting up tests, but to get good results it needs a bit more care than that.

It's also a bit different to testing in person, as you obviously aren't there to correct issues that may arise. So here are my top tips for getting the most out of usertesting.com, based on over four years' experience using it. I'll be using website tests as examples as that is where most of my experience lies, but you can also test apps and prototypes.

1. Just test with five… per platform

The rule of testing with only five users is a solid one that has always stood me in good stead. However, if you are testing a modern website you need to consider the different devices it can be viewed on, as each creates a vastly different experience. For the average site I suggest testing with five users on desktop and five on mobile. Splitting a single set of five tests between desktop and mobile means you probably aren't going to get enough results for either.

I don't tend to test on tablets very often as they are becoming quite a small chunk of the user base, and between desktop and mobile most of the issues that affect them should be picked up. However, do check your stats to see if tablet users represent a big section of your audience. If it's greater than about 15%, it could be worth running some tests for them.
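If you want a quick way to check that threshold from an analytics export, here's a minimal sketch in Python. It assumes a hypothetical device_report.csv with "device" and "sessions" columns; the exact export format will depend on your analytics tool:

import csv

# Minimal sketch: work out what share of sessions come from tablets,
# assuming a CSV export with "device" and "sessions" columns.
def tablet_share(path):
    totals = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            device = row["device"].strip().lower()
            totals[device] = totals.get(device, 0) + int(row["sessions"])
    all_sessions = sum(totals.values())
    return totals.get("tablet", 0) / all_sessions if all_sessions else 0.0

share = tablet_share("device_report.csv")
print(f"Tablet share: {share:.1%}")
if share > 0.15:
    print("Worth running a round of tablet tests.")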

2. The right people for the job

You should make sure the users you test with are like the actual users of your site. The demographic selection on usertesting.com is quite lightweight, but conveniently the parameters they offer for choosing your audience line up nicely with the data available in Google Analytics, so you can test with people similar to your users. Gender, age, device and location can all be found in the audience tab.

To get more specific in finding similar users, you need to make use of the screener questions (available to premium accounts). You can ask your users multiple-choice questions and only allow those who answer the way you want to take your test. This is helpful for getting users who are interested in your product. Don't get carried away here though, as it's the usability you're mainly testing, not so much whether they would actually use the website's service.

3. Go step by step

It can be tempting to just ask users to do a single task on your site (e.g. book a holiday, buy a product) and then set your test going. The danger is that this results in user tests that only last five minutes, with the user rushing to the end rather than taking a realistic approach to how they would actually use the site if they were going to purchase. Whilst you only want to cover one user flow per test, that doesn't mean you just need one task.

I generally aim for a task to cover each of the main pieces of functionality on the site, so one task might be 'compare the results available and choose one, explaining why you chose it'. Usually 6-8 tasks on a fully functioning site are enough to get 15-20 minute testing videos back.

4. Be a clear taskmaster

Be careful and clear in how you write your tasks, as this is the main interface you have with your users. You're not actually there to clarify your points, so the words have to do the work. Try to summarise each task in short, snappy sentences and avoid ambiguity. In the past I've made them a touch too wordy, and that can instantly throw people as they get confused about what is being asked of them.

It's worth quickly testing your tasks by reading them aloud or getting someone else to read them. Also, if a task relies on the user being on a certain page and you want to be sure they are there, you can always put the relevant link in the task.

5. Test iteratively

Whilst the usertesting.com dashboard is designed to encourage you to group your users and order several tests at once, I find it works better to start with one user to check that it all runs smoothly before running it with another one or two. You don't want to be wildly changing the test between users, as the results won't be meaningful. However, it does give you a chance to tweak any confusing wording that may have crept in, or spot any technical issues with the site that are blocking users from completing their tasks.

6. Post-test questions

The platform gives you the ability to ask your users four questions after they've completed the test. This is a handy place for gathering a bit of extra user research, especially if you've screened your users to be like your own customers. By default it gives you questions around the user's likes and dislikes, but these often duplicate what you should have been able to tell from watching the test.

Instead, try asking questions that can help inform broader research, like 'what features are most important to you when buying a product like this?' or 'what other competitor services have you used?'

7. Have an analysis process

It's important to have a process for analysing your tests. Don't just watch them and try to remember the things you need to change. There are a few handy tools in the usertesting.com arsenal to help here, in the form of notes and clips.

The first time I watch each video I use the annotation feature to make timestamped notes relevant to each point in the video. Once I've watched them all I make a second pass where I note common issues and observations in a Google doc that acts as my report. Once I know what these issues are, I make a clip to illustrate each one, which I link to from my report.

8. Note dispassionately

When it comes to actually writing your notes, it's tempting to put solutions in there, e.g. 'a bigger button would solve this'. However, that's getting ahead of yourself and jumping to the first answer that comes to mind. Just stick to highlighting what happened and what you observed. Tackle solutions when you come to design improvements.

It's also easy to focus only on the broken things or issues with the product, but cover the positives as well. You want to remember what users liked so you don't remove the bits that are working.

Last updated on 24 February 2016

ux / product / web design / app design / user testing / design / users / usability / interfaces / web / remote /

