A guide to my full data-driven UX design process
I’ve written a lot on this website about the techniques I use to be data-driven in my UX design projects. Whilst those articles focus on specific aspects, I haven’t written about my full process, and my data-driven book covers quantitative but not qualitative data. So in this step-by-step guide I’ll explain my approach and tools for gathering data to improve my designs, and link to specific articles for more detail.
I don’t always get to do every step outlined here, but this represents an ideal project. Often it’s a case of picking and mixing the different elements depending on the time and budget available. I’ve used a website as the example, but this process works just as well for designing mobile apps. I try to go for lean methods that don’t get bogged down in excessive deliverables, and everything here can be done on a small budget.
In most projects I do with a client I will aim to get a decent period of research in up front. This enables me to get up to speed with their product and understand their users, which is vital before I can make informed suggestions about how to improve things. Most of the data gathering goes in at the start of a project, as I need to learn early so I can implement it in my design work.
The first thing to get my head around is how the website currently works, and often the quickest way to do this is to speak to the client. This is certainly a form of data gathering. Unfortunately it’s often the only data point many people use, and it’s a dangerous one, as clients can be quite biased.
However the client will be great at explaining what they hope users can achieve on the site (the goals) and all the different features it has, so it’s a quick way of getting up to speed. Get them to tell you what the KPI is for this project you’re working on.
If you're remote this can be achieved through an hour-long Skype call. It can also be handy if the client writes things up in a shared doc, but it’s always worth having some time for them to explain things. I then sketch out the key user flows on paper, or will sometimes put them together on screen, to get their confirmation that I’ve heard them correctly.
Once I have an understanding of how the site works, it’s then time to get to grips with some data to see how users are actually behaving on there. Every company I’ve worked with has had at least Google Analytics (GA) installed, so I use this to drill into the performance of the pages in the user flows and work out conversion rates.
GA offers three types of traffic metric by default: users, sessions and pageviews. You can also set events manually on specific pages or interactions.
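As a toy illustration of the arithmetic involved, here is a short Python sketch that computes an overall conversion rate from session counts per page in a flow. The page paths and figures are invented for the example, not taken from any real GA export:

```python
# Hypothetical session counts exported from GA for each page in a
# checkout flow. Conversion rate = sessions reaching the goal page
# divided by sessions entering the flow.
flow_sessions = {
    "/basket": 4200,
    "/checkout": 2600,
    "/payment": 1900,
    "/confirmation": 1450,  # the goal page
}

def conversion_rate(flow, entry, goal):
    """Overall conversion rate from the entry page to the goal page."""
    return flow[goal] / flow[entry]

rate = conversion_rate(flow_sessions, "/basket", "/confirmation")
print(f"Basket-to-confirmation conversion: {rate:.1%}")
```

In practice GA gives you these numbers per page report by report; the point is simply that a conversion rate is always a ratio of two counts, so you need to be clear which two you are dividing.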
If I want a clearer idea of how the site is performing as I design it, I might build a funnel in GA. This may only be suitable for a longer project because it will take time to gather data. Or I might install Mixpanel for this job, as it is better at tracking individual users. As I explain in chapter 2.2 of my book:
The Google Analytics interface gives you a visualisation of a funnel and shows you how many people move forward and how many drop out. Due to the fact that it’s based around sessions, the GA funnel isn’t the most accurate at telling you how your users are moving through your site (Mixpanel is the one for that) but if you want an aggregate overview or can’t install Mixpanel for any reason, GA can do a job.
Mixpanel offers a lot more customisation around its funnels: you can slice and dice them by lots of different properties and it will accurately represent what is happening at a user level. Setting up funnels in Mixpanel is very simple: you just need to have some events set up.
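Both tools do the same underlying per-step maths for you. This is a minimal sketch of it, with made-up event counts (unique users completing each step), just to show what the funnel visualisation is reporting:

```python
# Invented funnel data: (step name, unique users completing the step).
funnel = [
    ("Viewed product", 10000),
    ("Added to basket", 3200),
    ("Started checkout", 1800),
    ("Purchased", 900),
]

def step_conversions(steps):
    """Return (step name, users, share of the previous step) per step."""
    out = []
    prev = None
    for name, users in steps:
        pct = 1.0 if prev is None else users / prev
        out.append((name, users, pct))
        prev = users
    return out

for name, users, pct in step_conversions(funnel):
    print(f"{name:18} {users:6} ({pct:.0%} of previous step)")
```

The step with the biggest drop relative to the previous one is usually where the design work should focus.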
Now that I know how the site is performing, I can also use Google Analytics to understand who the users are. It’s rare to get a project where there’s time and budget to do a piece of user research and interviewing up front. As a result I need a technique that will enable me to learn about the users without actually meeting them.
For this I delve into the Google Analytics user data. It’s never going to give the insight of interviewing but it is real data about the site’s users and gives a lot of information about their identity. I cover how I create these outline personas in chapter 3 of my book and in this article.
I’m a fan of keeping personas minimal and not filling them with lots of irrelevant stuff: keep it focused on things that will help you design. The kind of data I’m looking for from Google Analytics is age, gender, device, location, source of traffic, new vs returning, and whether they convert or not.
One of the key things to understand at this point is the split of users by device, i.e. how many mobile, tablet, and desktop users there are. This comes in handy when deciding which devices to user test on.
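The device split is just each device’s share of total users, which GA reports directly; this sketch with invented audience numbers shows the calculation:

```python
# Made-up GA audience figures: users by device category.
users_by_device = {"mobile": 6400, "desktop": 3100, "tablet": 500}

total = sum(users_by_device.values())
device_split = {device: n / total for device, n in users_by_device.items()}

# Print largest share first, the order you'd prioritise test devices in.
for device, share in sorted(device_split.items(), key=lambda kv: -kv[1]):
    print(f"{device}: {share:.0%}")
```

With a split like this one, mobile would clearly be the priority device for user testing.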
Copying the competition might sound obvious, but a decent critical analysis of similar sites is a useful point of data to gather. I find looking at five sites is enough to get a sense of what users expect. I’ve written about it in this article on competitor analysis:
Along with quantitative metrics and qualitative user tests, competitor inspiration is another valuable source of data. At its heart you are looking for things that work and patterns that users will already understand. Only copying one site is like only having one data point: liable to lead you completely astray.
The final chunk of up-front work I like to do is to run a user test. This gives me data to understand why users behave the way they do and why they are—or aren’t—converting. It also acts as a benchmark of how the website performs before I start work on it, and helps to prioritise which pages need the most work.
In-person testing can be seen as a bit excessive when testing the current website (as this isn’t the one the client is interested in) so I need to use quick methods. Remote testing offers that speed as well as a way to record the events. There are two remote methods I will use depending on the project.
The first is a moderated test, where I facilitate the session with users in a different location. There’s no fancy equipment needed, and it’s great for projects where the product is internal or for a specialist audience. I’ve written about how I do it with Skype here.
Once you’ve started the video call, ask them to share their screen with you. You’ll then handily get the view of their screen so you can see how they use your website alongside a small shot of their head, which is handy for checking their reaction to different things (helping you judge if they are happy or confused).
The most common type of user test I run is an unmoderated test using usertesting.com. This is great for getting the real reaction of users as they browse around the web, and is ideal for customer-facing sites. I also love the usertesting platform as it records everything and provides videos that can easily be clipped for sharing. I’ve had a lot of practice with usertesting and have written here about how to get great results when setting up your tasks:
Be careful and clear in how you write your tasks as this is the main interface you have with your users. You’re not actually there to clarify your points so the words have to do the work. Try and summarise each task in short, snappy sentences and try to avoid ambiguity. In the past I’ve made them a touch too wordy and that can instantly throw people as they get confused with what is being asked of them.
At this point in the process it’s mostly about wireframing or prototyping. As I’m spending most of my time on this there’s less data-gathering going on, but there are a few things that can be done to support the design work.
Once I understand how the site works and know the important user flows, part of the process can be designing the client a dashboard showing the performance of their site. The aim is to leave them with a way of tracking the important metrics, and it’s worth doing during the design phase so it reflects any changes in the way the site is designed.
Taking the effort to build a dashboard is one of those things that involves a little bit of upfront work but after that you’re set with an at-a-glance view that you can share with your team. Going to the source and using the API like this unlocks the real power of Google Analytics.
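The shape of such a dashboard can be sketched in a few lines. Here `get_metrics` is a stand-in that returns hard-coded, invented figures; in a real build those numbers would come from the GA Reporting API, but hard-coding them keeps the formatting logic runnable on its own:

```python
def get_metrics():
    # Placeholder for a real data pull (e.g. from the GA Reporting API).
    # These figures are invented for the example.
    return {"Users": 12400, "Sessions": 15800, "Conversion rate": 0.031}

def render_dashboard(metrics):
    """Format the key metrics as a simple at-a-glance text summary."""
    lines = ["Site performance (last 7 days)", "-" * 30]
    for name, value in metrics.items():
        shown = f"{value:.1%}" if isinstance(value, float) else f"{value:,}"
        lines.append(f"{name:18} {shown:>8}")
    return "\n".join(lines)

print(render_dashboard(get_metrics()))
```

The real version might render to a spreadsheet or a charting tool rather than text, but the principle is the same: a small, fixed set of metrics, refreshed automatically, shareable at a glance.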
During the design phase it’s possible I’ll come across design issues that cause debate between me and the client. UsabilityHub is a site that offers a way to solve these potential arguments. It allows you to run quick tests with just JPGs or flat designs, so it’s perfect for testing work in progress.
This article contains my advice on how to do design tests, based on studying a bunch of examples. There are a few best practices to ensure good results:
Whilst the preference tests are pretty self-explanatory, the success of a good click test or five second test relies on a well-written setup/instruction before the design is shown and questions presented to the user. Despite being quite simple, getting these bits wrong can really skew your results.
As I get closer to finishing the design work and creating a prototype, I’ll want to user test it early and tweak it rather than waiting for it to be coded. For this I need a quick method that I can run with an incomplete project. I find it’s best to do this in-person as you can explain the context and talk around any unfinished areas.
Thus the best method to use here is a guerrilla test, and I’ve written up my full one-person method here:
Ideally you’d have two people to carry out a test, one to facilitate and one to take notes, but that isn’t always possible. So this method works for when you need some user feedback on your product and you’re a one-man band, or the only person prepared to do it. It works best for testing on mobiles, as you can do it anywhere.
Once the project is finished and the website is coded up, it’s time to see if what I’ve designed is offering any improvements. There are a few things worth doing at this stage and if you want a fair comparison, you should carry them out in a similar way to previous steps.
The most important of these is to do more user testing, as it gives me the quickest sense of whether what I have worked on has improved on the initial user test. It’s also the method with the quickest turnaround on results, so we can quickly tweak anything that hasn’t worked.
See above for my thoughts on user testing methods, but generally I’ll be using usertesting.com at this point and following the tips I outlined here.
Of course if I’ve set up a funnel or a dashboard, post-project I’ll be checking in on the key metrics here, usually just once a week. I explain my thinking on this in the book:
I like to split my data by date, in particular by week, to see how the performance is changing over time. Weekly measures are in the Goldilocks zone: a day is too short to learn anything meaningful as there are often big fluctuations by day of the week, a month is too long and you’re leaving it too late to solve any problems, but a week is just right.
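Bucketing daily figures into weeks is trivial to do yourself if you export the data. A sketch with invented daily conversion counts, using ISO week numbers so weeks always run Monday to Sunday:

```python
from datetime import date

# Invented daily conversion counts (in practice exported from GA/Mixpanel).
daily = {
    date(2023, 5, 1): 31, date(2023, 5, 2): 28, date(2023, 5, 3): 35,
    date(2023, 5, 8): 40, date(2023, 5, 9): 44, date(2023, 5, 10): 39,
}

def by_week(daily_counts):
    """Sum daily figures into (ISO year, ISO week number) buckets."""
    weeks = {}
    for day, count in daily_counts.items():
        key = tuple(day.isocalendar())[:2]  # (ISO year, ISO week)
        weeks[key] = weeks.get(key, 0) + count
    return weeks

for (year, week), total in sorted(by_week(daily).items()):
    print(f"{year}-W{week:02}: {total}")
```

Keying on the ISO year as well as the week number avoids week 1 of January being merged with week 1 of the previous year.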
Finally, we could A/B test the new design against the old. It’s something I used to do when working in-house, but I do a lot less of it now that I work with multiple clients: doing a meaningful test on most sites’ traffic means running it for a long time, and in my experience most A/B tests reveal very little difference between the options.
I have written a basic guide to Optimizely for A/B testing, but the more I’ve learned about it and read the latest thinking on the subject, the more I think such tests should only be run by data scientists. There’s just too much room for the average user to misinterpret the results.
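To illustrate why small sites struggle here, this is a sketch of the basic statistic behind an A/B readout, a two-proportion z-test, with invented conversion counts. A plausible-looking lift on a few thousand visitors per variant often fails to reach significance:

```python
from math import sqrt, erf

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing two conversion proportions."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented example: 3.0% vs 3.6% conversion on 4,000 visitors each.
z, p = ab_z_test(120, 4000, 145, 4000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Here a 20% relative lift still gives a p-value well above the usual 0.05 threshold, which is exactly the trap: without enough traffic, or a grasp of the statistics, it’s easy to ship a “winner” that the data doesn’t actually support.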