The Evidence-Based UX Design Guide

Net Promoter Score

What you can learn

Net Promoter Score (NPS) is a popular method for measuring customer satisfaction with your products or services and is used across many industries, allowing for comparison between very different businesses. It is a one-question survey (although sometimes combined with other questions) asking 'How likely is it that you would recommend [brand] to a friend or colleague?'. The user is given a scale of 0-10 to answer. Those answering 9 or 10 are considered promoters, those answering 7 or 8 are counted as neutral (often called 'passives'), and those answering 0-6 are detractors. The percentage of detractors is subtracted from the percentage of promoters to give a score between -100 (bad) and 100 (good).
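As a worked example, here is a minimal sketch of that calculation in Python (the function name and sample responses are illustrative, not from any particular tool):

```python
# Minimal sketch of the NPS calculation described above.
# `responses` is assumed to be a list of integer answers (0-10).

def net_promoter_score(responses):
    """Return NPS as a whole number between -100 and 100."""
    total = len(responses)
    if total == 0:
        raise ValueError("no responses to score")
    promoters = sum(1 for r in responses if r >= 9)    # answered 9 or 10
    detractors = sum(1 for r in responses if r <= 6)   # answered 0 to 6
    # 7s and 8s count towards the total but towards neither group.
    return round(100 * (promoters - detractors) / total)

# 3 promoters and 2 detractors out of 7 responses -> NPS of 14
print(net_promoter_score([10, 9, 8, 7, 6, 3, 10]))
```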

This simple score is seen as a good metric for understanding whether your customers like your service because they are being asked whether they would put their neck on the line and promote you to their nearest and dearest. It's a pretty good indication of how you compare to competitors (if you can get that data) and tracked over time it can tell you whether you're improving or things are getting worse for your customers.

It does have its flaws (covered below), the main one being that it is just a single metric measuring quite a specific question. On its own it doesn't tell you much, but like most quantitative metrics it can give cause to investigate further if things change. At the very least there should be an extra field asking why the user gave that score, so they can give a bit more detail.

How to do it

There are a few ways that companies tend to ask this question of their users; in my experience the main ones are emailing the question out or embedding it in the site with a survey plug-in (see the example tools below).

When using NPS data to inform my design process I would keep track of the scores in a spreadsheet, with a column for the written feedback to go alongside them (when the feedback covered the digital products I was working on). I would pay more attention to this written feedback than to the overall score.

Individual pieces of feedback aren't much use, so as the spreadsheet grew I would create another column where I categorised the users' feedback; for example 'search filters', 'product images', 'sign up'. I'd then keep count of how often these issues appeared and could focus attention on those that caused the most problems. If they highlighted issues I'd already seen in user testing then this gave an idea of scale, so every now and then it was worth referring back to.
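Once the spreadsheet gets large the counting can be automated. A rough sketch, assuming the sheet has been exported to CSV with a manually filled 'category' column (the file and column names here are hypothetical):

```python
# Tally how often each feedback category appears in the exported spreadsheet.
import csv
from collections import Counter

category_counts = Counter()

with open("nps_feedback.csv", newline="") as f:     # hypothetical export
    for row in csv.DictReader(f):
        category = (row.get("category") or "").strip().lower()
        if category:                                # skip uncategorised rows
            category_counts[category] += 1

# Most frequently mentioned issues first, e.g. 'search filters', 'sign up'
for category, count in category_counts.most_common():
    print(f"{category}: {count}")
```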

Watch out for

On its own it's a very broad metric: what exactly is it that people would recommend? It could be anything from the app to the customer service. Different users will have had very different journeys from each other, so it's worth knowing who the question was asked of, as that will skew the results.

Even if the written feedback is more useful than the score, just because someone has written about not liking a part of the site doesn't mean they couldn't use it. They may have just felt they had to write *something*. Without user testing video of them actually using the site it's hard to know how problematic it was and what to fix.

A single piece of NPS data without a baseline score to put it in context is a bit meaningless. Track it for a few months so you can see what represents a good or bad score for your company.
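A rough sketch of what that tracking might look like, assuming you can export (date, score) pairs from whatever survey tool you use (the data below is made up):

```python
# Group responses by month and calculate NPS for each, to build a baseline.
from collections import defaultdict
from datetime import date

responses = [
    (date(2017, 1, 5), 9), (date(2017, 1, 20), 4),
    (date(2017, 2, 3), 10), (date(2017, 2, 14), 8), (date(2017, 2, 28), 10),
]

by_month = defaultdict(list)
for day, score in responses:
    by_month[(day.year, day.month)].append(score)

for (year, month), scores in sorted(by_month.items()):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    nps = round(100 * (promoters - detractors) / len(scores))
    print(f"{year}-{month:02d}: NPS {nps} ({len(scores)} responses)")
```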

Does the score mean the same thing in different countries? 8/10 might be very positive for some people but it's counted as 'neutral' in NPS terms.

Example tools (and cost)

You can just use your normal email provider to send out the question to users, or use Google Forms (free), Wufoo (free & from £12/mo), or SurveyMonkey (free & from £26/mo) to do something more comprehensive.

There are purpose-made plug-ins such as Delighted (free and paid) or Promoter.io (from $50/mo).

How long does it take?

Setting up the method of gathering will probably take about half a day. Checking the results for relevant feedback should be a job that takes about 30 minutes per week.

How often should you use it?

Often / Sometimes / Rarely

Last updated on 1 March 2017


Note: the examples in this guide are for website design, but most of the content is also applicable for native apps and software.