The evidence-based UX design guide

Page Data

What you can learn

In this guide 'page data' refers to the metrics describing user visits to any individual webpage or app screen. A few of these numbers are included in your conversion funnels (users, sessions, and the calculated conversion rate), but this data also includes metrics that give more detail than simply whether a user was there or not. There are some key ones to focus on, which I cover below (as ever, this guide uses web examples).

This data helps you learn how your pages are performing in relation to each other and can help you understand individual page performance in more depth. Once you’ve got your conversion funnel established, these numbers can form your secondary metrics: for example, if you make a change that doesn’t improve conversion but does lower bounce rate, you’ve measured a beneficial secondary effect.

They’re worth studying to build a better picture of user behaviour on your site, and they can help you shape research such as user testing. For example, if you find that a key information page has a high exit rate, make it a task in your next user test to try to understand why.

How to do it

For this I recommend installing Google Analytics, which is by far the most popular tool for tracking this kind of data. Once it's installed, here are a few key pieces of page performance data to consider:

Page views—how many times a page (defined as a URL loading) has been viewed. If this is a lot higher than your number of users, you'll know that users are viewing the page several times each. This suggests either that the content on the page is so great they keep coming back, or that they can't find where to go next.

Unique page views—the number of separate browsing sessions in which your page was viewed. A user may have visited the page several times within a session, but that still only records one unique page view. A session is reset after 30 minutes of inactivity, and each session represents a period of intent, a user trying to achieve something on your site. It shouldn’t be confused with individual users.

Average time on page—the average length of a page view. Whether you want this to be long or short depends on the type of page. If it's a long blog article or 'about us' page you'll be hoping for several minutes, whereas if it's a checkout page, you'll be wanting people to whizz through in seconds. If it's the other way around, then in the former case your content isn't holding users' interest, and in the latter they're getting stuck working out how to enter payment details.

Bounce rate—the percentage of sessions in which someone landed on this page and then left without visiting another page on your site. This is usually seen as a ‘bad’ metric that you want to reduce. If people are bouncing on a homepage or landing page, the page probably isn't enticing them to go further, and that's a strong negative sign that you should change something.

Exit rate—the percentage of sessions in which someone left your site at this page. Not to be confused with bounce rate, this is a bit more ambiguous, as the user could have visited several other pages before their exit and may have found everything they needed. After all, every journey has to end somewhere. Obviously you'd rather the exit rate was high on pages that appear after a goal (like a post-purchase page). If it's high on a critical page in your flow, it's worthy of further investigation (there's a small worked example of the bounce rate vs exit rate distinction after this list).

Event triggered—this is for tracking non-URL-based interactions and tells you whether or not an event has been triggered (such as a click on a button or page element). Whilst it is a binary metric, you can attach metadata to each event to give you more detail, such as the name and type of button if there are several on a page.
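As a concrete illustration, here's a minimal sketch of sending such an event via the standard analytics.js snippet, which exposes a global `ga()` command queue. The element ID and the category/action/label names are invented for the example:

```typescript
// Assumes the standard Google Analytics (analytics.js) tracking snippet is
// already on the page, exposing the global `ga` command queue.
declare function ga(...args: unknown[]): void;

// Send an event when a particular button is clicked. The selector and the
// 'checkout' / 'click' / 'pay-now-button' values are illustrative; pick a
// naming scheme that lets you tell multiple buttons on a page apart.
const payButton = document.querySelector('#pay-now');
payButton?.addEventListener('click', () => {
  ga('send', 'event', 'checkout', 'click', 'pay-now-button');
});
```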
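And to make the bounce rate vs exit rate distinction concrete, here's a small, purely illustrative calculation over five invented sessions, using the definitions above (bounce rate is measured against sessions that entered on the page; exit rate against all sessions that viewed it):

```typescript
// Five made-up sessions, each a list of pages viewed in order.
const sessions: string[][] = [
  ['/pricing'],                      // entered on /pricing and left straight away
  ['/pricing', '/signup'],           // entered on /pricing, carried on
  ['/home', '/pricing', '/signup'],  // viewed /pricing mid-journey
  ['/home', '/pricing'],             // journey ended on /pricing
  ['/blog', '/pricing'],             // journey ended on /pricing
];

const page = '/pricing';
const pageviews = sessions.filter(s => s.includes(page)).length;               // 5
const entrances = sessions.filter(s => s[0] === page).length;                  // 2
const bounces = sessions.filter(s => s.length === 1 && s[0] === page).length;  // 1
const exits = sessions.filter(s => s[s.length - 1] === page).length;           // 3

// Bounce rate: of the sessions that started on the page, how many went no further.
console.log(`Bounce rate: ${(100 * bounces / entrances).toFixed(0)}%`); // 50%
// Exit rate: of all sessions that viewed the page, how many ended there.
console.log(`Exit rate: ${(100 * exits / pageviews).toFixed(0)}%`);     // 60%
```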

Watch out for

It can be hard to find benchmark metrics for what represents a ‘good’ number of users or bounce rate, so take it with a pinch of salt when someone makes a blanket declaration that you should be targeting a certain figure. It’s more reliable to use this data to judge your pages in relation to each other, or against their own history over time. Use it to help you prioritise which parts need fixing over others and to spot outliers and problems.

Looking at the raw metrics is a fine starting point, but to get more actionable data you need to segment your results. Try segmenting by device or by traffic source to see if users behave differently depending on where they’ve come from and how they’re viewing the page.
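As a sketch of what that can look like, here's a hypothetical query against the Core Reporting API (v3) that breaks page metrics down by device. The view ID and access token are placeholders, and you'd swap the `ga:deviceCategory` dimension for `ga:sourceMedium` to segment by traffic source instead:

```typescript
// Placeholder credentials: in practice the view ID comes from your GA admin
// settings and the access token from an OAuth flow.
const VIEW_ID = 'ga:12345678';
const ACCESS_TOKEN = 'your-oauth-access-token';

const params = new URLSearchParams({
  ids: VIEW_ID,
  'start-date': '30daysAgo',
  'end-date': 'today',
  metrics: 'ga:pageviews,ga:bounceRate,ga:avgTimeOnPage',
  dimensions: 'ga:pagePath,ga:deviceCategory', // one row per page + device combination
  'max-results': '50',
});

fetch(`https://www.googleapis.com/analytics/v3/data/ga?${params}`, {
  headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
})
  .then(res => res.json())
  .then(report => console.table(report.rows));
```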

Be careful if you’re using regular expressions to track groups of pages via the Google Analytics API and then looking at the totals. Due to the way users and sessions are counted, the totals may contain duplication: someone who visited several of the matched pages can appear in the count for each of them, so they may be counted as more than one user.
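For example, a regex filter like the one below groups every blog page into one report; summing `ga:users` across the rows it returns will overstate your true number of unique users for exactly that reason (again, the view ID is a placeholder):

```typescript
// `=~` is the Core Reporting API's regex-match operator for filters.
const params = new URLSearchParams({
  ids: 'ga:12345678',                 // placeholder view ID
  'start-date': '30daysAgo',
  'end-date': 'today',
  metrics: 'ga:users,ga:sessions,ga:pageviews',
  dimensions: 'ga:pagePath',
  filters: 'ga:pagePath=~^/blog/',    // every page whose path starts with /blog/
});
// Send these params to the same /data/ga endpoint as in the previous sketch;
// note that adding up the per-page user counts double-counts multi-page visitors.
```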

As ever, my standard disclaimer about quantitative data applies: it doesn’t tell you ‘why’ something is occurring. If you notice an anomaly, such as one page having a very high average time on page, investigate further with qualitative evidence rather than making assumptions about what is causing it.

Example tools (and cost)

As mentioned above, I'd recommend Google Analytics (free) for gathering this data; it's hugely popular around the world, so you'll be able to compare your stats across different sites and find plenty of help guides. If you're tracking something that isn't based on page views, such as a native app, then a lot of the above metrics won’t apply.

How long does it take?

Once your tracking is set up, checking the data for a page takes only minutes. If you regularly track the same few stats, I recommend pulling them into a dashboard.

How often should you use it?

Often / Sometimes / Rarely


Last updated on 5 December 2016


Note: the examples in this guide are for website design, but most of the content is also applicable for native apps and software.
