I'm going to tell you why I no longer describe myself as data-driven and why it's not the best way to approach design. This might surprise anyone who has read my stuff before, because a) for over a year I've described myself as a data-driven designer, b) I teach a workshop on data-driven design, and c) I wrote a book on it (read to the end to find out what happens to it). I used the term in the absence of anything else that described how I work compared to other designers. However, I've always been uneasy with it, and recently I've concluded that it's not actually helpful.
Here's why I don't think the phrase works:
I mainly work with startups, and the reality for them is that gathering bullet-proof 'data' (particularly quant) is hard and time-consuming, and in the end it may not tell us what we want to know. The design process needs to be much quicker and more pragmatic than at big companies, which can implement large-scale data gathering. What we're really looking for is proof that we have a problem and proof that we've got a good solution: enough to get something out the door that improves on what was there before and keeps up the pace of iterative growth.
What we want is evidence-based UX design.
There are lots of ways to gather evidence. Just as in a court case, there is strong evidence and weak evidence. Not all evidence is equal, but the more sources you gather, the stronger your case tends to be.
Quant data is often one of the most powerful forms of evidence you have, but if you're a small company without much traffic, that data will be thin. You can make up for it by gathering qualitative data, and even things you wouldn't normally think of as data (client interviews, customer feedback). Sometimes all it takes is one complaint to know you have a problem. That isn't really being data-driven, but it is being evidence-based.
This is why I'm putting together The Evidence-Based UX Design Guide. It will explain all the methods you can use to gather evidence for your design research. It will also rank them by how useful they are and how often you should use them in your design process.
A typical process of mine would touch on 9 or 10 methods. I'll start by talking to the client to get their knowledge; I'll use my experience to assess their site in an expert audit; I'll study live chat transcripts and customer feedback to find pain points; I'll look at audience data and conversion quant data; I'll probably run a user test to see why there are problems; I'll assess competitor sites to get inspiration; as I design I'll get feedback from potential users on the work-in-progress; I'll possibly run quick preference tests or surveys; then when the project launches I'll user test again. And depending on what tools are available there might be more.
Using this process with clients is how we know, as we go along, that we're designing something strong: like a legal team building a case with as much evidence as possible to back up the argument.
There are still plenty of designers who sit in a vacuum with just a brief and Sketch open, hoping to come up with some magic solution. Or startup founders who think they have a problem and code up the first solution that comes to mind. This is designing based on assumption and guesswork.
Ultimately, being evidence-based is about avoiding guesswork, which is the message I promoted in my data-driven UX design book. The book's content is still relevant; I think it's just time for an update and a new title... (if you want to download the current version in the meantime, here's a link for 30% off).
Some further articles that have helped my thinking on the data/evidence challenge:
Where I explain why A/B testing may not be right for your website. It's probably better to focus your efforts elsewhere.
A step-by-step guide to my process and the tools I use at different stages when running evidence-based UX design projects for clients.
Where I share my top advice for doing effective user testing on the platform usertesting.com, including smart recruiting and getting good results when you can't meet the user in person.