The web, and everything within it, is a fast-paced entity. It’s constantly changing and adapting, and so are its users.
Building for the web is more akin to product design than traditional graphic design. So while we like our website to sparkle (who doesn’t?), we need to make sure it actually works for the people using it.
If it looks pretty, you will impress and soothe people for the first two minutes. However, if a user can’t work your website, they will soon get frustrated when they can’t complete the task they came to do, be it something transactional or as simple as consuming content.
As well-versed designers and builders of the web, we can sit across the table and say “in our experience of how users will use [insert thing here], we would suggest doing the following”, and customers who know their user base can sit at the other end commenting “well, our experience of our users is that they will actually do X, Y and Z”. What trumps all of that is hard and fast data, which tells us EXACTLY what users are doing.
A large part of the Spacecraft design process is the discovery phase, and a large part of the discovery phase is looking at the stats; the data, the good stuff. Using analytics tools such as Google Analytics, we can see which pages are popular, where pages are being clicked on, and which devices and browsers are being used.
Put simply, by knowing how to interpret this data we can use it to feed into the design decisions we make on that project. What are the most-used pages? Do they have a presence on the homepage? Which pages have a high bounce rate? How much content do they have, and are they task focused? What’s getting zero clicks on the homepage? Does it need to be there? What devices are people using? These are the questions we’re asking, and so should you.
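The sort of triage described above can be sketched in a few lines of code. This is a minimal, hypothetical example — the page names, view counts and thresholds are invented, and in practice the numbers would come from a Google Analytics export rather than a hard-coded list:

```python
# Hypothetical page stats, standing in for an analytics export.
pages = [
    {"path": "/", "views": 12000, "bounce_rate": 0.35},
    {"path": "/contact", "views": 4300, "bounce_rate": 0.22},
    {"path": "/news/archive", "views": 150, "bounce_rate": 0.81},
    {"path": "/services", "views": 6100, "bounce_rate": 0.40},
]

# Which pages do people actually use most?
top_pages = sorted(pages, key=lambda p: p["views"], reverse=True)[:3]

# Which pages lose people straight away? (0.6 is an arbitrary threshold)
high_bounce = [p for p in pages if p["bounce_rate"] > 0.6]

print([p["path"] for p in top_pages])   # the pages that deserve homepage presence
print([p["path"] for p in high_bounce]) # the pages worth a content rethink
```

The point isn’t the code itself but the habit: turn each question (“what’s most used?”, “what’s losing people?”) into something you can actually measure.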
We shouldn’t stop there, though. If we can use analytics to pinpoint where a current site is failing, work out what we need to do to remedy that, and then implement a solution, we should always remember to test what we have done. Launch the feature, give it a while, and look at the stats again. Are users doing what you anticipated they would? To a greater extent than before? How could you build on what you have learnt? More questions; valid questions.
Qualitative over quantitative
While Google Analytics is a great example of quantitative data (it gives us raw statistical information), qualitative data can be harder to obtain but is a lot richer. It’s data that goes beyond the % sign and into the unknown. More simply, it’s speaking to the people who will use the site you’re creating and conducting some user testing with them.
Stats and figures are great at giving us an overview of exactly how many people are doing what, but what they can’t tell us is how people feel when they are doing the task: whether an interaction made them think twice, or how reassured they felt on their journey through the site. User testing can be anything from standing in a reception (preferably the reception of the building for the site you’re working on, obviously) and asking people to have a go on the site with an iPad, to filming people in controlled conditions, giving them set tasks and asking them to describe what they are doing or attempting to do.
This data is a lot harder to turn into a decision, but it ensures that a site keeps its human quality, as it is humans, after all, who will be using it.
Addressing the masses
While it can be daunting to step out towards the users and see what’s what, it works, and it leads to better sites and products. The recent Jadu Academy in London shows how strongly we believe in this methodology, as a large part of the event was asking our users what they wanted to talk about and what we needed to improve on. While it’s not always easy to hear you’re not doing exactly the right thing all of the time, there is so much value in it. You can learn things which will help you think of new features, or even just know to pick up the phone more.
Originally posted on the Jadu blog here: www.jadu.net/blog/TheJaduBlog/post/83/data-driven-decision-making