Usability testing is great when you know what to test. But what happens when you know you should test, but have no idea where to start? Look for small discrepancies, arguments, or opinions; anything your team is unsure about or in disagreement over. Once you tackle basic discussions, you can move on to statistically significant tests and start building a regular cadence.
1. Ask for Opinions
Asking for others’ opinions is a great way to get your feet wet and gain insight beyond your own, without putting in a lot of extra investment. The goal here is to validate your assumptions and mitigate any cognitive biases. Even just one person’s outside opinion is enough to start thinking outside your own brain.
To ask for opinions, just find someone you know or work with who isn’t on your immediate team. Team members share the same biases and assumptions you have. A developer on a different project, someone down the hall, or even your aunt can all be great test subjects. Bonus points if you can find someone in a coffee shop or someone not related to you.
“What do you think about this? Would you use this? Do you like how it looks?”
Pros:
- Gets your thought process outside of your head
- Validates decisions that you or your team has made
- Little to no time investment and no financial cost
Cons:
- Does not map to your customer base
- Not statistically significant
2. Ask for Clarifications
Use the data you already have to find questions. Quantitative data of any kind (analytics, heat maps, clickstreams, or conversion rates) can help you pinpoint potential issues. Is there a page in your analytics where people linger for a long time? Does a specific workflow almost always get abandoned? You’ve got the “what,” but you need the “why.” Take the “what” to your user and either ask them why it happened or have them run through it and see for yourself.
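As an illustration, a few lines of Python can turn raw step counts from an analytics export into the drop-off numbers that tell you where to ask “why.” The step names and counts below are invented for this sketch; your analytics tool’s export format will differ:

```python
# Hypothetical sketch: pinpointing where a workflow gets abandoned,
# using step-level event counts exported from an analytics tool.
# Step names and counts are made up for illustration.
funnel = [
    ("view_product", 1200),
    ("add_to_cart", 480),
    ("start_checkout", 410),
    ("complete_purchase", 95),
]

# Compare each step to the next to find the biggest drop-off.
for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    drop_off = 1 - next_count / count
    print(f"{step} -> {next_step}: {drop_off:.0%} drop-off")
```

Here the checkout-to-purchase step loses roughly three quarters of users, so that is the “what” you would take to a user and ask about.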
Clarifications work best with users, potential customers, or unrelated humans who match your user base’s demographics. You can still use Dave from down the hall, but your goal is to get as close to real users as possible.
“What is keeping you from making a purchase? Is there a part of [workflow] that is particularly challenging?”
Pros:
- Find meaning in the data you already have
- Still does not require a financial or substantial time investment
- Learn how people are using your system
- Find problems in current workflows
Cons:
- Users might not know the “why”
- You have to find the issues that need clarification
- Not statistically significant
3. Ask for Feature Viability
Out of each of these steps, feature viability is probably the one your company is already doing at a high level. Market research is a mature field, and there are many articles on best practices.
The goal here isn’t to look for a product dud; that decision has already been made. Before starting a massive agile epic or a new workflow, test it. Creating a simple prototype and getting it in front of users is a lot cheaper than discovering problems later. “An estimated 50% of engineering time is spent on doing rework that could have been avoided” — the goal here is to reduce that rework.
You have a lot of options for measuring feature viability. You could ask users directly, though that still gives you opinions rather than statistically significant data. Asking for feature viability also doesn’t have to mean asking outright. Start a beta group, offer exclusive previews, and then ask participants their thoughts. Set up analytics along the way and measure engagement. Create a survey, link it on the website, and ask for users’ opinions.
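Measuring engagement for a beta group can be as simple as a weekly-active percentage. This is a hypothetical sketch; the user IDs and event log below are invented, and in practice they would come from your analytics events:

```python
# Hypothetical sketch: beta-group engagement as a weekly-active rate.
# User IDs and the event log are invented for illustration.
from collections import Counter

beta_users = {"u1", "u2", "u3", "u4", "u5"}

# (user_id, iso_week) pairs pulled from analytics events
weekly_events = [
    ("u1", "2024-W01"), ("u1", "2024-W02"),
    ("u2", "2024-W01"),
    ("u3", "2024-W01"), ("u3", "2024-W02"),
]

# Count distinct beta users active in each week.
active_by_week = Counter(
    week for user, week in set(weekly_events) if user in beta_users
)
for week, active in sorted(active_by_week.items()):
    print(f"{week}: {active / len(beta_users):.0%} of beta users active")
```

A declining weekly-active rate is a far earlier (and cheaper) signal than shipping the feature to production and waiting.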
“Is this something you would use on a weekly basis? We’re rolling out a new beta program. Sign up if you’re interested!”
Pros:
- Validates feature viability before you have to invest developer hours
- Estimates usage and importance without waiting until the feature is in production
- Can still be done for free
Cons:
- Still opinion based
- Not statistically significant
- Always a chance the larger group might not feel the same way
- Sometimes the business might want to build it anyway
- Requires more time than the previous steps
4. Ask for Task Completion
Now we’re at the good stuff. Task completion gets you meaningful, comparable, statistically significant data. You gain quotes, stories and actionable insights from your users. Test with your actual user base, potential customers or someone as close to your user base as possible.
You’ll need a testing plan with scenarios or tasks for users to run through. Ask them to perform goals, not for their opinions. Get users to think aloud, and tell them it’s okay to get stuck. Your prototype or production app might be hard to use, and users may fail. And that’s great! If you’ve found failures, you’ve seen where people are getting stuck.
Test with around five users. At five users, you can find approximately 80% of your potential usability problems. If you’ve got the budget for 10 or 20, great; save those for iterations after your first test. The first test will find your problems, but your ultimate goal is to find solutions. If all you do is test the bad stuff (and not the new stuff meant to fix the bad stuff), you won’t know whether you’re fixing the problem.
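The “five users find about 80% of problems” figure comes from the problem-discovery model popularized by Nielsen and Landauer: with n test users, the expected share of problems found is 1 - (1 - L)^n, where L is the probability that a single user surfaces a given problem (commonly estimated at about 31%). A quick sketch of the curve:

```python
# Problem-discovery model (Nielsen & Landauer): expected share of
# usability problems found by n test users, with per-user detection
# probability L (commonly estimated at about 0.31).
def problems_found(n, L=0.31):
    return 1 - (1 - L) ** n

for n in (1, 3, 5, 10):
    print(f"{n:>2} users: {problems_found(n):.0%} of problems found")
```

With L = 0.31, five users land at roughly 84%, which is where the 80% rule of thumb comes from. The curve flattens quickly, which is also why extra participants are better spent on a second, iterated test than on padding out the first one.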
“Find a pink umbrella and add it to your cart. If you want to compare umbrella prices, where would you look?”
Pros:
- Statistically significant (if you have enough users)
- Can find 80%+ of usability problems
- Focuses on workflows and larger (more expensive) potential issues
Cons:
- More work than the other options
- You’re never going to get to a 100% completion rate
- Requires more company buy-in and possibly a monetary investment
- Potentially time-consuming
5. Always Keep Asking
You don’t need to get to task completion and formalized workflows to get iterative feedback. Even just asking for opinions outside of your team can help you avoid “blind spots” and overly complex scenarios. While keeping a regular cadence can help keep you on track, irregular feedback iterations still provide value. Get into the habit of asking people outside your own dev team for their thoughts. The process will soon become second nature.
Turn User Goals Into Scenarios for Usability Testing — From Nielsen Norman Group, one of the founders of usability testing. This article does a great job of talking through the various tasks and scenarios you should build out for a user test.
Usability Testing Examples — Learn by seeing. TestLodge walks through a few sample tests that are dead simple to set up.