User testing, usability testing—whichever term you’re more familiar with, both serve the same purpose: testing your product with real users before you go to market.
Today, we’re going to walk you through the typical process we use at MindSea for conducting user testing on behalf of our clients and their mobile apps.
Our blueprint process includes a streamlined approach to user testing that ensures consistency and insightful results.
The original plan was to only share this resource with our clients, but we figured others might find it valuable too.
Let’s get right to it.
So what is user testing and why is it important?
User testing is a standard part of the design process across the tech industry. The main goals of user testing are to uncover any usability issues with the navigation and core features of the app, and to test any assumptions we have about the product.
Here are some examples of assumptions you may want to address before getting started:
The things we assume about our audience and our product aren’t always true. Often our assumptions are based on personal preferences and gut instinct. That won’t cut it long term, which is why we need qualitative data to back up our claims.
Follow these five steps to conduct your user testing session, from prep to results:
For effective user testing, you need participants who possess the characteristics of your early adopter and early majority personas. But where do you find these people and how do you get them to agree to test your prototype?
We suggest sourcing your testers from wherever you can find them. Sometimes that means reaching out to your personal network—family, friends, friends of friends, family friends—you get where we’re going here! Send out an email or post on social media making the request.
A lot of people may help you for the chance to test out a new product, but some others may need a bit of a nudge. That’s why we like to offer an incentive.
Just a small one: nothing crazy. We suggest offering Amazon, iTunes, Tim Hortons, or Visa gift cards in exchange for user testing your prototype.
The earlier you start finding testers, the better.
It can be challenging for people to find time in their busy schedules to sit down for an hour and test, so we suggest that you begin searching for testers when you first start planning and strategizing your build.
You can use a tool like Calendly to let testers easily book a time slot that fits their schedule.
The rule of thumb is to over-recruit. Out of 20 people you email, only a few are likely to follow through with testing—or even respond to your email.
We suggest testing with 5–8 users, which will give you a decent pool without diluting the results. Once you get past 5 testers, you’ll start receiving a lot of similar feedback.
If you want to learn more about why you only need a small group of testers, check out “Why You Only Need to Test with 5 Users” by Nielsen Norman Group.
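The math behind that rule of thumb is simple enough to sketch. Nielsen models the share of usability problems found by n testers as 1 − (1 − L)^n, where L is the probability that a single tester uncovers any given problem (roughly 31% in Nielsen’s published data; that value is his estimate, not one we measured ourselves):

```python
def problems_found(n, L=0.31):
    """Expected fraction of usability problems uncovered by n testers,
    per Nielsen's model: 1 - (1 - L)^n, with L ~= 0.31 from his data."""
    return 1 - (1 - L) ** n

# Diminishing returns kick in quickly:
for n in range(1, 9):
    print(f"{n} testers: {problems_found(n):.0%} of problems found")
```

With these assumptions, five testers surface roughly 84% of the problems, and each tester after that adds less and less—which is why we cap sessions at eight.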
What you test depends on what you’re trying to achieve. It’s best not to test too many things at once, but rather have one main objective that will drive the process. To figure out that objective, you need to ask yourself a few questions:
1. At what stage in the development process are you?
If you’re pre-development, the goal of your testing may be to home in on the wants and needs of your target users. What problems are they trying to solve? What other solutions are they already seeking out? At this stage, your testing will be more research-oriented.
If you’re in development, your goal may be to confirm your direction and make sure your users are responding to the app design as envisioned. This is the type of testing we’re primarily addressing in this piece.
If your app is live, you might conduct a user test to ensure you’re keeping up with changing user preferences and expectations, as well as managing bugs. After all, an app is never truly done; you have to stay on top of its performance and audience reception so that you can make adjustments as necessary.
2. What are the main user journeys of the app?
What are the various interactions your user will have with your app from the moment it is downloaded to the point of completing whatever your app enables them to do? Charting out these steps can help you better understand where to focus your test.
3. What could be considered a risk?
In the same vein, it’s imperative to be on the lookout for any touchpoints where your user may get stuck or confused—for example, unusual UX patterns, an unclear order of actions, complex features, etc.
Once you’ve created a list of what you want to test, it’s important to turn this list into a script or questionnaire so that you remain consistent from one user test to another.
The testing script will ensure that your sessions are consistent, but don’t be afraid to deviate from the script and ask follow-up questions so that you can really understand the user’s perspective.
We recommend using Google Forms to take notes and more easily consolidate your findings. Google Forms summarizes things nicely. It gathers all of the responses to each question so that you can see the collective answers of everyone who completed the test:
It can also take compiled multiple choice answers and turn them into graphs, like so:
The testing setup is more important than you may think.
First, the participant should sit beside you with the phone placed on a table in front of them. If the user is holding the phone or sitting too far away, you won’t be able to see the screen or what they’re trying to do. Have your computer in front of you with the Google Form open so you can read from the script and take notes as you go.
We like to record our testing sessions (with the tester’s permission, of course) using QuickTime’s screen recording feature. We do this so that if we need to go back and revisit what a participant did, we can. We don’t share these videos with anyone outside of the MindSea team.
To start recording, plug the device into your computer and open QuickTime.
Go to File > New Movie Recording.
By default, QuickTime will show your webcam. Click on the down arrow next to the record button and choose the phone that’s plugged in. Then start recording.
Before your scheduled testing session, make sure you have the following on hand (the participants don’t need to bring anything besides themselves):
We use InVision’s iOS mobile app for testing.
Before testing, we send our client (or the person conducting the test) a link to the prototype. The testing participant then opens the link on the device they’ll be using. Once the prototype loads, you’re ready to start!
Not all areas of the prototype will be interactive. If the user taps a non-interactive area, the areas that are interactive will be highlighted in blue (like in the image below). If you or your testers get stuck, you can always jump back to the “All Screens” view, or exit the prototype by pressing and holding.
We encourage participants to think out loud as much as possible during testing sessions. That way we can capture exactly what pops into their minds while they use the prototype.
Some participants may not feel comfortable thinking out loud, but reading a session introduction script like the one below can put them at ease. Remember: The quieter the tester, the less valuable the session. So to get people talking, you may have to prompt them to speak out loud. Don’t worry about memorizing the script; it will be in the Google Form for you 😋.
Here’s a script you can use to begin the session:
“Let’s review the prototype. The first thing I want to make clear is that we’re testing the prototype, not you. You can’t do anything wrong.
The prototype is not a fully functioning app; not every button does something, but you’ll notice that when you tap somewhere on the screen that isn’t a button, blue squares will appear to show you what is tappable.
As you navigate the prototype, I’m going to ask you to perform some specific tasks. Try to think out loud as much as possible. Tell me what you’re looking at, what you’re trying to do, and what you’re thinking. This is the information we are looking for.
Also, don’t worry that you’re going to hurt our feelings. We’re doing this to improve the app, so we need your candid feedback.
If you have any questions, you can ask them; however, I may not be able to answer them right away. We’re interested in how people will navigate the app when they don’t have someone there to help them. If you still have questions when we’re done, I’ll try to answer them then.
For the purposes of the tasks, you should behave as if you’ve downloaded the app for personal use and are using it for the first time.”
Avoid giving the user too much information. For example, if a user asks, “What does X mean?”, a good response would be, “What do you think it means?” Ask open-ended questions—rather than yes/no questions—as much as possible.
If a user is really stuck and can’t figure out a task after many failed attempts, it is OK to help them. Remember to reiterate that they didn’t do anything wrong, and that ultimately they’re helping you figure out what you need to improve.
Here are some good questions to ask:
It’s also important to pay attention to nonverbal cues from the user.
What are they tapping on? Where are they trying to scroll? Where did they initially go to find something? This nonverbal information can sometimes be more important than what the user is telling you.
A participant may tell you that completing a specific task was easy even though you observed them make three failed attempts before completing it successfully. It’s extremely important to capture that information. Which leads us to…
When you take notes on what your participants are saying, the more detailed the better! If possible, write down what they say verbatim, including any questions they ask. But remember also to note what their fingers are doing, where they get confused, and when they complete a task very easily.
When the session is over, be sure to thank your participants for their time! If you picked up an incentive for your testers, now is the time to give it to them.
Again, we aim to test with 5–8 participants, and after the first few sessions, we typically see patterns emerging. This feedback tells us where to focus our attention in order to improve the product.
After all the testing sessions are complete, we gather the information into a single document and highlight the areas where we suggest changes should be made.
So there you have it: MindSea’s complete user testing process! If you found this guide valuable and think we can give you a hand with your own user testing (and mobile app planning), please don’t hesitate to get in touch.