Identifying the participants
We used a Google Form to identify students who match our ideal early adopter profile. We then picked ten students: five to test our first app design and five to test our second iteration. We chose five students per group following the common usability-testing guideline, popularized by Jakob Nielsen, that five users will surface most of a design's usability problems.
Setting up a user testing protocol
Conducting a usability study is still a new experience for us at Kindi. As such, we utilized insights from the MIT team as well as online resources to help develop our own user testing protocol.
Here are some insights from our process.
1) Pre-session – Define scenarios and expected user flows
Before sitting with participants, choose the essential features you want to test and the types of scenarios that would put the user in a position to explore such features. In parallel to this, list out the user flow required to complete each scenario. This will come in handy later on when you run sessions and can compare how well a user’s actions line up with your expected user flow.
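One lightweight way to make this comparison concrete is to record each expected flow as an ordered list of steps, then check a participant's observed actions against it after the session. The sketch below is illustrative only; the scenario name and step labels are made-up examples, not from our actual protocol.

```python
# Expected user flows per scenario (scenario names and steps are
# illustrative placeholders, not the real test protocol).
EXPECTED_FLOWS = {
    "pick a story to read": ["open app", "scan QR code", "tap story", "start reading"],
}

def flow_deviations(scenario, observed_actions):
    """Return the expected steps the participant never reached."""
    expected = EXPECTED_FLOWS[scenario]
    observed = set(observed_actions)
    return [step for step in expected if step not in observed]

# After a session, log what the participant actually did and compare:
missed = flow_deviations("pick a story to read",
                         ["open app", "scan QR code", "tap story"])
# Any step in `missed` points at a moment of friction worth investigating.
```

Even a simple comparison like this makes session notes easier to aggregate across the five participants in each group.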
2) Determine the interface participants will interact with
Are participants interacting with a functional app or a paper prototype? Make sure you have all materials and links ready to go before each session starts. In our tests, we first used a functional prototype, followed by an interactive InVision design.
3) Introduce yourself and set expectations
Without explaining what your technology does, introduce yourself and your project. Explain to the user what the next 30 minutes will look like, and make clear that this is not a test of their ability to use your technology, but rather of your technology's ability to help them complete a specific scenario. Also ask participants to verbalize their thought process as they attempt each task; this will help you understand why they take particular actions in the app.
4) Gather some background data
Who are you talking to? What is their relationship to the problem your technology is solving?
5) Introduce scenarios, take notes, and record video if you can
Introduce your first scenario. Take notes on the user's actions. Keep each scenario independent of the others.
6) Conclude with some general user feedback
What did the user like? What did they dislike? What do they suggest changing and how?
7) Say thank you and explain what happens next
What are you planning to do with the data you gathered from the usability test? It’s important here to acknowledge how valuable each user’s time has been and how it will influence the next steps in your development process.
Top three takeaways from usability testing
1) QR codes are a useful tool for logging in
One issue we saw early on is that Kayany students struggled with traditional email-and-password sign-up because they don't have email addresses or other social accounts to log in with. So we tried QR codes, and the results were promising. If you are having trouble getting young refugee learners past the sign-in page, give QR codes a try.
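The core idea is simple: instead of credentials the student must remember, each student gets a printed QR code that encodes a one-time login link tied to their account. Below is a minimal sketch of that scheme, with an assumed class name, URL, and token format for illustration; it is not Kindi's actual implementation.

```python
import secrets

class QRLoginService:
    """Sketch: issue one QR login link per student; scanning it logs them in.
    (Class name, base URL, and token scheme are illustrative assumptions.)"""

    def __init__(self, base_url="https://example.app/login"):
        self.base_url = base_url
        self._tokens = {}  # token -> student_id

    def issue_qr_payload(self, student_id):
        # The random token is the only secret; this URL is what gets
        # rendered into the printed QR code.
        token = secrets.token_urlsafe(16)
        self._tokens[token] = student_id
        return f"{self.base_url}?token={token}"

    def login(self, scanned_url):
        # Pull the token out of the scanned URL and look up the student.
        token = scanned_url.rsplit("token=", 1)[-1]
        return self._tokens.get(token)  # None means unknown/invalid code
```

The payoff is that the sign-in flow requires no typing at all, which matters for young learners who may have limited literacy in the interface language.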
2) If you use text, use simple words and phrases people understand
Students struggled with our first version's sign-in screen because they did not understand what it meant to be a “buddy” or a “learner.” Taking a step back, we realized these terms would confuse anyone, especially someone who knew nothing about the app. In the next iteration, we simplified the process by using plain phrases users can relate to, such as “start reading,” which eliminated sign-in issues in the second round of testing.
3) Simplify, simplify, simplify!
Sometimes when you’re deep into the app design process, you fail to recognize workflows that create unnecessary friction in your app. This is where usability testing helped us. For example, our first home screen (left image below) confused many users in group one: it had too much text and got in the way of users picking the story they needed to read. So in the second version (right image below), we replaced the first home screen with a list of stories to choose from. This change removed an entire step from the user flow and led to much better outcomes in scenario testing with group two.
In conclusion, here are two things we can work on to improve the process in the future.
Break the ice
Many of the female students we are working with come from conservative families. As such, it was apparent that some students felt a certain level of discomfort when a non-Arabic-speaking male foreigner asked them to participate in a usability test. Who is this guy, and what is he talking about? Luckily, with help from my Arabic-speaking teammates Ahmad and Leen, we were able to make things more comfortable as the sessions went on. In the future, however, we should use some simple icebreakers to make the mood less formal and more engaging for student participants.
It’s not a test
Some of the students we interacted with were nervous because they thought this was some kind of test they needed to perform well on. When they couldn’t figure out a scenario, they would get anxious and start pressing all over the app. We tried to reiterate multiple times that this was not a test of their skills, but rather a way for us to see how they interact with our designs. This, though, did not always help. We need to do a better job of defining the purpose of the usability session and the students’ role in that process. To start with, how about we stop using the word “test”?
What about you?
Any interesting insights you’ve gathered from your own usability sessions? Any suggestions on how we can improve our process in the future?