It’s one thing to show your clients a demonstration of your program, and quite another to watch actual users run your software!
Ideally, you would have a few representatives of the target group(s) run your software as soon as you have a working set of user views and enough underlying code for users to perform some tasks to completion. You don’t usually need many people (three per target group is generally considered enough to get useful feedback) to uncover issues such as confusing user interfaces or unexpected workflows that would frustrate your end users after release, or cause your program’s functions to run in unexpected sequences.
Usability tests are ideally recorded (sometimes from multiple angles) and observed by several people. Testers may be given a sheet of tasks to perform or information they are expected to find, and they are sometimes asked to complete a survey as well. Tests can be very lightweight or in-depth, depending on the importance of and budget for the project.
It can be hard to ask for feedback and let other people try out your code (especially while you watch), but it is time to ask someone to test your website for you and give you feedback on how it functions.
Because of social distancing requirements, you won’t be required to do this testing in person. This will be an opportunity to see how usability testing might happen when working remotely with a client or for a geographically-distributed team.
- Usability Testing (Wikipedia)
What to Do
- Find a few people (2 or 3 people who can work with you via email/chat/whatever) who will be willing to try out your website and give you constructive criticism. Preferably, these people should not be your classmates (they know too much).
- Explain to your testers what your website is supposed to do. What tasks should they be able to perform with it? Or what information should they be able to locate?
- If possible, use a video app that allows screen sharing and have your tester open your website and share their screen, then sit back with a pad of paper and watch. Make notes about their successes and challenges and any comments they make.
- If applicable, you might give your testers a form with a list of tasks that they should be able to do. They should check off any tasks that they CAN do and make notes about their reactions to the site as they complete each task. Was it easy to navigate and read, for instance?
- Try NOT to answer questions or offer advice on where to find things or how to get out of dead ends. Just write down where those moments happen.
- If you are not able to do screen sharing for whatever reason, ask your testers to perform the tasks and write up their experiences and responses as best they can. This is not ideal, but we’re making do in a difficult situation.
What to Hand In
Hand in a document that addresses the following bullet points:
- Part of the challenge of running a usability test is identifying the key characteristics of people who will be running your program or accessing a website. In this case, how would you describe the intended users of your web app? Gender, age, technical ability, etc.
- Type up a summary of your usability session. Who did you ask to test and why? What comments did they make about the site? What, if any, problems did they have while using it?
- What changes will you make, if any, based on their feedback?
- What were the challenges you experienced as you observed them? (If you were able to.)
- If you were to do this exercise with your team project, what do you think would be some of the challenges you’d experience?
If you had the testers complete a form, please include it with your assignment document.
Scoring Rubric (10 points)
- Describes the intended users of the web app. Gender, age, technical ability, etc.
- Identifies user 1 and why they were chosen
- Identifies user 2 and why they were chosen
- Includes a summary of the usability session or information that was sent to the users if a session was not possible.
- Lists the tasks that users were asked to complete
- Includes comments the testers made about the site
- Answers “What, if any, problems did testers have while using the site?”
- Answers “What changes will be made, if any, based on the feedback?”
- Describes the challenges they experienced as observers
- Compares this exercise to how it might apply to the team project and what the results might be
- Includes the form that the users recorded feedback on (extra credit)