Project Phase 6: User Testing & Final Release
Overview. You’re in the home stretch! The last stage of your final project involves conducting the user tests you planned and scheduled last week, making any feasible updates, and polishing your implementation with an eye towards smoothing out rough edges in your app’s functionality, robustness, and usability.
Due Dates. Your final build (and associated write-ups) will be due by 11:59pm on Wed, Dec 13. Your demo video, however, will be due by noon on Mon, Dec 11, as we will watch all the videos in class as part of a final project showcase celebration!
Your Tasks
- Conduct User Tests. Last week, you planned and scheduled tests with 3 different participants. This week, execute that plan. After obtaining their consent and briefing them on their role and what to expect, ask each participant to perform the tasks in your task list, following the order defined in your table. Encourage them to think aloud, and prompt them to do so if they fall silent. If they get stuck, give them a chance to get unstuck first; only intervene if they are truly unable to make progress. Try to say as little as possible, and avoid explaining the user interface to them.
  In the final 20 minutes of your hour-long session, debrief your participant to get their overall thoughts and impressions of your application. What did they think worked well, and what could be improved? Dig into moments where you noticed them hesitate, get confused, or get stuck: what did they find confusing, what were they hoping to do, and how did they figure things out?
- Report and Reflect. For each study, write a brief report (~half a page) that summarizes and analyzes key moments of participant behavior: what interesting or unexpected things did you notice, and why do you think they occurred? For instance, you might observe a participant struggle with a particular interface element or interaction flow; your analysis might then draw on your debriefing to describe what they were expecting to do, what the interface did instead, across which gulf the flow broke down, and so on. Aim for a report that balances positive and negative results.
  Follow these summaries with a bulleted list of 3–5 flaws or opportunities for improvement. Describe what the flaw or opportunity is, explain why it is currently occurring, and brainstorm ways that future designs and implementations might address it. For each bullet, classify it by level (physical, linguistic, or conceptual) and degree of severity (minor, moderate, major, critical). For instance, minor issues introduce some friction that, while annoying, a participant can recover from and move past; critical issues, by contrast, bottleneck participants so severely that they require your intervention to make further progress. Moderate and major issues fall along that spectrum.
- Finalize Implementation. Wrap up your implementation, polishing any rough edges that remained in your beta release and addressing whichever flaws/opportunities are feasible to fix in the remaining time.
- Design Revision. Copy and add to the list of design revisions you began compiling with the beta release. Document any changes you have made from the design you envisaged in P3: Convergent Design, changes made since the beta release, or changes made in response to your user testing results. For each change, write down a brief (1–2 sentence) rationale for why the change was necessary.
- Record a Demo Video. Make a brief video demonstrating the key functionality of your application. Your video should be no longer than 3 minutes, so focus on the aspects that differentiate your app from others. Do not spend time demonstrating routine features (e.g., creating accounts, logging in, etc.), and be wary of spending too much time on pure narration without accompanying demonstration (i.e., don’t ramble!).
Submission
By noon on Mon, Dec 11, post a URL to your demo video to a spreadsheet that will be distributed closer to the date. If you’re hosting the video on Google Drive (or another file sharing service), make sure the permissions are set so that the file is publicly viewable and accessible. A good way to test this is to open the URL in an incognito window in your favorite browser.
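If you want a quick extra check beyond the incognito test, a short script along the following lines fetches the link with no cookies or saved login. This is just an optional sketch, not part of the submission; the URL shown is a placeholder for your own link.

```python
import urllib.request

# Placeholder: substitute the URL you plan to submit.
VIDEO_URL = "https://drive.google.com/file/d/YOUR_FILE_ID/view"

# Fetch the link with no cookies or saved login, approximating what an
# anonymous (incognito) visitor would see.
req = urllib.request.Request(VIDEO_URL, headers={"User-Agent": "Mozilla/5.0"})
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        # urlopen follows redirects, so check where the request ended up:
        # landing on a sign-in page means the file is NOT publicly viewable.
        print("HTTP status:", resp.status)
        print("Final URL:", resp.url)
except OSError as err:  # covers URLError, HTTPError, and timeouts
    print("Could not fetch the link anonymously:", err)
```

A 200 status whose final URL is still your file (rather than a sign-in redirect) is a good sign, but opening the link in an incognito window remains the definitive check.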
By the assignment deadline, post your write-ups to each team member’s portfolio, and deploy your final release to a publicly accessible URL.
Then, submit this Google form once for the whole team.
Grading
| Component | Excellent | Satisfactory | Poor |
|---|---|---|---|
| Study Reports | Well-balanced reports that go beyond straightforward reporting of observations to richly analyze what caused interesting participant behavior. Analyses are well-grounded in evidence (e.g., participant quotes, your own observations and inferences, etc.). | Well-balanced reports, but focus primarily on reporting results with only preliminary analysis. Some evidence is provided but has an unclear connection to the observations/analyses or is otherwise uninformative. | Reports are unbalanced (overly focusing on either the positive or negative results) and/or miss several opportunities for analyzing results. Little to no evidence is provided to concretely ground observations/insights. |
| Design Flaws/Opportunities | At least 3 compelling flaws/opportunities are bulleted that span different levels and severities. Every bullet conveys a crisp, descriptive definition with rich explanations of how the flaw manifested and ways to address it in the future. All bullets are grounded in evidence from the study results. | At least 3 interesting flaws/opportunities are bulleted. Bullet points convey good descriptions, but are occasionally difficult to understand because they are too high-level. A more diverse range of levels or severities could have been explored. Explanations provide reasonable evidence, but are occasionally shallow in brainstorming future ideas. | Fewer than 3 flaws/opportunities are bulleted. They cluster around specific levels/severities, or surface issues that are trivial or did not need a user study to identify. Definitions are vague, and explanations are shallow. |
| Functionality | The app supports all the functionality expected given the design revisions. It delivers on the initial motivations set forth in the impact case. | The app supports all the functionality expected given the design revisions, but only partially delivers on the motivations described in the impact case. | Some key functionality remains missing or is sufficiently incomplete as to undermine the impact case. |
| Robustness | Where appropriate, backend or client-side validation and error checking is performed, and informative error messages are shown. | Backend or client-side validation is performed to detect errors, but some opportunities to improve robustness or the user-friendliness of error messages are missed. | Several parts of the backend or frontend fail to perform validation to catch errors. Error messages are missing or opaque. |
| Usability | Visual and interaction design helps users learn novel concepts and provides an efficient and intuitive flow through the app. | Visual and interaction design largely provides an intuitive user experience, but some issues present occasional usability frictions. | Minimal visual or interaction design yields an app that is difficult or frustrating to use. |
| Styling & Layout | Layout and styling give the app an attractive, cohesive, and distinctive feel. | Attention has been paid to styling and layout, but occasional rough edges undermine cohesion or attractiveness. | Layout and styling are minimal and fail to provide a cohesive feel. |
| Demo Video | The video conveys the gist of the app, focusing on the central aspects that differentiate it from potential competitors. It is fun to watch! | The video largely conveys the gist of the app, but occasionally focuses on routine aspects. | The video does not demonstrate the app in a very coherent or understandable fashion, or rambles. |