Summary and Setup

Usability underpins impact for all scientific software. If users can’t efficiently and effectively use your tool, they will switch to another or stop investigating that particular research question. Rapid usability testing empowers you to understand your software’s usability, improve it, and better serve users.

Is this lesson for you?


Do you have ideas about which features of your scientific software need fixing, but aren't sure which to prioritize?

Do you get the same questions or complaints from users all the time?

Do you want to level up your tool so that you can drive scientific impact?

If you answered yes to any of these, this lesson is for you. By learning to conduct rapid usability testing, you will have a new way to quickly assess your tool and identify meaningful improvements so users can do high-quality science confidently. This lesson works for scientific apps, APIs, command-line tools, web-based tools, and documentation.

Rapid Usability Testing Lesson Overview


This lesson is a five-episode training on rapid usability testing. It is intended to be delivered via Zoom, and the exercises reflect this, though you can adapt the materials for an in-person workshop or other delivery format.

This lesson will teach you to:

  • Identify scenarios and tasks appropriate for rapid usability testing
  • Recruit for a user study and track participants’ data
  • Conduct a rapid usability assessment and analyze results

There are no prerequisites for this lesson.

Tutorial events


This lesson was delivered to the US-RSE community on June 24, 2025.

Have you taught this lesson? Make a pull request to add your event and recording to the list above.

Would you like to teach this lesson? You can! Email us to say so at

Contributing


This lesson could use your help! Please see the CONTRIBUTING.md file for instructions.

The STRUDEL project maintains this lesson with support from the US-RSE user experience working group and funding from the Alfred P. Sloan Foundation under Grant #G-2024-22557 (Liz Vu and Joshua Greenberg, program managers). The team thanks its current and past contributors:

  • Lucy Andrews (content, feedback)
  • Kate Arneson (content development, feedback)
  • Erin Becker (consulting on format and approach, feedback, fixes)
  • Georgia Bullen (feedback)
  • Hannah Cohoon (content development, presentation)
  • Rajshree Deshmukh (feedback, presentation)
  • Eriol Fox (content development, feedback)
  • Mary Goldman (content development, feedback, presentation)
  • Toby Hodges (consulting on format and approach)
  • Jenny Knuth (content, feedback)
  • Anh Le (content development, feedback)
  • Cody O’Donnell (feedback)
  • Drew Paine (feedback)
  • Lavanya Ramakrishnan (feedback)
  • Maryam Vareth (feedback)
  • Kirstie Whitaker (content, feedback)

Thanks also go to the following organizations that have supported this effort:

Citation

Cohoon, J., Goldman, M., Arneson, K., Deshmukh, R., Fox, E., & Le, A. (2025, June 24). Lesson on Rapid Usability Testing. Zenodo. https://doi.org/10.5281/zenodo.16782384

License


This content is openly published under a CC BY 4.0 license. You may reuse and remix it, but you should provide attribution using the citation above.

Copyright (c) 2025, The Regents of the University of California, through Lawrence Berkeley National Laboratory (subject to receipt of any required approvals from the U.S. Dept. of Energy). All rights reserved.

Contact


If you have feedback or wish to make a contribution, please follow the instructions in CONTRIBUTING.md. For other inquiries, please contact the STRUDEL team at .

Callout

Time estimates

We roughly estimate that this content will take a group of learners about three hours to work through with an instructor. If you have taught or participated in this lesson and have feedback about timing, please submit an issue or pull request. We assume that instructors will not present on callouts or spoilers (i.e., content highlighted in boxes like this one); learners can read those on their own.