Cayuse IRB
Collaborative task analysis, usability test, and redesign of research software.
Cayuse IRB is a software platform that simplifies the Institutional Review Board (IRB) process for research institutions. Researchers and administrators can submit, track, and review research proposals involving human participants, ensuring compliance with ethical standards and regulations such as HIPAA. By automating workflows and centralizing document management, Cayuse IRB reduces administrative burden, streamlines the approval process, and enhances transparency. Its features include real-time updates, electronic signatures, and customizable review routes, all aimed at improving efficiency and maintaining ethical oversight in research.
The goal of this project was to map the requirements of the IRB submission process, assess the current platform’s efficiency, and propose a re-design through a lightweight prototype that addresses user pain points.
Research Goals:
1. Understand the ethics board submission process.
2. Identify workflow pain points that contribute to delays in proposal submissions.
3. Develop feasible design solutions to address identified issues.
Role and Timeframe
Role: I coordinated the research process, conducted usability testing, analyzed task flows, and developed prototypes with support from two peers.
Timeframe: 1 month
- Study Planning and Design: 1 week
- Task Analysis and Diagrams: 1 week
- Usability Testing and Data Analysis: 1 week
- Prototype Development: 1 week
Methods (5 Users)
Participants: Five graduate students with research training but no prior experience with IRB submissions.
Requirements Specification: The IRB submission process was broken down into a Hierarchical Task Analysis (HTA) diagram, outlining the steps users must take to complete their proposal.
Think-Aloud Protocol: Participants navigated the platform while verbalizing their thought process. They were tasked with submitting a mock proposal using data from a pre-filled research report. This approach helped identify the challenges and pain points encountered during submission.
Prototype Development: The issues identified during usability testing were mapped back to the HTA diagram, revealing where inefficiencies occurred in the current system. This feedback determined which portions of the workflow the prototype targeted and shaped its design.
Critical Insights
1. Navigation through the platform is confusing and unintuitive.
2. The use of technical jargon caused confusion, especially for first-time users.
3. Users were often unsure whether they had completed a section of the submission process.
Prioritized Recommendations
1. Simplify the submission process by reducing unnecessary interactions.
2. Implement clear, timely feedback and assistance throughout the process to guide users and reduce uncertainty.
Lessons Learned
- Advanced task analysis can reveal both the necessary complexity and the inefficiencies within a system, but it's crucial to distinguish which is which in order to propose effective solutions.
- Positive reinforcement and accessible support are essential for reducing user uncertainty when onboarding to a new platform.