Overview
Adroit Studios and radiology educators from the University of Missouri School of Medicine and Children’s Hospital of Philadelphia collaborated to develop ScanBright, a mobile radiology training app. To ensure the app met the needs of its target users, we gathered feedback through focus group interviews, using our first prototype during these sessions to collect insights that guided the app’s design and development.

Problem Statement
Traditional training often leaves junior residents unprepared for high-stakes on-call shifts, where their interpretations directly impact patient outcomes. Mistakes can be valuable learning opportunities, but not when they may also harm patient care, affect resident well-being, and influence perceptions of trainees and academic institutions.

Users & Audience
The primary audience for this app is physicians-in-training preparing to enter patient-care environments. After just 6–12 months of training, junior residents are tasked with providing preliminary interpretations of imaging studies during overnight and weekend call shifts. The secondary audience includes radiology educators seeking to enhance their curricula.

Roles & Responsibilities
As the lead UX researcher and product owner, I guided the app's design with subject matter experts, a developer, a curriculum designer, and an artist. I developed focus group protocols and led sessions with physicians-in-training and radiology faculty. A junior researcher assisted with protocol review and conducted supervised thematic analysis.

Scope & Constraints
The project had to deliver two major releases within 12 months on a limited budget. The tight timeline required rapid development, which made it challenging to generate broad insights before making key design decisions.

Process
To ensure the app met user needs, I collaborated with subject matter experts (SMEs) to design a focus group protocol. The focus groups, conducted via Zoom, targeted a diverse group of radiology educators and trainees, including chief residents, invited lecturers, and peer mentors from several universities. Participants were recruited by email through the SMEs' professional networks. Each focus group was kept small (one with six residents, another with five educators) to encourage open discussion and give each participant ample space for input.
The focus groups were designed to achieve several goals: to understand the challenges faced by on-call residents, to gather feedback on perceptions of radiology games, and to refine design elements for the app. Each session began with open-ended questions aimed at gaining insight into the difficulties residents encounter during shifts and soliciting design recommendations. Following this, I led a comprehensive concept walkthrough, encouraging discussion of the game’s learning objectives, narrative, and gameplay mechanics, with particular attention to scene-specific learning features.
After collecting the data, I supervised a thematic analysis of the feedback using MAXQDA, identifying key themes related to participants' reactions and suggestions. I then presented the findings to the development team and revised the product backlog based on these insights to better align the app with the needs and preferences of its users.

Outcomes
The study explored the challenges radiology residents face during on-call shifts and evaluated the potential of a radiology training game, ScanBright, using a qualitative, inductive approach. Focus group transcripts were analyzed using MAXQDA, applying a two-level coding procedure: first, descriptive codes were assigned, and then these were categorized into themes to address the research questions. This process yielded 118 codes, which were classified into three overarching themes.
Findings revealed that time management, prioritization, and limited experience were the primary challenges residents encountered. While educators were initially skeptical about the effectiveness of a game-based approach, they recognized the value of ScanBright in reinforcing learning, particularly in helping residents defend their diagnostic decisions. Residents found the game useful for improving diagnostic fluency, with structured explanations of correct and incorrect answers being especially beneficial.
Feedback on ScanBright was positive, with both groups highlighting its potential to support skill development in a low-pressure environment. Design recommendations included enhanced usability features (e.g., pinch-to-zoom, image annotations) and engagement mechanics (e.g., short play sessions, streak-based rewards). A confidence rating system was also suggested to help residents identify and correct misconceptions.
Based on the focus group feedback, and balancing timeline constraints against high-value user experience improvements, design revisions were implemented that prioritized the usability and engagement features participants recommended.

Lessons Learned
In hindsight, incorporating stronger tutorials and engagement mechanics from the outset may have facilitated a deeper discussion of new potential features to build in future prototypes. However, overall, the study demonstrated that a radiology training game like ScanBright can be a valuable educational tool when it effectively balances depth, usability, and engagement.