ABSTRACT: Just enough, but not too much, interactivity leads to better clinical skills performance after a computer-assisted learning module.
Well-designed computer-assisted instruction (CAI) can potentially transform medical education, yet little is known about whether specific design features, such as direct manipulation of the content, yield meaningful gains in clinical learning. We designed three versions of a multimedia module on the abdominal exam, each incorporating a different type of interactivity.
As part of their physical diagnosis course, 162 second-year medical students were randomly assigned (1:1:1) to the Watch, Click, or Drag version of the abdominal exam module. Students’ prior knowledge, spatial ability, and prior experience with abdominal exams were assessed first. After using the module, students took a posttest, demonstrated the abdominal exam on a standardized patient, and wrote structured notes of their findings.
Data from 143 students were analyzed. Baseline measures showed no differences among groups in prior knowledge, experience, or spatial ability. Overall, there was no difference in posttest knowledge across groups. However, physical exam scores were significantly higher for students in the Click group.
A mid-range level of behavioral interactivity was associated with small to moderate improvements in clinical skills performance. These improvements were likely mediated by enhanced engagement with the material, within the bounds of learners’ cognitive capacity. These findings have implications for the design of CAI materials for teaching procedural skills.