Case-based exercises fail to improve medical students' information management skills: a controlled trial
© Chumley et al. 2006
Received: 23 December 2005
Accepted: 01 March 2006
Published: 01 March 2006
Tomorrow's physicians must learn to access, retrieve, integrate and apply current information into ambulatory patient encounters, yet few medical schools teach 'real time' information management.
We compared two groups of clerkship students' information management skills using a standardized patient case. The intervention group participated in case-based discussions including exercises that required them to manage new information. The control group completed the same case discussions without information management exercises.
After five weeks, there was no significant difference between the control and intervention groups' scores on the standardized patient case. However, third rotation students significantly outperformed first rotation students.
Case-based exercises to teach information management failed to improve students' performance on a standardized patient case. Increased number of clinical rotations was associated with improved performance.
When today's medical students graduate, they will conduct patient encounters using multiple technology-enhanced decision support systems. Current medical student training in ambulatory settings may not prepare students for this type of practice. Students often learn from physicians who generate few (0.01 to 0.8) clinical questions per patient encounter and infrequently use information technology to answer clinical questions at the point of care [1–3]. Clinicians have previously reported that answering clinical questions is too time consuming to be practical during clinical sessions [4, 5]. However, that situation is changing. Improved information management tools, including personal digital assistants (PDAs) and Internet-capable wireless computers, now allow rapid access to Web-based clinical information in ambulatory settings.
Medical students must develop information management skills as a routine, integral part of the ambulatory patient encounter. Information management skills include: asking and refining clinical questions; accessing, retrieving, integrating and applying information into a clinical situation; and managing the doctor/patient/technology interface. Managing information at the point of care requires different skills from traditional evidence-based medicine (EBM) as taught in most medical schools. In traditional EBM, the learner develops a clinical question, performs a literature search, selects and appraises an appropriate research study, and draws conclusions. The EBM process usually occurs remotely from the patient encounter and requires time, the ability to understand the source literature, and critical appraisal skills.
While EBM skills are important for medical learners, many clinical questions can be answered at the point of care without critical appraisal. For example, using an electronic drug database in a PDA to investigate potential drug interactions is information access and retrieval. Modifying the patient's medications to adjust for interactions is information integration and application. The learner accesses an information source he/she deems reliable, and finds the answer in approximately 20 seconds during the office visit.
Few studies have investigated students' ability to manage information in clinical settings. In a study by Bergus and colleagues, fourth year medical students evaluated a standardized patient (SP), then read an article about a diagnostic test relevant to the patient's presentation. Most students appraised the article correctly, but few could apply the information to the individual patient. In contrast, Weberschock and colleagues demonstrated that, following an EBM seminar, third-year students could both appraise an article and integrate that information into a paper case. Davidson and colleagues conducted a more complex skills assessment by having SPs ask a question of third year medical students. Students then formulated a clinical question, performed a Medline search, selected and appraised a journal article and transmitted results to the patient. Students did well in this applied EBM exercise, averaging 3.7–4.0 on a scale of 1 (poor) to 5 (superior) for each task as evaluated by faculty and librarians. In summary, after a non-clinical EBM course, graduating students demonstrated EBM knowledge but had difficulty applying EBM information in the clinical setting. However, third year students applied EBM skills adequately to a paper case and a standardized patient encounter when given structured directions and sufficient time.
A 2005 literature search yielded no previous studies investigating students' abilities to access, retrieve, integrate and apply information in real time patient encounters. In this study, we investigated whether case-based discussions with information management exercises improved students' information management skills as evaluated on a standardized patient case. We compared (1) intervention and control groups on two different clerkships and (2) first and third rotation groups from both clerkships in academic year 2005–6.
The University of Kansas School of Medicine is a state medical school with 175 students per year and two clinical campuses. All 120 third year medical students on the Kansas City campus spend six weeks each in the Family Medicine (FM) and Ambulatory Medicine/Geriatrics (AM/G) clerkships. These clerkships are always adjacent: students either complete six weeks of FM followed by six weeks of AM/G, or vice versa. In our study, all 55 students from the 2005 fall semester completed the information-management SP case in week five of the 12-week FM and AM/G adjacent clerkships. The SP case was a required activity; however, students gave written consent to include their results in the analysis. Our human subjects committee granted the study exempt status.
In a prospective controlled group study of 55 third year students, we compared our intervention and control group performance on an information management SP case. Students who began the twelve weeks on FM formed the intervention group. Those who began on AM/G formed the control group. After six weeks, the students swapped clerkships. Students on both clerkships recorded similar numbers and types of ambulatory encounters in their patient logs. Neither clerkship offers any formal EBM curriculum. However, all students learned EBM principles (but not clinical information management) through longitudinal didactic sessions across their first two years.
Presentation: A healthy 55-year-old woman traveling to Botswana presents for travel advice. Her health maintenance is current except for her tetanus booster.

Prompt: Patient enquires about travel immunizations and prophylactic medications for Botswana. Expected student actions:
1. access CDC Website (or similar) via Internet for current travel recommendations for Botswana
2. offer malaria prophylaxis
3. enquire about planned activities (e.g. providing health care)
4. provide information about food/water safety
5. advise re preventing mosquito bites

Prompt: Patient indicates her preference for every day malaria prophylaxis. Expected action: students should access Web or PDA, and suggest doxycycline for malaria prophylaxis.*

Prompt: Patient asks if doxycycline will interfere with her antacids. Expected action: students should run an interaction check on their PDAs.*
Information management exercises in the case-based discussions

Information Management Exercise | Access and retrieve | Integrate and apply
Coronary Artery Disease | risk calculators, http://www.sgim.org (Society of General Internal Medicine) | select appropriate cardiac diagnostic studies
Chronic Obstructive Pulmonary Disease | pulmonary function test tutorials, http://www.vh.org (University of Iowa Virtual Hospital) | guide diagnosis and management
Type 2 Diabetes Mellitus | standards of care, http://www.diabetes.org (American Diabetes Association) | develop a management plan
| national guidelines, http://www.guidelines.gov (National Guidelines Clearinghouse) | determine if imaging is indicated
| clinical preventive services guide, http://www.preventiveservices.ahrq.gov/ (Agency for Healthcare Research and Quality) | develop prevention plans for different age and gender patients
Percentage of students completing checklist items

Groups: FM (n = 19); A/G (n = 23); 1st rotation (n = 24); 3rd rotation (n = 18); ALL (n = 42)
Non-technology items (8)
Asked how long I would be staying
Asked if I had any medical problems
Asked if I take any medications
Asked if I smoked
Asked me about prior immunizations
Discussed protection from insect bites
Offered a tetanus shot
Offered any other type of injection
Technology items (13)
Asked if I would be providing health care on my travels
Asked if I would be working with animals on my travels
Asked if I would be outdoors or have contact with mosquitoes on my travels
Sat so that I could see the Internet information on the tablet screen
Offered a choice of medication options
Clearly outlined the recommendations for medications and shots
Accessed information using the tablet PC during the encounter
Accessed information using the palm pilot during the encounter
Discussed drinking water and food safety
Advised me not to swim in fresh water
Offered medication to prevent malaria
Told me about the potential interaction between my antacids and new medicine
Gave recommendations on how to take the medication with my antacids or advised me to stop antacids
Sub-score: technology items (of 13)
Overall score (of 23)
Fifty-five students (26 intervention and 29 control) completed the standardized patient case and 42 (19 intervention and 23 control) consented to have their information included in the analysis (76%). There was a difference between consenters and non-consenters on total scores: students who consented scored lower than those who did not (14.0 vs. 15.5, p = 0.05).
Percentages of consenting students completing each checklist item are shown in Table 2. Ninety percent of students accessed the Internet and 62% used a PDA during the encounter. Ninety-eight percent of students offered medication for malaria prophylaxis. Forty-eight percent asked if the patient would be working with animals, but only 10% inquired if she would be providing health care. Although the SP questioned every student about medication interactions, only 38% confirmed a potential interaction between doxycycline and antacids and of these 17 students, 71% (12) discussed how to take the medication to avoid the interaction.
There was no difference in total scores between intervention and control group students (13.4 vs. 14.4, p = 0.12). However, students in both the intervention and control groups on the third rotation scored significantly higher than first rotation students (13.6 vs. 15.0, p = 0.038). When we examined the 13 technology items alone, there were no differences between intervention and control group scores (7.5 vs. 7.9, p = 0.25) or between first and third rotation group scores (7.3 vs. 8.3, p = 0.08).
Intervention group students who received case-based training in information management scored no higher on the SP case than students who received no such training. Possible explanations include that the intervention did not adequately teach information management skills or that the SP case did not accurately evaluate those skills. Also, students may have failed to apply/transfer the information management skills from the case-based discussions to the SP experience because the learning context changed from paper cases to a mock clinical encounter. In addition, students do not learn the clinical content (travel medicine) during any third year rotation, so we were testing on a content area that we had not formally taught. However, we deliberately chose this content because we considered that the lack of formal teaching 'leveled the playing field' for finding travel information that is easily retrieved from the Internet. Third rotation students performed better than first rotation students on the total score, but not on the technology sub-score. Third rotation students' better performance may be explained by their increased clinical experience, or by the fact that they learned the case expectations from students in the previous groups. All groups' mean scores were lower than the usual average score of 75% seen across multiple validated SP cases in our setting.
All but one student accessed a reputable site (e.g. the Centers for Disease Control and Prevention) and retrieved travel information for Botswana, and most integrated and applied information pertaining to malaria prophylaxis (but not health care delivery or working with animals). Thus, if we group information management skills into two levels, Level 1 (access and retrieval) and Level 2 (integration and application), our students did demonstrate some skills on both levels. However, their performance on Level 2 skills (integration and application) was generally poor. Even in our best-performing group only 67% consulted their PDAs to investigate a potential drug interaction. Students' performance may have been adversely affected by the time pressure of a 20-minute encounter, although we consider this unlikely. On our SP case feedback form, we routinely ask students whether they had adequate time for the case. For most cases, 2–3 students per group report having insufficient time. For the travel case, no student reported having insufficient time.
Even when questioned directly about medication interactions, only 38% of all students consulted their PDAs to investigate a possible interaction. Most students wrongly informed the standardized patient that there was no potential interaction between doxycycline and antacids without consulting their PDA, a task that requires minimal time (20 seconds or less). Two plausible explanations for this lack of PDA use are that students did not know how to use the drug database program or that they incorrectly assumed they knew the answer. In our institution, all third year students are given PDAs and receive formal training on logging patient encounters, but they receive no formal training on using medical programs such as Epocrates©. Our experience with this highly Internet-savvy generation is that most students quickly master the capabilities of their PDAs, but we cannot report any empirical data on the number of students who could not operate the drug database versus those who chose not to consult it.
Our study has several limitations. We conducted it with a small number of learners, at a single institution, using a single SP case for evaluation. Global student competence cannot be reliably assessed by a single standardized patient case. However, in this pilot study, we were not evaluating individual students' global performance. Rather, we were measuring group performance on one particular skill set. We accounted for the expected variation in student performance, and the study had sufficient power to determine whether one group of students performed better than another.
Managing clinical information at the point of care is an important skill for future physicians, yet the literature is sparse on teaching information management in ambulatory settings. Unfortunately, our curricular intervention did not impact students' information management performance as measured by an SP case. Student performance did improve with increasing clinical experience. One important finding was that few students accessed an available PDA database to investigate potential drug interactions even when directly prompted by the SP. More concerning was the finding that most students indicated incorrectly that there was no potential interaction. Wrongly answering patients' questions without using readily available information systems is an important and preventable source of medical errors. Medical educators must teach learners to develop information management habits that reduce medical errors, including prescribing errors. At our institution, we will re-evaluate our teaching methods, with the specific aim of teaching students to check for medication interactions at every encounter. Future studies should investigate other educational interventions that can demonstrably improve students' information management skills.
We thank Carla Aamodt, MD, and Ray Dahlberg, directors of the clinical skills laboratory, for their assistance in refining the standardized patient case, recruiting and training standardized patients, and providing ongoing quality control of the standardized patient program. The clinical skills laboratory is funded by the Office of Medical Education at Kansas University School of Medicine.
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.