- Research article
A novel bedside cardiopulmonary physical diagnosis curriculum for internal medicine postgraduate training
BMC Medical Education, volume 17, Article number: 182 (2017)
Physicians spend less time at the bedside in the modern hospital setting, which has contributed to a decline in physical diagnosis and, in particular, cardiopulmonary examination skills. This trend may be a source of diagnostic error and threatens to erode the patient-physician relationship. We created a new bedside cardiopulmonary physical diagnosis curriculum and assessed its effects on post-graduate year-1 (PGY-1; intern) attitudes, confidence and skill.
One hundred five internal medicine interns in a large U.S. internal medicine residency program participated in the Advancing Bedside Cardiopulmonary Examination Skills (ACE) curriculum while rotating on a general medicine inpatient service between 2015 and 2017. Teaching sessions included exam demonstrations using healthy volunteers and real patients, imaging didactics, computer learning/high-fidelity simulation, and bedside teaching with experienced clinicians. Primary outcomes were attitudes, confidence and skill in the cardiopulmonary physical exam as determined by a self-assessment survey, and a validated online cardiovascular examination (CE).
Interns who participated in ACE (ACE interns) by mid-year more strongly agreed they had received adequate training in the cardiopulmonary exam compared with non-ACE interns. ACE interns were more confident than non-ACE interns in performing a cardiac exam, assessing the jugular venous pressure, distinguishing ‘a’ from ‘v’ waves, and classifying systolic murmurs as crescendo-decrescendo or holosystolic. Only ACE interns had a significant improvement in score on the mid-year CE.
A comprehensive bedside cardiopulmonary physical diagnosis curriculum improved trainee attitudes, confidence and skill in the cardiopulmonary examination. These results provide an opportunity to re-examine the way physical examination is taught and assessed in residency training programs.
Sir William Osler stated that “Medicine is learned by the bedside and not in the classroom.” However, this tenet is being challenged in the modern hospital. The time that residents spend in direct contact with patients has decreased from over 20% of their workday in the 1990s to less than 10% in recent years [2,3,4,5]. Many factors contribute to this shift away from the bedside, including the electronic health record (EHR), duty hour regulations, and operational pressures in academic medical centers [6,7,8,9,10,11,12]. It is not surprising that less time at the bedside has contributed to a measurable decline in physical exam skills [13,14,15,16,17], in part due to a decreased emphasis on physical diagnosis teaching and practice [7, 15, 18, 19]. Alarmingly, some studies have shown that physical exam skills, particularly cardiovascular exam skills, peak during medical school and decline during residency and beyond [20,21,22]. Physical exam findings directly and immediately affect patient outcomes; a decline in exam skills could have adverse effects on patient care [23, 24]. In addition to its enduring importance in patient care, the physical exam is a ritual that plays an integral role in developing a meaningful and therapeutic relationship with a patient [25, 26]. This relationship is threatened by less time at, and less emphasis on, the bedside encounter.
The usual approach to teaching the physical exam involves an introduction to basic techniques during the first two years of medical school, followed by more focused bedside experiences during the clinical years. The United States Medical Licensing Examination Clinical Skills (USMLE-CS) examination requires medical students to examine standardized patients but does not directly assess their ability to correctly identify abnormalities on real patients. There is no standardized curriculum or formal assessment of physical examination skills mandated by the Accreditation Council for Graduate Medical Education (ACGME) for US residency training programs. Many internal medicine residency programs do not have physical examination curricula, relying instead on individual attendings to provide instruction in physical exam technique and interpretation, which leads to wide variability in trainee experience.
Given the high prevalence of cardiopulmonary disease in United States hospitals, improving cardiopulmonary exam skills among trainees has the potential to meaningfully impact a large number of patients. Cardiovascular physical exam skills peak during medical school and decline thereafter in non-cardiologists [20,21,22], providing an important opportunity for an educational intervention. The Advancing Bedside Cardiopulmonary Examination Skills (ACE) curriculum was developed to improve the cardiopulmonary physical diagnosis skills of trainees in a large internal medicine training program.
Physical examination instruction for residents prior to ACE
Before the introduction of ACE, there was no formal physical diagnosis curriculum for internal medicine residents in our program. Residents were exposed to physical diagnosis teaching in several settings, most notably during teaching rounds, which occur with an attending physician every morning, and during weekly activities with assigned faculty members. However, this experience was limited by the preferences of the attending and the findings of patients admitted to the service. There was no standardized approach to ensure that all residents received instruction in the same techniques and were able to accurately elicit and interpret physical exam findings.
Curriculum development of ACE
ACE was designed using a formal curriculum development process for medical education. The goals of ACE are to increase trainees’ appreciation for the importance of time spent at the bedside and to improve trainees’ use of the physical exam to diagnose cardiopulmonary disease. Table 1 lists the objectives of ACE; objectives 1–3 are the focus of the current manuscript. All interns were invited to participate in ACE, including 53 from July 2015–June 2016 and 52 from July 2016–June 2017. Of these, 81.1% were in the categorical program (i.e., the standard 3-year internal medicine program), with the remainder in preliminary, primary care, or combined medicine-pediatrics programs.
Educational delivery methods
Interns participated in ACE as they rotated through a general medicine service at The Johns Hopkins Hospital in Baltimore, Maryland. The service is staffed by one hospitalist attending, two PGY-2s and four interns. All interns rotated on the service in either one four-week block or two separate two-week blocks. The ACE curriculum was delivered during four 30-min morning teaching sessions each week during the rotation. On average, ACE interns received a total of eight hours of dedicated physical exam instruction over 16 sessions. Approximately 75% of those sessions were at the bedside of hospitalized patients with physical exam findings. Standardized patients were not used in the ACE curriculum. ACE sessions did not replace other program-based teaching sessions, such as daily noon conferences and intern report. Six experienced faculty in Pulmonary, Cardiology and General Internal Medicine facilitated the majority of the sessions. Learning activities included:
Introduction to the Cardiac Exam: Review of basic skills on a healthy volunteer or inpatient.
Introduction to the Pulmonary Exam: Review of basic skills on a healthy volunteer or inpatient.
Bedside Cardiopulmonary Physical Diagnosis Sessions: Review of physical exam maneuvers, exam findings and their relevance to patient care using hospitalized inpatients.
Physical Diagnosis and Echocardiography: Review of echocardiographic findings of a patient examined during a previous bedside session, emphasizing physiologic correlations, utility and limitations.
Mornings with the Masters: Bedside sessions with specialists from Pulmonary Medicine, Rheumatology, Nephrology, Geriatrics, General Internal Medicine, Neurology and Infectious Disease.
Interactive Computer Learning and High-Fidelity Simulation: Interactive session using the Harvey mannequin (Laerdal, Wappingers Falls, NY) or an online cardiovascular module (Blaufuss, Rolling Hills Estates, CA).
Table 2 outlines a typical 2-week ACE schedule. Subsequent 2-week blocks replaced the introductory sessions with additional bedside cardiac and pulmonary sessions, depending on the learners present. In addition, learners accessed optional online materials, including an interactive tutorial and cases from Blaufuss. Supplemental optional readings emphasized proper technique, the evidence behind maneuvers, and their relationship to physiology [30, 31].
The authors searched the literature and were unable to find an existing attitudinal instrument regarding the cardiopulmonary exam. A 14-item survey was designed to assess attitudes and confidence surrounding the cardiopulmonary exam. A team of content experts reviewed the final instrument to enhance content validity (BG, SD, EK, MC). Each item used a 5-point Likert scale to rate agreement or disagreement with a statement about the cardiopulmonary examination. The survey was administered to interns prior to the start of the 2015–2016 and 2016–2017 academic years, and to PGY-2s who had just completed their 2014–2015 intern year. The survey was re-administered to interns halfway through intern year in 2015–2016 and 2016–2017.
Cardiovascular skills assessment
The Blaufuss Cardiovascular Examination (CE) consists of 50 questions divided into four categories: physiology, auditory, visual and integration. Questions contain recordings of heart sounds as well as videos of the neck and precordium. Blaufuss developed the assessment by reviewing a 1993 published survey of internal medicine (IM) program directors and Accreditation Council for Graduate Medical Education requirements for IM and cardiology fellowship training. The assessment was reviewed and modified by six academic cardiologists. It has been delivered to over one thousand medical students, graduate trainees, and practicing physicians as a measure of cardiac exam skill. In general, performance on the assessment peaks during the third year of medical school, compared to other medical school years, internal medicine residency and general practice. Only cardiology fellows and cardiology faculty outperform other groups on the assessment [20,21,22]. Incoming interns from 2015–2016 and 2016–2017 took the CE two weeks prior to the start of intern year and at the midpoint of the year. PGY-2s took the CE within two weeks of completion of their 2014–2015 intern year.
Survey responses were compared using Mann-Whitney rank sum tests and Kruskal-Wallis one-way analysis of variance on ranks using the Likert scale median response. Dunn’s method was used for pairwise comparisons. CE data were analyzed using Mann-Whitney rank sum tests for intern and PGY-2 results and paired t-tests for intern mid-year CE assessments (where pre- and post-tests were available). Multilinear models stratified by exposure to ACE were run to compare overall test scores (post vs. pre) for each stratum while adjusting for intern year, designation, and total pre-score. Generalized estimating equations were used to account for the repeated measures in the data (pre- and post-test score for each observation). Multivariate linear regression was used to examine the effect of individual variables such as pre-test score, categorical versus other intern status, year of internship, gender, weeks participating in ACE, and weeks on ICU services on change in CE score at the midpoint of the year. A final adjusted model was fit including covariates that were significant at the alpha = 0.5 level in crude models. Hopkins residents’ pre-ACE performance on the CE was also compared with data reported in the literature on internal medicine residents who took the CE assessment [20, 22], using independent samples t-tests with pooled variances. A p-value less than 0.05 was considered statistically significant for all comparisons. Analyses were conducted using Sigmaplot (Systat Software, San Jose, CA) and SAS (SAS Institute, Inc., Cary, NC).
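As a rough illustration of the two core comparisons described above, an unpaired rank-sum test between groups and a paired t-test on pre-/mid-year scores, the analysis can be sketched in Python with SciPy. This is not the authors' SigmaPlot/SAS code, and all scores below are hypothetical placeholders:

```python
# Sketch of the primary statistical comparisons (hypothetical data).
from scipy import stats

# Hypothetical overall CE scores (percent correct).
intern_pre = [55, 60, 62, 58, 65, 70, 61, 59, 63, 66]
intern_mid = [60, 66, 67, 63, 70, 75, 65, 64, 69, 71]  # same interns, mid-year
pgy2_pre = [54, 58, 60, 57, 62, 66, 59, 61]

# Unpaired comparison of interns vs. PGY-2s (Mann-Whitney rank sum test).
u_stat, u_p = stats.mannwhitneyu(intern_pre, pgy2_pre, alternative="two-sided")

# Paired comparison of pre- vs. mid-year scores for the same interns.
t_stat, t_p = stats.ttest_rel(intern_mid, intern_pre)

print(f"Mann-Whitney U = {u_stat:.1f}, p = {u_p:.3f}")
print(f"Paired t = {t_stat:.2f}, p = {t_p:.4f}")
```

The paired test is the appropriate choice for the mid-year CE comparison because the same interns were measured twice; the rank-sum test makes no normality assumption for the between-group Likert and score comparisons.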
Baseline assessment of interns and PGY-2s
All 53 interns from 2015–2016 (100%), all 52 interns from 2016–2017 (100%), and 29 PGY-2s from 2015–2016 (60.4%) who had not previously participated in ACE completed the survey. Results are shown in Table 3. Interns and PGY-2s “strongly agreed” that the cardiopulmonary exam is an important part of patient assessment and that improving exam skills is an important goal for the next year of training. Both groups “somewhat agreed” they had received adequate training in the cardiopulmonary exam. Compared to the interns, PGY-2s felt more confident in their ability to distinguish systolic from diastolic murmurs (p = 0.023) and to characterize a systolic murmur as holosystolic or crescendo-decrescendo (p = 0.01). PGY-2s were also more comfortable with the jugular venous pressure (JVP) examination (p < 0.001), and in distinguishing ‘a’ waves from ‘v’ waves (p < 0.001).
52 interns from 2015–2016 (98%), 48 interns from 2016–2017 (92%) and 21 PGY-2s from 2015–2016 (44%) completed the cardiovascular examination (CE). Intern and PGY-2 scores were similar overall and in all individual categories (see Fig. 1). Overall scores of interns and PGY-2s in our program did not differ significantly from those of the 451 internal medicine residents for whom scores were available in the literature (see Table 4) [20, 22].
Mid-year assessment of PGY-1s
The mid-year self-assessment survey was completed by 38 interns from 2015–2016 (72.0%) and 19 interns from 2016–2017 (36.5%) (see Table 5). At mid-year, 36 (63.2%) had participated in ACE (“ACE interns”), and 21 (36.8%) had not (“non-ACE interns”). Compared to non-ACE interns, ACE interns more strongly agreed that they had received adequate training in the cardiopulmonary exam (p = 0.001). Non-ACE interns agreed less strongly with this statement at the midpoint compared to the beginning of the year (p = 0.001).
ACE interns were more confident in their ability to perform a cardiac exam (p = 0.039), assess the JVP (p < 0.001), distinguish ‘a’ waves from ‘v’ waves (p < 0.001), and classify systolic murmurs as holosystolic or crescendo-decrescendo (p = 0.022). ACE interns felt more confident in their ability to distinguish a pleural effusion from consolidation on exam (p = 0.048).
38 interns from 2015–2016 (72.0%) and 36 interns from 2016–2017 (69%) completed the CE at the midpoint of the year. 71 had completed the pre-year assessment and were included in paired analyses (see Table 6 for demographic data of mid-year participants). A total of 51 interns had rotated through ACE by the mid-point assessment, compared with 20 who had not yet rotated through ACE. Results are shown in Fig. 2. Overall, interns scored significantly higher on the mid-year CE compared with their pre-year performance (intern mid-year mean 67.0 [SD 10.84], intern pre-year mean 62.42 [SD 10.31], p = 0.002). This difference was accompanied by an increase in auditory scores (intern mid-year auditory mean 78.62 [SD 13.27], intern pre-year auditory mean 73.96 [SD 12.81], p = 0.011). The intern mid-year overall score was also significantly higher than the pre-year PGY-2 performance (intern mid-year mean for all tests 66.74 [SD 10.77], PGY-2 pre-year mean 59.47 [SD 11.95], p = 0.012). Intern mid-year visual and integration scores were also significantly higher than PGY-2 pre-year scores (intern mid-year mean for all tests 80.28 [SD 13.47], PGY-2 pre-year mean 74 [SD 17.32], p = 0.004 for visual scores; intern mid-year mean for all tests 80.58 [SD 10.58], PGY-2 pre-year mean 76.60 [SD 12.40], p = 0.026 for integration scores).
ACE interns’ mid-year overall scores were significantly higher than PGY-2 pre-year scores (ACE mid-year overall mean 67.71 [SD 11.04], PGY-2 pre-year overall mean 59.47 [SD 11.95], p = 0.027). Non-ACE mid-year overall scores were not significantly higher than PGY-2 pre-year scores (non-ACE mid-year overall mean 64.45 [SD 9.97], PGY-2 pre-year overall mean 59.47 [SD 11.95], p = 0.186). In paired analyses using data from interns who completed the pre- and mid-year assessments, only ACE interns had a significant increase in their overall score at the mid-year assessment (ACE mid-year overall mean 67.71 [SD 11.04], ACE pre-year overall mean 60.90 [SD 11.38], p = 0.019). ACE interns also had a significant increase in auditory and visual scores (ACE mid-year auditory mean 78.65 [SD 12.31], ACE pre-year auditory mean 72.67 [SD 12.45], p = 0.014; ACE mid-year visual mean 82.17 [SD 13.08], ACE pre-year visual mean 73.78 [SD 16.87], p = 0.034).
In a multilinear model using generalized estimating equations, ACE interns had post-ACE scores that were on average 4.98 points higher than their pre-ACE score (p = 0.001). Holding all other factors constant, ACE interns in the second year of the curriculum had scores that were on average 2.88 points lower than interns who participated in the first year (p = 0.0077). ACE categorical interns had scores that were on average 5.95 points higher than non-categorical interns (p = 0.0098). ACE pre-score was a significant predictor of post-test score (p < 0.001). Non-ACE interns had post-test scores that were on average 5.4 points higher than their pre-year scores, but this difference was not significant. Non-ACE pre-score was a significant predictor of post-test score (See Table 7).
In an adjusted multivariate linear regression model, two variables significantly predicted overall change in score for all interns: pre-test score and categorical status. The difference between the pre- and post-test scores became smaller for every unit increase in pre-test score (p < 0.001). Categorical interns had a change in score that was on average 7.61 points greater than non-categorical interns (p = 0.0185). Participation in the ACE curriculum was not a significant predictor of change in score in this model (see Table 8).
In the current academic medical center, many factors pull physicians, particularly trainees, away from the bedside. As physicians spend less time with patients, opportunities to practice and teach core skills such as the physical exam have declined. This has contributed to a decline in skill among both trainees and practitioners, and contributes to diagnostic error. While technological advances have dramatically improved our ability to diagnose disease, the physical exam still outperforms technology in a number of important instances. In addition to its diagnostic importance, the physical exam is a vital component of the patient-physician relationship. It helps to build rapport and trust, and if performed properly, can even contribute to a patient’s overall sense of well-being. Spending time at the bedside of patients also provides meaning for a physician’s work. The alarming rise in physician burnout in recent years may partly reflect this shift away from the bedside [34, 35]. There is a growing international movement to bring physicians and trainees back to the bedside. The present study is a direct outgrowth of that movement, and supports the hypothesis that a dedicated curriculum coupled with regular practice improves physical diagnosis skills.
It was encouraging that both interns and PGY-2s strongly agreed that improving their physical examination skills was an important goal for the next year of their training. This suggests that the introduction of the ACE curriculum was a timely intervention for the residency program. Interestingly, interns and PGY-2s reported a similar level of confidence in their ability to perform a cardiac and pulmonary exam, but reported differences in confidence surrounding specific maneuvers such as auscultation of murmurs and the jugular venous pulse examination. The fact that PGY-2s in this study were more confident on certain exam skills than incoming interns but performed at the same level on the CE might reflect a lack of emphasis on physical diagnosis teaching and practice prior to the start of the ACE curriculum. This gap between confidence and competence is an important one to address for trainees, as it directly impacts the safety and quality of patient care [37, 38].
Interns who participated in ACE had improved confidence in their physical exam skills. Interns who had not yet participated in ACE had diminished confidence compared to their peers but also compared to the start of internship. This might reflect a realization that non-ACE interns had not received adequate physical examination training to that point in their careers. The areas in which confidence differed between interns and PGY-2s as well as between ACE and non-ACE interns tended to be areas of the exam that are more technically challenging (e.g. distinguishing ‘a’ waves from ‘v’ waves). This might provide some insight into the design of future curricula that focus on more difficult exam maneuvers and techniques.
The finding that ACE interns had a significant improvement in their mid-year CE scores provides evidence that the curriculum was effective in improving exam skills. The fact that ACE interns had a significant improvement in their mid-year CE scores compared to PGY-2s who had completed intern year prior to ACE, further suggests that the ACE curriculum was more effective than the previous approach to physical exam teaching in our residency program. Since most training programs utilize the “traditional” approach to physical exam teaching, this could inform the development of similar curricula at other institutions.
In the stratified multilinear models with generalized estimating equations, only ACE interns had a significant increase in their post-test scores. However, in our regression model that combined both ACE and non-ACE interns, ACE participation did not significantly predict change in CE score. Since this was not a randomized trial, but a quasi-randomized study based on pre-determined intern schedules, other factors may have limited our ability to see a significant effect of the ACE curriculum in the final regression model. It is not surprising that categorical status predicted a higher change in score, since categorical interns spend more time on the internal medicine services during their intern year. It is also not surprising that pre-year score was a predictor of change in score: lower performers at the outset had more room to improve than interns with a higher pre-year score.
This study has several limitations. This was a single-center experience, so the results may not be applicable to other institutions. However, the fact that interns and residents from our program performed similarly to other internal medicine residents reported in the literature before exposure to ACE suggests that our trainees were starting from a similar knowledge and skill level before introduction to the curriculum.
Our survey, while developed and reviewed by faculty who are experts in the physical exam, has only been used on a single population and lacks robust validity evidence. It was not possible to correlate individual responses with CE performance as the survey was anonymous.
We captured nearly 100% of the interns on the initial assessments, but only 60% and 44% of PGY-2s from 2015–2016 participated in the survey and CE, respectively. We also lost several interns at the mid-year assessments because clinical schedules made it difficult for some to participate. This may have left our final analyses underpowered to detect differences in overall scores, particularly for the non-ACE interns. Non-categorical residents were over-represented in the non-ACE population at the mid-year point, which may also have affected our results.
Interns and teaching attendings rotate with one another on other services. It is likely that exam skills emphasized during the general medicine rotation where ACE is delivered were taught to non-ACE interns by faculty and peers. This spillover effect probably increased intern skills independent of ACE participation, and is a recognized benefit of the current residency training model. Since the ACE curriculum was not blinded to participants, it is possible that the survey results were biased by the expectation that participation in a dedicated physical exam curriculum would improve confidence and skill.
Familiarity with the Blaufuss software used during ACE may have allowed ACE interns to score higher on the mid-year assessment. Since the Blaufuss software was used only once per rotation, it seems unlikely that this played a major role in the final results. It is also possible that taking the CE a second time might have led to increased scores independent of participation in ACE. However, in previous studies using the CE, control groups did not have improvements in scores upon taking the test a second time.
It is possible that the improvements in ACE intern performance could have been due to other aspects of the general medicine teaching service that were not part of the ACE curriculum. Rounds on the ACE service were similar to rounds on other hospitalist-led services. Aside from the ACE curriculum, the teaching activities on that service were shared activities with other rotations such as daily noon conference and intern report. Hospitalist attendings rotated through the ACE service as well as the other hospitalist teaching services. The patient population on the ACE service was similar to other general medicine services. As a result, it seems more likely that any changes in ACE intern performance were due to the ACE curriculum.
Implementation of a bedside cardiopulmonary physical diagnosis curriculum improved the attitudes, confidence and skill of interns in the cardiopulmonary examination. The effectiveness of the ACE curriculum can inform further improvements in physical exam teaching and assessment, and lay the groundwork to examine the effect of these interventions on important metrics such as test utilization, cost of care, patient and provider satisfaction, diagnostic error, and clinical outcome.
ACE: Advancing Bedside Cardiopulmonary Examination Skills
ACGME: Accreditation Council for Graduate Medical Education
EHR: Electronic health record
JVP: Jugular venous pressure
PGY-1: Post-graduate year 1 (intern)
PGY-2: Post-graduate year 2
USMLE-CS: United States Medical Licensing Examination – Clinical Skills
Thayer WS. Osler and other papers. Baltimore: The Johns Hopkins press; 1931.
Block L, Habicht R, Wu A, et al. In the Wake of the 2003 and 2011 Duty Hours Regulations, How Do Internal Medicine Interns Spend Their Time? J Gen Intern Med. 2013;28(8):1042–7.
Lurie N, Rank B, Parenti C, Woolley T, Snoke W. How Do House Officers Spend Their Nights? N Engl J Med. 1989;320(25):1673–7.
Parenti C, Lurie N. Are things different in the light of day? A time study of internal medicine house staff days. Am J Med. 1993;94(6):654–8.
Mamykina L, Vawdrey DK, Hripcsak G. How Do Residents Spend Their Shift Time? A Time and Motion Study With a Particular Focus on the Use of Computers. Acad Med. 2016;91(6):827–32.
Reed DA, Levine RB, Miller RG. Effect of residency duty-hour limits: Views of key clinical faculty. Arch Intern Med. 2007;167(14):1487–92.
Desai SV, Feldman L, Brown L. Effect of the 2011 vs 2003 duty hour regulation-compliant models on sleep duration, trainee education, and continuity of patient care among internal medicine house staff: A randomized trial. JAMA Intern Med. 2013;173(8):649–55.
Elder A, Chi J, Ozdalga E, Kugler J, Verghese A. The road back to the bedside. JAMA. 2013;310(8):799–800.
Verghese A. Culture Shock - Patient as Icon, Icon as Patient. N Engl J Med. 2008;359(26):2748–51.
Verghese A, Brady E, Kapur CC, Horwitz RI. The Bedside Evaluation: Ritual and Reason. Ann Intern Med. 2011;155(8):550–3.
Chi J, Verghese A. Clinical education and the electronic health record: the flipped patient. JAMA. 2014;312(22):2331–2.
Ouyang D, Chen JH, Hom J, Chi J. Internal medicine resident computer usage: An electronic audit of an inpatient service. JAMA Intern Med. 2016;176(2):252–4.
Johnson JE, Carpenter JL. Medical house staff performance in physical examination. Arch Intern Med. 1986;146(5):937–41.
St Clair EW, Oddone EZ, Waugh RA, Corey GR, Feussner JR. Assessing housestaff diagnostic skills using a cardiology patient simulator. Ann Intern Med. 1992;117(9):751–6.
Mangione S, Nieman LZ, Gracely E, Kaye D. The teaching and practice of cardiac auscultation during internal medicine and cardiology training: A nationwide survey. Ann Intern Med. 1993;119(1):47–54.
Mangione S, Nieman LZ. Cardiac auscultatory skills of internal medicine and family practice trainees: A comparison of diagnostic proficiency. JAMA. 1997;278(9):717–22.
Ramani S, Ring BN, Lowe R, Hunter DA. Pilot Study Assessing Knowledge of Clinical Signs and Physical Examination Skills in Incoming Medicine Residents. Journal of Graduate Medical Education. 2010;2(2):232–5.
Gonzalo JD, Masters PA, Simons RJ, Chuang CH. Attending Rounds and Bedside Case Presentations: Medical Student and Medicine Resident Experiences and Attitudes. Teaching and learning in medicine. 2009;21(2):105–10.
Mazotti LA, Vidyarthi AR, Wachter RM, Auerbach AD, Katz PP. Impact of duty-hour restriction on resident inpatient teaching. J Hosp Med. 2009;4(8):476–80.
Vukanovic-Criley JM, Criley S, Warde C. Competency in cardiac examination skills in medical students, trainees, physicians, and faculty: A multicenter study. Arch Intern Med. 2006;166(6):610–6.
Vukanovic-Criley JM, Boker JR, Criley SR, Rajagopalan S, Criley JM. Using Virtual Patients to Improve Cardiac Examination Competency in Medical Students. Clin Cardiol. 2008;31(7):334–9.
Vukanovic-Criley JM, Hovanesyan A, Criley SR, et al. Confidential Testing of Cardiac Examination Competency in Cardiology and Noncardiology Faculty and Trainees: A Multicenter Study. Clin Cardiol. 2010;33(12):738–45.
Reilly BM. Physical examination in the care of medical inpatients: an observational study. Lancet. 2003;362(9390):1100–5.
Verghese A, Charlton B, Kassirer JP, Ramsey M, Ioannidis JPA. Inadequacies of Physical Examination as a Cause of Medical Errors and Adverse Events: A Collection of Vignettes. Am J Med.
Kugler J, Verghese A. The Physical Exam and Other Forms of Fiction. J Gen Intern Med. 2010;25(8):756–7.
Ofri D. The Physical Exam as Refuge. The New York Times. July 10, 2014, 2014;Well.
United States Medical Licensing Examination. USMLE Step 2 CS. 2015.
Pfuntner A, Wier LM, Stocks C. Most Frequent Conditions in U.S. Hospitals. In: 2010: Statistical Brief #148; 2013.
Kern DE, Thomas PA, Hughes MT. Curriculum Development for Medical Education: A Six-Step Approach, vol. 2. Baltimore: Johns Hopkins University Press; 2009.
McGee S. Evidence-Based Physical Diagnosis. Philadelphia, PA: Elsevier; 2012.
Seidel's Guide to Physical Examination. St. Louis, MO: Mosby; 2015.
Russell SG. Brian. The Other Sylvian Fissure: Exploring the Divide Between Traditional and Modern Bedside Rounds. South Med J. 2016;109(12):3.
Edlow JA, Newman-Toker D. Using the Physical Examination to Diagnose Patients with Acute Dizziness and Vertigo. The Journal of Emergency Medicine. 2016;50(4):617–28.
Hipp DM, Rialon KL, Nevel K, Kothari AN, Jardine LDA. "Back to Bedside": Residents' and Fellows' Perspectives on Finding Meaning in Work. J Grad Med Educ. 2017;9(2).
Rosenthal DI, Verghese A. Meaning and the Nature of Physicians’ Work. N Engl J Med. 2016;375(19):1813–5.
Elder A. Bedside Medicine: Back to the Future? South Med J. 2016;109(12).
Marel GM, Lyon PM, Field MJ, Barnsley L, Hibbert E, Parise A. Clinical skills in early postgraduate medical trainees: patterns of acquisition of confidence and experience among junior doctors in a university teaching hospital. Med Educ. 2000;34(12):1013–5.
Barnsley L, Lyon PM, Ralston SJ, et al. Clinical skills in junior medical officers: a comparison of self-reported confidence and observed competence. Med Educ. 2004;38(4):358–67.
The authors would like to thank John Michael Criley, MD, Stuart R. Criley, MBA and Jasminka M. Vukanovic-Criley, MD for their support in obtaining the cardiovascular assessment tool and interactive software from Blaufuss Medical Multimedia Laboratories, and their advice on how to incorporate the software into the curriculum. The authors would also like to thank the interns, residents, hospitalist attendings, nurses, administrators and patients from the general medicine service for participating in the curriculum and for contributing to its continued improvement.
Brian T. Garibaldi, MD has received funding from the Berkheimer Faculty Education Scholar Award from the Johns Hopkins Institute for Excellence in Education (IEE) as well as the Jeremiah Barondess Fellowship in the Clinical Transaction from the New York Academy of Medicine (NYAM) and the Accreditation Council for Graduate Medical Education (ACGME).
The IEE, NYAM and ACGME did not play a role in the design of the study or the collection, analysis, and interpretation of data.
Availability of data and materials
The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.
BG is the associate program director for the Osler Medical Residency Program at The Johns Hopkins Hospital. He is co-president and founder of the Society of Bedside Medicine (SBM), a newly formed organization dedicated to teaching, research and innovation in the bedside encounter.
Ethics approval and consent to participate
A Johns Hopkins Medicine Institutional Review Board designated the project as “Quality Improvement” (IRB00090910), and the need for ethical approval was waived. Only verbal consent was obtained from survey participants. Written consent was not obtained based on the IRB designation as “Quality Improvement”. The survey was administered online and was completely anonymous.
Competing interests
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.