
Comparison of self and simulated patient assessments of first-year medical students’ Interpersonal and Communication Skills (ICS) during Objective Structured Clinical Examinations (OSCE)

Abstract

Background

Interpersonal and communication skills (ICS) are important core competencies in medical education and certification. In this study, we identified self- and simulated patient (SP)-reported ratings of US first-year medical students’ ICS and the influence of age and gender on performance appraisal during the Objective Structured Clinical Examination (OSCE).

Methods

OSCE participants, including 172 first-year medical students and 15 SPs, were asked to evaluate the students’ ICS using the American Board of Internal Medicine–Patient Satisfaction Questionnaire (ABIM–PSQ), electronically and on paper, respectively. Self- and SP-reported ratings of students’ ICS are presented as medians on a 5-point Likert scale and as three categories defined as “good,” “very good,” and “inadequate.”

Results

SPs assessed all 172 students in the OSCE, while 43.6% of students assessed their own performance. The majority of students and SPs evaluated the students’ ICS as very good. However, 23.3% of SPs, compared with 5.3% of students, rated the medical students’ ability to encourage patient question-asking and answer questions as inadequate (P < 0.002). Neither age nor gender influenced the medical students’ self-assessment of ICS. Female SPs assigned lower scores to students with regard to respecting patients and encouraging patient question-asking and answering. Older SPs were more likely to assign lower scores on all survey questions.

Conclusions

In the present study, self- and SP-reported ratings of first-year medical students’ ICS were mainly “very good,” with no influence of students’ age or gender. Older age and female gender among the SPs were associated with lower SP-reported ratings of students’ ICS.


Background

The demonstration of effective interpersonal and communication skills (ICS) is one of the core competencies in both pre- and post-graduate medical education [1], as well as physician certification [2]. Since its introduction in 1975, the Objective Structured Clinical Examination (OSCE) has been generally accepted for the measurement of medical students’ ICS [3,4,5]. Multiple studies have acknowledged the inadequate accuracy of existing tools for assessing medical students’ ICS during OSCEs [6,7,8,9]. For instance, a recent systematic review notes the insufficient psychometric properties of 8 rating scales used to assess medical students’ ICS during OSCEs [6]. No consensus exists among institutions on the employment of faculty examiners, simulated patients (SPs), or both to evaluate medical students’ ICS [7]. An analysis of various checklists used in OSCEs identified only fair to moderate agreement between raters in the assessment of medical students’ ICS, making comparisons of medical students within and across institutions difficult [7]. The quality of the different scales used in the OSCE is questionable, possibly because an acceptable level of adequate physician-patient communication has not been defined [10].

SPs participate in OSCEs and are an integral part of the curriculum in medical education [11]. Since there is an acceptable level of agreement between faculty examiners and SPs in the assessment of medical students’ ICS, SPs’ satisfaction scores may be a reliable indicator of medical students’ ICS [8]. Furthermore, self-assessment of ICS by medical students has been identified as an important component of medical education and the development of self-directed learning skills [9, 12]. One study that compared medical students’ self-reported ratings with SP- and observer-assigned OSCE checklist ratings demonstrated that students scored their communication skills lower than observers or SPs did in 2 out of 12 categories [13].
A study from Norway that used video recordings demonstrated poor concordance between graduating medical students’ self-reported scores and SP-assigned scores with regard to ICS [9]. From the perspective of first-year medical students, reviewing video recordings of their interactions with SPs to self-assess their ICS was feasible, practical, and informative [12]. A better understanding of the ICS of medical students with limited clinical exposure could be essential for adjusting curricula to ensure appropriate ICS development before students start clinical rotations. Another recent study identified a need for the early introduction of simulation-based training to develop ICS in preclinical medical curricula [14]. First-year medical students have significantly less positive attitudes toward ICS training and less perceived confidence about communicating with patients than fourth-year medical students [15]. To the best of our knowledge, no published studies have investigated the differences between, and the factors influencing, ICS ratings provided by first-year medical students themselves and by SPs. The purpose of this study was to compare the self-reported and SP-assigned ratings of first-year medical students’ ICS during an OSCE and to determine the influence of age and gender on performance appraisal. We utilized five questions from the American Board of Internal Medicine–Patient Satisfaction Questionnaire (ABIM-PSQ) [16] that were previously used in Japan to assess medical students’ ICS as rated by actual clinic patients [17, 18]. We assumed that the ABIM-PSQ, which was designed to survey actual patients, could provide valuable information about the ICS of medical students and help identify directions for improvement prior to clinical exposure. This is the first study to use the ABIM-PSQ to collect and analyze both SP-assigned and self-assigned ratings of first-year medical students’ ICS during an OSCE.

Methods

In this cross-sectional study, we compared the self-reported and SP-assigned ratings of first-year medical students’ ICS and evaluated the effect of demographic covariates (age and gender) on the medical students’ and SPs’ responses. The survey protocol was approved as exempt from full review by the Rutgers Health Sciences Institutional Review Board because the investigation is based on anonymous responses from the first-year medical students and SPs who participated in the OSCE (Protocol #2018002140).

OSCE setting

The simulated clinical encounter was conducted in an OSCE setting with a 15-min time allotment for the medical student–SP interview. At our institution, first-year medical students’ ICS in OSCEs are assessed only by faculty; neither SPs nor the students themselves score ICS.

Survey instruments

The ABIM–PSQ is a reliable and validated tool for evaluating global communication skills in physicians [16]. We used five questions (Table 1) [17, 18] from the original 10-item ABIM–PSQ because first-year medical students are only responsible for eliciting a patient history and performing a basic physical examination on SPs during the OSCE. Previous studies have suggested that the reliability of the ABIM–PSQ is not compromised as long as at least five items from the original 10-item questionnaire are scored [16, 18]. As shown in Table 1, three questions measure patient-centered humanistic behavior: greeting and friendliness (Q1), respect for patients (Q2), and careful listening (Q4). The other questions measure the personal interest displayed towards the SP (Q3) and the encouragement of patient question-asking and the answering of questions (Q5). We used a 5-point Likert scale (poor = 1, fair = 2, good = 3, very good = 4, excellent = 5) to evaluate the medical students’ ICS according to each question-based statement. Finally, we asked medical students and SPs to indicate their age and gender. The questionnaire was identical for medical students and SPs.

Table 1 Items selected from the ABIM–PSQ [17, 18]

Survey implementation

Written informed consent that explained the goal and voluntary nature of this anonymous survey study accompanied the analogous questionnaires, which were distributed to the medical students electronically and to the SPs on paper. We surveyed 172 first-year medical students and 15 SPs who had participated in the second OSCE of the first-year medical school curriculum; the SPs did not directly assess the students’ ICS during the OSCE because, at our institution, first-year medical students’ ICS in OSCEs are measured by faculty physicians. The member of the research team who distributed and collected the survey responses from the SPs did not participate in any OSCE-related activities. No prior knowledge about the study was provided to the SPs or medical students; both were oriented to the expectations of the OSCE but did not see the survey questions in advance.

We provided an anonymous link to the survey questionnaire (https://rutgers.qualtrics.com) to the medical students on the day of the OSCE. Reminder messages were sent to all of the first-year medical students at 1 and 2 weeks after the original request. The SPs were asked to anonymously complete a paper-and-pencil survey for each student after testing during the OSCE. Therefore, the self-assigned and SP-given ratings of first-year medical students’ ICS during the OSCE were not linked. The responses from the SPs were collected at the end of the OSCE.

Statistical analysis

We used the Mann-Whitney U test to compare the median 5-point Likert-scale scores for each survey question between the first-year medical students and SPs. Gender-related differences in the responses of medical students and SPs were also calculated. We used the χ2 test to compare scores after collapsing them into three categories, following a previous study [19]: “very good” if the medical students or SPs responded “excellent” or “very good,” “inadequate” if they responded “poor” or “fair,” and “good” if they responded “good.” We also conducted a regression analysis to adjust the responses for the ages and genders of the study participants (medical students and SPs). Categorical data are presented as percentages (%) and continuous data as means and standard deviations (SD), while Likert-scale data are summarized by medians and interquartile ranges (IQR). Regression coefficients (β) and odds ratios (OR) with 95% confidence intervals (95% CI) are also used to present the results. We used Statistica 13.2 for Windows (StatSoft Inc., Tulsa, OK) to analyze the data. A two-tailed P-value of less than 0.05 was considered statistically significant.
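The two analyses described above (the distribution-level Mann-Whitney U comparison and the χ2 comparison of the three collapsed categories) can be sketched in Python. This is an illustrative example only, using made-up scores rather than the study’s data, and it does not reproduce the authors’ Statistica workflow.

```python
import numpy as np
from scipy.stats import mannwhitneyu, chi2_contingency

# Hypothetical 5-point Likert responses (1 = poor ... 5 = excellent)
students = np.array([4, 5, 4, 3, 5, 4, 4, 5, 3, 4])
sps      = np.array([3, 4, 2, 4, 3, 5, 2, 4, 3, 3])

# Compare the two score distributions (two-tailed Mann-Whitney U test)
u_stat, p_value = mannwhitneyu(students, sps, alternative="two-sided")

def categorize(score):
    """Collapse a 5-point score into the three categories used in the study."""
    if score >= 4:           # "excellent" or "very good"
        return "very good"
    if score == 3:
        return "good"
    return "inadequate"      # "fair" or "poor"

# Build a 2x3 contingency table of category counts, then apply the chi-square test
cats = ["inadequate", "good", "very good"]
table = [[sum(categorize(s) == c for s in grp) for c in cats]
         for grp in (students, sps)]
chi2, p_cat, dof, _ = chi2_contingency(table)
```

The categorization step mirrors the rule from [19]: “excellent”/“very good” collapse to “very good,” “poor”/“fair” collapse to “inadequate,” and “good” stays as is.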

Results

Among the 172 first-year medical students who were surveyed, 75 (43.6%) responded. Each SP evaluated between 9 and 15 first-year medical students (median = 12, IQR = 1). All respondents (first-year medical students and SPs) completed the full questionnaire, except for one female medical student who did not identify her age. The ages of the first-year medical students and SPs ranged from 21 to 33 years (23.6 ± 2.1 years) and 25 to 60 years (46.0 ± 12.0 years), respectively (P < 0.0001). Male subjects constituted 40% of first-year medical students and 47.1% of SPs (P = 0.30). The median responses of the medical students and SPs on the 5-point Likert scale (Fig. 1) and in the three-category comparison (Table 2) were comparable except for question 5. As shown in Table 2, 23.3% of the SPs assessed the medical students’ performance on question 5 as “inadequate,” compared to 5.3% of medical students (P < 0.002).

Fig. 1 Comparison of median responses between medical students and simulated patients using a 5-point Likert scale

Table 2 Comparison of medical students’ and simulated patients’ responses (% and 95% CI)

Demographic characteristics and medical students’ and SPs’ responses

The medical students’ self-assessment scores were not associated with age, and gender did not influence the medical students’ responses (eTable 1 in Supplementary Material). However, female SPs were more likely than their male counterparts to assign lower scores to the medical students on questions 2 and 5 (Fig. 2). These findings persisted even after controlling for the age of the SPs (Q2: OR 0.85, 95% CI 0.72–0.97; Q5: OR 0.84, 95% CI 0.74–0.99). Irrespective of gender, older SPs were more likely than their younger counterparts to assign lower scores to the medical students on all survey questions (Table 3).

Fig. 2 Gender differences in simulated patients’ responses to Q2 (a) and Q5 (b)

Table 3 Association of simulated patients’ ages with responses (P < 0.0001)

Discussion

This study found that the majority of first-year medical students and SPs evaluated the medical students’ greeting and friendliness, respect for patients, personal interest displayed towards the SP, and careful listening during the OSCE as “very good.” However, students were more likely to overestimate their ability to encourage patient question-asking and answer questions. Neither age nor gender influenced the students’ self-assessment of their ICS. On the other hand, older SPs were more likely to assign lower scores to students. Furthermore, female SPs assigned scores that were nearly 25% lower than those of male SPs in regard to respecting patients and encouraging patient question-asking and answering questions.

Direct comparison of our findings with prior work is limited, because previous studies have not investigated self-reported and SP-assigned ratings of first-year medical students’ ICS during an OSCE using the ABIM-PSQ. Only one study has reported first-year medical students’ self-assessed strengths and weaknesses in ICS after viewing video recordings of their interactions with SPs [12]. In that study, more than 50% of students identified their ability to elicit information/cover important topics and personal connection/rapport as strengths, but few students recognized weaknesses in their ICS, especially in non-verbal communication such as paralanguage, kinesics, and facial expression. Several studies have discussed the ICS of medical students in their clinical years of medical school and echo the findings of our study. A study from Japan that used the ABIM-PSQ demonstrated that actual patients were more likely to assign low scores to medical students during their clinical rotations in regard to encouraging patient question-asking and answering questions [18].
A meta-analysis of 35 articles on medical students’ self-assessment accuracy, defined by correlation or by paired or independent means comparisons, revealed a tendency for students to overestimate their communication skills more than their performance on knowledge-based assessments [20]. Unfortunately, age- and gender-based analyses of medical students’ self-reported ICS were rarely reported [20]. A longitudinal study from Germany demonstrated that female medical students in their 6th semester had higher self-reported empathy scores than their male counterparts. In the same study, female medical students were rated higher than their male counterparts by SPs on all dimensions tested during the OSCE, which included empathy, content structure, verbal expression, and non-verbal expression [21]. Another study from Japan found that observers assigned higher ratings of ICS to female fifth-year medical students compared to their male counterparts [22]. Furthermore, a study from the U.S. reported that SPs assigned significantly higher ratings of empathy displayed during an OSCE to female third- and fourth-year medical students compared to their male counterparts, irrespective of gender and ethnicity [23]. Berg et al. [24] found that female third-year medical students received higher scores on all three measures of empathy during an OSCE compared to their male counterparts. We found no studies that investigated the role of SP age in the assessment of medical students’ ICS.

This study has several limitations, including the external validity of findings from a sample drawn from a single medical school. However, the gender distribution of our study sample is comparable to that of all US first-year medical students [25]. There is also a risk of response bias, since only 43.6% of first-year medical students responded to the survey, although the distributions of the respondents’ ages and genders were comparable to those of the first-year medical student class at large. Although complete anonymity increases response validity, it may also decrease the motivation to answer questions accurately [26, 27]. Because anonymity was preserved, we were not able to perform a paired analysis and instead compared independent median scores, which is a statistically weaker measure of self-assessment accuracy [20]. Categorical data could improve the accuracy of our findings [20]; however, determining the accuracy of our ICS measurements was not a goal of this study. We also recognize that collecting racial and ethnic data may be important for understanding the expectations of a culturally diverse patient population with regard to the ICS of future physicians. Moreover, we did not record whether the medical students finished their SP encounters within the 15-min time limit. In studies of OSCE perspectives, third- and final-year medical students reported that the time allotted for OSCE stations involving medical interviews was insufficient [28, 29]; it is difficult to comment on the influence of time constraints on ICS, however, because the OSCEs in those studies consisted of multiple stations that assessed clinical skills in addition to ICS. Nevertheless, the first-year medical students in our study may not have had the opportunity to encourage question-asking and answer questions due to time constraints. As a result, the disparity between self-assigned and SP-given ratings of performance in this domain may reflect an inability to finish the encounter on time rather than a deficit in ICS.

Conclusions

Although most of the self-reported and SP-given ratings of first-year medical students’ ICS were “very good,” nearly one-quarter of SPs rated the first-year medical students’ ability to encourage patient question-asking and answer questions as “inadequate.” Older age and female gender among the SPs were associated with lower ratings of medical students’ ICS, particularly in regard to encouraging patient question-asking and answering questions. The findings of this study may have important implications for medical education, including curriculum development to ensure that first-year medical students are prepared to facilitate patient-centered communication and shared treatment decision-making during their clinical rotations [30]. Teaching pre-clinical medical students to encourage patient question-asking and answer questions may therefore be important in advancing the ICS of future physicians.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

OSCE: Objective Structured Clinical Examination(s)

SP: Standardized Patient(s)

ICS: Interpersonal and Communication Skills

ABIM-PSQ: American Board of Internal Medicine–Patient Satisfaction Questionnaire

RWJMS: Rutgers–Robert Wood Johnson Medical School

SD: Standard Deviation(s)

OR: Odds Ratio(s)

IQR: Interquartile Range(s)

CI: Confidence Interval(s)

References

  1. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287:226–35.

  2. Duffy FD, Gordon GH, Whelan G, et al. Assessing competence in communication and interpersonal skills: the Kalamazoo II report. Acad Med. 2004;79:495–507.

  3. Harden RM, Stevenson M, Downie WW, et al. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1(5955):447–51.

  4. Hodges B. OSCE! Variations on a theme by Harden. Med Educ. 2003;37:1134–40.

  5. Turner JL, Dankoski ME. Objective structured clinical exams: a critical review. Fam Med. 2008;40:574–8.

  6. Cömert M, Zill JM, Christalle E, et al. Assessing communication skills of medical students in objective structured clinical examinations (OSCE) - a systematic review of rating scales. PLoS One. 2016;11:e0152717.

  7. Setyonugroho W, Kennedy KM, Kropmans TJ. Reliability and validity of OSCE checklists used to assess the communication skills of undergraduate medical students: a systematic review. Patient Educ Couns. 2015;98:1482–91.

  8. Gude T, Grimstad H, Holen A, et al. Can we rely on simulated patients' satisfaction with their consultation for assessing medical students' communication skills? A cross-sectional study. BMC Med Educ. 2015;15:225.

  9. Gude T, Finset A, Anvik T, et al. Do medical students and young physicians assess reliably their self-efficacy regarding communication skills? A prospective study from end of medical school until end of internship. BMC Med Educ. 2017;17:107.

  10. Deveugele M, Derese A, Maesschalck SD, Willems S, Van Driel M, De Maeseneer J. Teaching communication skills to medical students, a challenge in the curriculum? Patient Educ Couns. 2005;58:265–70.

  11. Shirazi M, Labaf A, Monjazebi F, Jalili M, Mirzazadeh M, Ponzer S, et al. Assessing medical students’ communication skills by the use of standardized patients: emphasizing standardized patients’ quality assurance. Acad Psychiatry. 2014;38:354–60.

  12. Zick A, Granieri M, Makoul G. First-year medical students’ assessment of their own communication skills: a video-based, open-ended approach. Patient Educ Couns. 2007;68:161–6.

  13. Ammentorp J, Thomsen JL, Jarbol DE, et al. Comparison of the medical students’ perceived self-efficacy and the evaluation of the observers and patients. BMC Med Educ. 2013;13:49.

  14. Nuzzo A, Tran-Dinh A, Courbebaisse M, et al. Improved clinical communication OSCE scores after simulation-based training: results of a comparative study. PLoS One. 2020;15:e0238542.

  15. Wright KB, Bylund C, Ware J, et al. Medical student attitudes toward communication skills training and knowledge of appropriate provider-patient communication: a comparison of first-year and fourth-year medical students. Med Educ Online. 2006;11(1):4594.

  16. PSQ Project Co-Investigators. Final report on the patient satisfaction questionnaire project. Philadelphia: American Board of Internal Medicine; 1989.

  17. Oda Y, Onishi H, Yamashiro S, et al. The assessment of undergraduate curriculum of communication skills evaluated by performance measurement using actual outpatient satisfaction. Gen Med. 2003;4:1–6.

  18. Oda Y, Onishi H, Sakemi T, et al. Improvement in medical students’ communication and interpersonal skills as evaluated by patient satisfaction questionnaire after curriculum reform. J Clin Biochem Nutr. 2014;55:14–29.

  19. Abadel FT, Hattab AS. Patients’ assessment of professionalism and communication skills of medical graduates. BMC Med Educ. 2014;14:28.

  20. Blanch-Hartigan D. Medical students' self-assessment of performance: results from three meta-analyses. Patient Educ Couns. 2011;84:3–9.

  21. Graf J, Smolka R, Simoes E, et al. Communication skills of medical students during the OSCE: gender-specific differences in a longitudinal trend study. BMC Med Educ. 2017;17:75.

  22. Sugawara A, Ishikawa K, Motoya R, et al. Characteristics and gender differences in the medical interview skills of Japanese medical students. Intern Med. 2017;56:1507–13.

  23. Berg K, Blatt B, Lopreiato J, et al. Standardized patient assessment of medical student empathy: ethnicity and gender effects in a multi-institutional study. Acad Med. 2015;90:105–11.

  24. Berg K, Majdan JF, Berg D, et al. Medical students’ self-reported empathy and simulated patients’ assessments of student empathy: an analysis by gender and ethnicity. Acad Med. 2011;86:984–8.

  25. Heiser S. The majority of U.S. medical students are women, new data show. Press release. https://www.aamc.org/news-insights/press-releases/majority-us-medical-students-are-women-new-data-show. Accessed 29 Feb 2020.

  26. Ong AD, Weiss DJ. The impact of anonymity on responses to sensitive questions. J Appl Soc Psychol. 2000;30:1691–708.

  27. Lelkes Y, Krosnick JA, Marx DM, et al. Complete anonymity compromises the accuracy of self-reports. J Exp Soc Psychol. 2012;48:1291–9.

  28. Skrzypek A, Szeliga M, Stalmach-Przygoda A, et al. The objective structured clinical examination (OSCE) from the perspective of 3rd year’s medical students - a pilot study. Folia Med Cracov. 2017;57(3):67–75.

  29. Majumder MAA, Kumar A, Krishnamurthy K, et al. An evaluative study of objective structured clinical examination (OSCE): students and examiners perspectives. Adv Med Educ Pract. 2019;10:387–97.

  30. Judson TJ, Detsky AS, Press MJ. Encouraging patients to ask questions. JAMA. 2013;309:2325.


Acknowledgements

Not applicable.

Funding

The author(s) received no specific funding for this work.

Author information

Affiliations

Authors

Contributions

JAR: Contributed to conception and design; collected the data and constructed the computerized database; contributed to data interpretation; created the manuscript; read and approved the manuscript; agrees to be accountable for all aspects of work ensuring integrity and accuracy. DC: Contributed to conception and design; read and approved the manuscript; agrees to be accountable for all aspects of work ensuring integrity and accuracy. CAT: Contributed to conception and design; read and approved the manuscript; agrees to be accountable for all aspects of work ensuring integrity and accuracy. AP: Contributed to study design; contributed to data analysis and interpretation; critically revised the manuscript; read and approved the manuscript; agrees to be accountable for all aspects of work ensuring integrity and accuracy.

Corresponding author

Correspondence to Anna Petrova.

Ethics declarations

Ethics approval and consent to participate

The Institutional Review Board of Rutgers–Robert Wood Johnson Medical School (RWJMS) approved the exempt survey study of the first-year medical students and SPs who participated in the OSCE. Written informed consent that explained the goal and voluntary nature of this anonymous survey study accompanied the analogous questionnaires that were distributed to the medical students and SPs electronically and via paper, respectively.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: eTable 1. Gender differences in medical students’ responses (median and IQR).

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.



Cite this article

Roshal, J.A., Chefitz, D., Terregino, C.A. et al. Comparison of self and simulated patient assessments of first-year medical students’ Interpersonal and Communication Skills (ICS) during Objective Structured Clinical Examinations (OSCE). BMC Med Educ 21, 107 (2021). https://doi.org/10.1186/s12909-021-02540-y


Keywords

  • Evaluation
  • Self-assessment, medical students
  • Communication skills
  • Objective structured clinical examination