
Do the teaching, practice and assessment of clinical communication skills align?

Abstract

Background

Evidence indicates that communication skills learnt in the classroom are often not readily transferred to the assessment methods applied, nor to the clinical environment. An observational study was conducted to objectively evaluate students’ communication skills in different learning environments. The study sought to investigate the extent to which the communication skills demonstrated by students in classroom, clinical, and assessment settings align.

Method

A mixed methods study was conducted to observe and evaluate students during the fourth year of a five-year medical program. Participants were video-recorded during structured classroom ‘interactional skills’ sessions, as well as during clinical encounters with real patients and an OSCE station calling upon communication skills. The Calgary-Cambridge Observational Guides were used to evaluate students in each setting.

Result

This study observed 28 students. Findings revealed that, while students were able to practise a broad range of communication skills in the classroom, in the clinical environment information-gathering and relationship-building became the focus of their encounters with patients. In the OSCEs, limited time and high-pressure scenarios caused students to rush to complete a task focussed solely on information-gathering and/or explanation, diminishing opportunities for rapport-building with the patient.

Conclusion

These findings indicate that poor alignment can develop between the skills practised across learning environments. Further research is needed to investigate the development and application of students’ skills over the long term, to understand supports for and barriers to effective teaching and learning of communication skills in different learning environments.


Background

Doctors’ communication skills are among the most essential elements of effective patient care [1, 2]. Studies show a clear link between the quality of explanations provided to patients and their health outcomes, including reduced pain, enhanced quality of life, improved emotional health, symptom alleviation, and better adherence to treatment plans [1, 3, 4]. Recognizing its significance, medical councils and accreditation bodies worldwide prioritize effective communication as a core competency for healthcare practitioners [5,6,7]. Experts agree that communication skills can be effectively taught and learnt [8,9,10,11,12,13]. Patients treated by doctors who have undergone communication skills training exhibit 1.62 times higher treatment adherence [4]. Furthermore, doctors trained in communication have better patient interactive processes and outcomes (information gathered, signs and symptoms relieved, and patient satisfaction) compared to those without such training [14].

Diverse medical consultation models have emerged, drawing from a spectrum of frameworks that prioritize tasks, processes, outcomes, clinical competencies, the doctor-patient relationship, patients’ perception of illness, or a combination of these elements [15]. These models serve as frameworks conducive to structuring the teaching and learning of communication skills, delineating both content and pedagogical approaches. Numerous research endeavours have assessed these models’ applicability in clinical and educational settings [16, 17]. Medical schools can leverage these models to articulate learning objectives and tailor teaching strategies within their curricula accordingly. Furthermore, these models can serve as frameworks for evaluating communication skills and the effectiveness of training interventions. The adoption of such models across these diverse educational contexts, however, appears to be inconsistent.

Communication skills teaching and learning typically commences in classroom or simulation settings before students transition to clinical practice. Classroom sessions involve simulation-based role-play exercises with peers and simulated patients [18, 19], while clinical environments offer opportunities for real patient interactions in various healthcare settings [20]. This structured learning process aims to develop students’ ability to conduct effective, patient-centered medical consultations across diverse clinical scenarios and handle challenging situations [20, 21]. Students’ ability to communicate with patients is commonly assessed using the Objective Structured Clinical Examination (OSCE) [22], in which students are observed during an interaction with a simulated patient within a set time and evaluated using a standardised rating form.

Constructive alignment, a crucial concept in education, refers to “aligning teaching methods, as well as assessment, to the intended learning outcomes” [23]. In medical education, this alignment ensures that students achieve desired learning outcomes related to communication skills necessary for delivering patient-centered care in diverse contexts [24]. Communication skills include the process of exchanging messages and demonstrating empathic behaviour while interacting with patients and colleagues to deliver patient-centred care in a range of contexts [25]. Learning clinical communication skills is complex and nuanced, but should be subjected to the scrutiny and planning associated with constructive alignment along with other curricula elements. Both the learning and the application of clinical communication skills often require students to be creative and flexible in applying their skills to different contexts and patient conditions [19], and to be committed to ongoing development and improvement of their communication in practice [26, 27].

The achievement of such constructive alignment, however, remains an elusive goal in many medical schools, with challenges in aligning the communication skills learnt, modelled, and applied in different learning environments and assessment contexts [28,29,30,31,32]. Evidence indicates that the skills, suggested structures, and processes learnt in the classroom are not always transferred to the clinical environment [20, 33, 34]. Differences in learning processes exist between settings, particularly in the transition from classroom to clinical environment, associated with heavy workloads, different teaching and assessment methods, students’ uncertainty about their role, and adaptation to a more self-directed learning style [35].

Although structured approaches to medical consultations, such as the Calgary-Cambridge Observational Guides (CCOG) [19], are often taught by classroom educators, teaching, feedback, and modelling in the clinical environment often do not align with this [20, 36]. Further, these structures are not always reflected in the rating schemes used to assess OSCE performance [37], or in the structure of OSCE stations in which communication is not the only skill being assessed. OSCE stations differ in purpose and focus, from those designed to assess communication as it is integrated within broader clinical tasks to those which focus more specifically on communication [38], but time-limited OSCE stations rarely reflect the true entirety of a complex clinical task [39]. Recent reviews indicate that OSCEs remain widely used and generally apply good assessment practices, such as blueprinting to curricula and the use of valid and reliable instruments [38, 39], but their ability to reflect authentic clinical tasks is less clear.

An observational study was conducted to objectively evaluate students’ communication skills in different learning environments. The study aimed to explore the extent to which the communication skills demonstrated by students in classroom, clinical, and assessment settings align.

Methods

Study design

A concurrent triangulation mixed methods study was conducted to observe and evaluate students during the fourth year of a five-year medical program. Concurrent triangulation designs leverage both qualitative and quantitative data collection to enhance the accuracy of defining relationships between variables of interest [40]. Video-recordings were employed during structured classroom ‘interactional skills’ sessions (ISS) or workshops, clinical encounters with patients, and an OSCE station requiring communication skills. The use of video recording aimed to ensure the objectivity of data collection and eliminate researcher-participant interaction biases [41]. As communication skills comprise verbal and non-verbal behaviour, video-recording is considered the most suitable method to capture this behaviour alongside its contextual setting [42]. Additionally, field notes were taken during each observation to provide further context during data analysis. This study was approved by both the University and Health District Human Research Ethics Committees (H-2018-0152 and 2018/PID00638).

Study sites and participants

This study was undertaken in a five-year undergraduate medical program which includes a structured communication skills curriculum grounded in the principles delineated in the Calgary-Cambridge guide to the medical interview. The curriculum is structured to initiate students into communication micro-skills in the context of classroom-based simulation early in the program, with ongoing opportunities to practise and master these throughout the program during both classroom sessions and application in clinical practice. Interactional skills workshops (classroom) occur throughout the program to align with other curricula elements and clinical rotations. Beginning in the third year, students engage in regular interaction with real patients across various clinical contexts, complemented by continued participation in scheduled interactional skills workshops within the classroom environment. For example, during the Women’s, Adolescent’s, and Children’s Health (WACH) rotation, students attend four structured classroom workshops tailored to the communication skills pertinent to specific clinical scenarios: 1) Addressing sensitive issues with adolescents, 2) Partnership with parents, 3) Prenatal screening, and 4) Cervical screening discussions with Aboriginal and Torres Strait Islander women. Evaluation of communication skills is conducted using OSCEs at pivotal points throughout the program.

For the purpose of this study, all fourth-year medical students were invited to participate during their 12-week WACH rotation. This rotation spans clinical placements across five clinical schools in the medical school footprint. The WACH rotation encompasses a blended learning approach, with both didactic and clinical components. Students are afforded opportunities to apply communication skills acquired in classroom settings to the clinical setting and are assessed in a multi-station OSCE at the end of the rotation. Participation invitations were extended to all students actively enrolled in the course during the designated study period.

Study procedure

Students received an email invitation from the school administration at the beginning of the rotation, and the study was also briefly described during a lecture prior to clinical placement. Students who consented were observed during an interactive workshop session involving role-play with simulated patients, one real patient encounter, and one end-of-semester OSCE station related to communication skills. As part of this rotation, students attended four ISS workshops focusing on communication skills required in specific situations. They were also expected to keep a record of experience and achievement towards their core clinical competencies, including history-taking and patient communication tasks. Skills were assessed in a multiple-station OSCE at the end of semester. Participating students received an AU$20 gift voucher as appreciation for their time.

Context of the observation

In-class activities were directly observed and video-recorded, with equipment set up to record role-play interactions between consenting students and a simulated patient. Sessions included eight to twelve students, beginning with discussion of the topic before inviting students to practise skills with simulated patients.

The learning process typically commenced with an introductory overview of the topic, followed by a discussion of students’ clinical rotation experiences. Subsequently, the session advanced to simulated scenarios, wherein various students engaged in role-playing activities. The classroom facilitator initiated each role-play by delineating the scenario and ensuring that students were adequately briefed on their roles and the context before inviting volunteers to interact with the simulated patient. The length of time each student spent in the ‘hot-seat’ engaged in a role-play varied depending on the nature of the session, the facilitator style, and the section of the scenario each student was role-playing.

Clinical educators or health behaviour scientists facilitated the workshops, guided by facilitator instructions which encouraged application of agenda-led, outcome-based analysis of the role-play experiences [19]. As part of the learning process, some students started the role-play at a mid-point of the consultation, continuing the conversation from where previous students had paused. Therefore, not all micro-skills in the CCOG could be observed in every student’s role-play.

The clinical observations were scheduled for times and locations convenient for the participants, either with a clinical supervisor during unstructured clinical time or during self-study, usually in Internal Medicine, Paediatrics, or Obstetrics and Gynaecology wards. In this setting, the students aimed to independently take a complete medical history from a patient. Stable and cooperative patients were identified by attending physicians or nurses, who sought initial consent for students to approach them. When approaching patients, students also sought permission to have the consultation recorded for the purpose of the study. The researcher was available to explain the study to the patients if needed. After each observed real patient encounter, a structured debriefing was conducted and recorded for use in the analysis.

One station in which communication was directly assessed (included as one or more items in the marking schema) was observed during an end-of-semester OSCE. Each student had a maximum of eight minutes to respond to a clinical task, after two minutes of reading and preparation time. Students interacted with a simulated patient and examiner based on the task given. In this study, students were observed in three end-of-semester OSCEs with three different cases, all related to the women’s health clinical rotation. The first case required the student to discuss a pregnancy test result with a female patient; the second asked the student to discuss contraceptive options with a teenage girl; and the third required the student to discuss urinary incontinence due to uterine prolapse with a female patient. Each simulated patient was trained to present with specific symptoms in response to the students’ questioning. The examiners observed the students and rated their performance based on pre-determined marking criteria. The OSCEs were video-recorded without the researcher present.

Outcome measures/instruments

Students’ communication skills in each context were independently observed and rated by two observers using the CCOG [19]. This evaluation tool has good validity and reliability for evaluating communication skills across a range of settings [17, 19, 43], with moderate intraclass correlation coefficients for each item, ranging from 0.05 to 0.57 [43]. The CCOG evaluates six essential communication tasks (initiating the session, gathering information, providing structure, building the relationship, explanation and planning, and closing the session) as well as overall performance in interpersonal communication [19]. Each task consists of two to four micro-skills. Not all tasks could be applied to each observation and setting, depending on the presenting complaint, the purpose of the encounter, and the patient context [19]. Each student’s performance of micro-skills was evaluated on a 3-point scale: “0” (did not perform the skill), “1” (skill partially performed or not performed well), and “2” (skill performed well), with “NA” (not applicable) also available. Overall performance was evaluated on a 9-point scale (1–3 = unsatisfactory, 4–6 = satisfactory, and 7–9 = superior) [44].

During observations, the researcher took field notes which included the context, time and setting of the observation, number and type of attendees (students, facilitator, simulated or real patient), how the sessions occurred, interactions among attendees, and critical reflections of the researcher.

Analysis method

This study implemented a combination of descriptive quantitative and qualitative approaches, using qualitative data to support and enable a deeper understanding and interpretation of the quantitative data. A concurrent triangulation method was used to analyse the data collected from observations. For the quantitative data, three researchers independently rated a sample of the recordings and reached agreement on ratings before scoring was completed by the lead author. This process ensured that the ratings were representative of the meaning of the task and that rating of the data was consistent. The researchers discussed the scores to check for consistency, and inter-rater reliability was quantified with Cohen’s kappa [45] as 0.88 (SE = 0.12; 95% CI = 0.65–1.00). SPSS Statistics for Windows (IBM Corp., Version 26.0, Armonk, NY, USA) was used to calculate descriptive statistics for demographic variables and scoring. Analysis of variance was conducted to compare mean scores between settings.
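For readers unfamiliar with the agreement statistic used here, Cohen’s kappa compares the observed agreement between two raters against the agreement expected by chance from their marginal rating distributions. The sketch below is illustrative only: the ratings are hypothetical values on the CCOG 0–2 scale, not the study’s data, and the function name `cohens_kappa` is our own.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters scoring the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal frequency per category.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings on the CCOG micro-skill scale (0, 1, 2)
a = [2, 1, 2, 0, 1, 2, 2, 1, 0, 2]
b = [2, 1, 2, 0, 1, 2, 1, 1, 0, 2]
print(round(cohens_kappa(a, b), 2))  # 0.84 for this hypothetical sample
```

A kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance; the study’s reported value of 0.88 would conventionally be read as almost-perfect agreement.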

Field notes of observation and video-observation were used to support the description of the findings from the quantitative data.

Criteria from the CCOG were used to identify themes and provide additional explanatory variance capturing the meaning of the data. An iterative process was conducted, with frequent discussions among the researchers to ensure agreement and consistency in the analysis. A reflection on how these data might influence the research questions and findings, as well as the theoretical interest of the study, accompanied this process.

Results

Thirty-three students initially agreed to participate; 14 students were observed in all three settings and 14 students had incomplete observations. Five students withdrew from this study: one due to moving clinical schools and being unable to arrange observation, and four after one observation had been conducted. These withdrawals were associated with challenges scheduling further observations and other undisclosed personal reasons. A total of 63 unique observations were included in the final analysis. Table 1 summarises the demographic characteristics of the participants and the number of observations in each setting, Table 2 provides the observation time in each setting, and Table 3 shows the average score of students’ performance on each communication skills task.

Table 1 Characteristics of participants and observations
Table 2 Observation time in each setting
Table 3 Average scores of student performance based on communication skills tasks of the Calgary-Cambridge Observation Guide

The overall quantitative performance of students did not differ significantly across settings. The average scores for overall performance in the classroom, clinical, and OSCE settings were 4.2, 4.3, and 4.2, respectively, corresponding to performances that were satisfactory and appropriate for the students’ study level. Nine of 14 students (64%) with complete observations received the same classification (satisfactory) in all settings. Further analysis showed that performances on the separate components were not statistically significantly different across settings, except for providing structure and closing the session (p = 0.005 and p = 0.02, respectively). Key differences were found, however, in specific areas of the communication micro-skills across learning environments, probably due to the different opportunities to demonstrate skills (see Supplementary material for more detailed micro-skills scores).

We explored the observations from both quantitative and qualitative perspectives, considering scores and ratings on the CCOG as well as descriptions of the observations themselves. Students started the consultation by establishing initial rapport and identifying the reason(s) for the consultation. In the classroom, some students did not perform these tasks as thoroughly as would be expected in a real clinical encounter, in part because many began the role-play as a follow-on from a peer. In the clinical environment, students had longer, unhurried discussions with patients, and the tasks associated with initiating the session were performed well. During the OSCE, students often rushed to enter the room, sanitise their hands, and greet the patient before they had even reached their chair.

Across all settings, students effectively gathered information by exploring the patient’s problem. Attentive listening and facilitation of patient responses were evident, especially in the classroom and clinical environments. Students were able to encourage the patient to tell her/his story of the problem, use open and closed questioning techniques, and use concise, easily understood questions and comments. In the classroom, students used additional skills for understanding the patient’s perspective by exploring the patient’s ideas, concerns, expectations, effects, and feelings; in the clinical setting, these micro-skills were observed in only three of the seventeen clinical observations, and in the OSCE only one of sixteen students performed well in this area (average scores were 1.5, 0.8, and 0.7 in the classroom, clinical, and OSCE settings, respectively). During OSCEs, students tended to rush taking the history of the patient’s problem, primarily using closed questions in order to complete the OSCE task.

Students generally applied the micro-skills of attending to flow to provide the structure of the interview. They structured interviews in a logical sequence, attended to timing, and kept the interview on task. However, the micro-skill of using signposting or transitional statements was observed in only seven students (25.9%) in the classroom setting. The nature of classroom role-plays, in which students swap roles with their peers, likely limited the use of transitional statements, as transitions were often used as a point to pause and move to another student. In contrast, in the clinical setting, students tried to follow patients’ cues to make the interactions flow conversationally, despite using a standard history-taking template. In addition, students were rarely observed summarising at the end of a specific line of inquiry to confirm understanding before moving on to the next section. During the OSCE, students rarely structured their consultations, with only four of the 16 students summarising information gathered from the patient.

The tasks of building a relationship were demonstrated consistently in the classroom and clinical settings, but less so in the OSCE setting. Students were able to demonstrate appropriate verbal and non-verbal behaviour (eye contact, facial expression, vocal volume, and tone) and develop rapport. However, the micro-skills of involving the patient, which include sharing thinking with the patient to encourage their involvement, were rarely observed in any setting.

The tasks of “Explanation and planning” were observed in the classroom and OSCE settings, but not in the clinical environment, as it is not appropriate for students to independently make diagnoses or plan clinical management for real patients. The focus of the interaction in the clinical environment was eliciting a patient’s history to develop clinical reasoning and communication skills. Only a minority of students in the classroom setting had the opportunity to practise the counselling components of a consultation; therefore, their ability to demonstrate explanation and planning and to involve the patient was rarely observed in their role-play. In the OSCE, this was an expected part of the assessment.

In classroom observations, most of the students could not close the session because the role-plays were stopped by the facilitators before this point. Those who did have the opportunity to practise these skills in the classroom effectively contracted with the patient the next steps for both patient and physician, and made a final check that the patient agreed and was comfortable with the plan. In the clinical setting, after eliciting a history from the patient, students closed the session by summarising the information gathered and expressing their gratitude to the patient. In the OSCE, due to the time limitation, none of the students was able to close the session; rather, they rushed to leave the room when time was up.

Only two of the encounters observed in the clinical environment were also observed by clinical facilitators, and the feedback provided focused on medical knowledge. Debrief discussions with students suggest that this reflected the low level of observation experienced overall. In the classroom settings, by contrast, the facilitators provided feedback mostly about students’ communication skills, while during the OSCE no feedback was provided to the students.

Discussion

Main findings

This study observed 28 students applying communication skills in different learning environments, including assessment. The results highlighted disparities in the practice and focus of skills across settings. The findings revealed that in the classroom students can practise a broad set of communication skills tasks (though not usually each student in a single role-play); in the clinical environment, however, information-gathering and relationship-building with patients were the focus of their encounters. In the OSCEs, limited time and high-pressure scenarios caused the students to focus solely on information-gathering and/or explanation, diminishing opportunities for rapport-building with the patient. These findings indicate poor alignment between the skills practised across learning environments. While quantitative differences in communication skills between settings were not statistically significant, important differences emerged in the patterns of skills displayed and the components of the consultation practised in each setting.

The study revealed a disconnection between structured communication skills learned in classrooms and experiences during clinical placements. Simulation-based learning in classrooms offered a safe space for difficult conversations and feedback, aiding preparation for real patient interactions, as also reported elsewhere [46]. Students view simulated patient interaction as a valuable opportunity to prepare themselves for real patient interactions, especially with the ability to “pause” whenever they encounter difficulties [47]. The classroom can fill an important gap because students cannot appropriately perform many of the more complex tasks with real patients.

The students who participated in this study were trained, in the classroom setting, in using open and closed questioning techniques, listening attentively, facilitating patients’ responses verbally and non-verbally, picking up verbal and non-verbal cues, using concise, easily understood questions and comments, and determining and exploring patients’ ideas, concerns, expectations, and feelings. The ability to apply these skills is crucial to patient-centred care and contributes to developing relationships with patients [48]. Yet these skills were less evident in assessment: the limited time and high-pressure scenarios used in OSCEs restricted students’ opportunity to explore these aspects in the assessment context. It seems that this type of assessment risks devaluing these skills and limits the authenticity of assessment by breaking skills into artificial components rather than assessing them as part of an integrated whole.

Students can practise many communication skills in classroom sessions, but not all tasks can be rehearsed in other learning environments, as also reported in other studies [32, 33, 49]. In the classroom setting, explanation and planning tasks were distributed across several students performing the role-play. On the other hand, students were not required to perform these tasks in the clinical environment: undergraduate medical students do not have direct responsibility for comprehensive patient care [50, 51]. Their interactions with real patients are conducted under the (often indirect) supervision of attending physicians, who act as clinical facilitators and hold clinical responsibility. Despite this, the explanation and planning task was evaluated during OSCEs. Although students were able to demonstrate adequate knowledge, they did not use a structured explanation approach based on the CCOG while delivering information. This OSCE station, in keeping with those used commonly in medical education programs [22, 52], involved limited time to complete complex clinical tasks. These findings again indicate the misalignment of teaching and assessment, particularly in the communication task of explanation and planning.

Clinical encounters facilitated students’ information-gathering skills and clinical reasoning [46]. However, limited observations by clinical facilitators during these encounters might have hindered their true benefits [32, 49]. In this study, only two clinical encounters were observed by clinical facilitators. In this setting, communication skills are learnt mainly through role-modelling by the supervisors. Patient-centred communication skills learnt in the classroom can be diminished during clinical rotation when students face barriers, such as inconsistent modelling of communication skills by clinicians and other health professions in clinical environments [53,54,55,56,57].

The process of feedback and reflection is helpful to consolidate skills during training [58,59,60,61]. Specific feedback on communication skills is suggested to improve students’ ability to handle a patient’s emotions and perceptions, as well as the structure and end of the conversation [59]. A Cochrane review reported that while most educational interventions can have positive impacts on communication skills measured in the short-term post intervention, those involving specific, personalised feedback are likely to have the most impact [8].

In this study, feedback from facilitators was frequent, tailored, and well received during classroom sessions. However, not all students had the opportunity to practise with a simulated patient and receive personalised feedback on their performance; only half of the students in one workshop interacted with simulated patients, while the remainder only observed the interactions. Feedback was very limited in the clinical environment, with only two clinical encounters observed by clinical facilitators, and that feedback focused on medical knowledge. In summative OSCEs, feedback was limited to the assessment outcome or grade. While the examiners rated students’ performance based on pre-determined marking criteria, the majority of marks related to medical knowledge and management of the case, with only one of ten aspects evaluating students’ communication skills. It is therefore possible for a student to pass an OSCE without establishing any rapport with a patient or involving the patient in the consultation. Such a student receives a message likely to be very different from the feedback they would receive on the same performance in a classroom session. Again, this shows the discrepancy in teaching and learning across settings.

The style and type of feedback provided show substantial discrepancies across learning environments, in keeping with the previous literature [49]. Inconsistent or absent feedback during clinical rotations can be counterproductive and reinforce poor practices [20]. This leads to a misalignment between students’ understanding of good communication skills, based on the models they observe and practise in clinical environments, and what they are expected to do in an OSCE that focuses more on content than on the interview process, as discussed in another study [36].

Clinical communication skills learning teaches students not only a medical consultation structure but also how and when to apply it in different contexts, using the micro-skills that increase efficacy [62]. Students need to learn gradually from simulated patient interactions and real patient encounters of increasing complexity [63, 64] in order to develop the flexibility and capability to apply their communication skills appropriately to patients in different contexts [65]. However, the misalignment of learning and assessment illustrated in this study may contribute to the difficulty of transferring communication skills across learning environments.

Limitations of the study

This study involved a group of students in the same year of a single undergraduate-entry medical program, potentially limiting the generalisability of the findings to other programs or student cohorts. However, the nature of clinical and assessment experiences is reasonably consistent across medical programs [66, 67]. The voluntary nature of participation might skew the results towards a more motivated and confident group that is not necessarily representative of the whole cohort. However, the mix of students with regard to gender, age and background lends weight to the validity of these observational data.

This study observed students on a single occasion in each setting. It did not follow the students into the following year to confirm whether their skills persisted or changed over a more extended period. In the OSCEs, the case used might not have been the most suitable for evaluating these skills. Each of the students observed in the OSCEs was marked as having at least met the required standard for the station as a whole. In addition, video observation might introduce an observer effect, observer bias and observer expectation. Since the students were aware of being observed, they might have performed better than normal or shown a “halo effect” [41]. However, the video observations in this study captured verbal and non-verbal behaviour during the encounter and could be replayed for rating purposes. Longitudinal observation of students across settings in their regular learning environments might better capture student performance.

Conclusion

This study observed students during communication skills learning and assessment. Not all aspects of communication skills can be practised in all learning environments. Classroom workshops attempt to cover every aspect of communication skills, with practice often spread across several students. In contrast, in the clinical environment students focus mainly on information-gathering, while in the OSCE students are often tasked with an isolated activity such as gathering a history, or explanation and planning with a simulated patient. Students are required to build a relationship in all settings, but in an eight-minute OSCE this is particularly challenging. The differences between these learning and assessment settings mean that students do not receive clear messages about what is valued and prioritised in clinical communication.

The critical components of role-play practice, feedback, observation and supervision are well acknowledged, but the quality of each of these components differs across learning environments. The misalignment of teaching and assessment may contribute to students’ confusion when transferring their communication skills between learning environments. Students would benefit from opportunities to rehearse and practise in different learning environments and to receive feedback on their performance in each setting, helping them transfer skills across environments and develop flexibility and capability. Combining formal communication skills teaching in classroom sessions with experiential learning during clinical rotations, coupled with observation and feedback, may be an effective approach to developing an understanding of both the theoretical content and the practical application of communication skills. However, the efficacy of this approach hinges on the alignment of teaching, learning and assessment of communication skills across learning environments, including the role-modelling of communication skills by clinicians. Further research is needed to investigate the development and application of students’ skills over the long term, to understand the supports for and barriers to effective teaching and learning of doctor-patient communication skills in different learning environments.

Data availability

Our data are available in the Open Science Framework repository at https://osf.io/5xayz/ and https://osf.io/ezqh6/files/osfstorage/655d581e062a3e3da1ee05fb.

Abbreviations

CCOG:

Calgary-Cambridge Observation Guide

ISS:

Interactional skills sessions

JMP:

Joint medical program

OSCE:

Objective structured clinical examination

WACH:

Women’s, Adolescents’ and Children’s Health

References

  1. Stewart MA. Effective physician-patient communication and health outcomes: a review. CMAJ. 1995;152(9):1423–33.

  2. World Health Organization. WHO strategic communication framework for effective communication. Geneva: Department of Communications, Office of the WHO Director-General; 2017. p. 56.

  3. Gysels M, Richardson A, Higginson IJ. Communication training for health professionals who care for patients with cancer: a systematic review of training methods. Support Care Cancer. 2005;13(6):356–66. https://doi.org/10.1007/s00520-004-0732-0.

  4. Zolnierek KB, Dimatteo MR. Physician communication and patient adherence to treatment: a meta-analysis. Med Care. 2009;47(8):826–34. https://doi.org/10.1097/MLR.0b013e31819a5acc.

  5. General Medical Council. Outcomes for graduates. Manchester: General Medical Council; 2018. p. 28.

  6. Australian Medical Council. Standards for assessment and accreditation of primary medical programs. Kingston, ACT: Australian Medical Council Limited; 2012.

  7. Association of American Medical Colleges. Learning objectives for medical student education: guidelines for medical schools. Association of American Medical Colleges; 1998.

  8. Gilligan C, Powell M, Lynagh MC, Ward BM, Lonsdale C, Harvey P et al. Interventions for improving medical students’ interpersonal communication in medical consultations. Cochrane Database Syst Rev. 2021(2). https://doi.org/10.1002/14651858.CD012418.pub2.

  9. Bos-van den Hoek DW, Visser LNC, Brown RF, Smets EMA, Henselmans I. Communication skills training for healthcare professionals in oncology over the past decade: a systematic review of reviews. Curr Opin Support Palliat Care. 2019;13(1):33–45. https://doi.org/10.1097/SPC.0000000000000409.

  10. Alelwani SM, Ahmed YA. Medical training for communication of bad news: a literature review. J Educ Health Promot. 2014;3:51. https://doi.org/10.4103/2277-9531.134737.

  11. Chung HO, Oczkowski SJ, Hanvey L, Mbuagbaw L, You JJ. Educational interventions to train healthcare professionals in end-of-life communication: a systematic review and meta-analysis. BMC Med Educ. 2016;16:131. https://doi.org/10.1186/s12909-016-0653-x.

  12. Keifenheim KE, Teufel M, Ip J, Speiser N, Leehr EJ, Zipfel S, et al. Teaching history taking to medical students: a systematic review. BMC Med Educ. 2015;15:159. https://doi.org/10.1186/s12909-015-0443-x.

  13. Kyaw BM, Posadzki P, Paddock S, Car J, Campbell J, Tudor Car L. Effectiveness of digital education on communication skills among medical students: systematic review and meta-analysis by the digital health education collaboration. J Med Internet Res. 2019;21(8):e12967. https://doi.org/10.2196/12967.

  14. Griffin SJ, Kinmonth AL, Veltman MWM, Gillard S, Grant J, Stewart M. Effect on health-related outcomes of interventions to alter the interaction between patients and practitioners: a systematic review of trials. Ann Fam Med. 2004;2(6):595–608. https://doi.org/10.1370/afm.142.

  15. Silverman J. Models of the consultation. A summary of models that have been proposed over the last 40 years. International Association for Communication in Healthcare (EACH); 2014.

  16. Schirmer JM, Mauksch L, Lang F, Marvel MK, Zoppi K, Epstein RM, et al. Assessing communication competence: a review of current tools. Fam Med. 2005;37(3):184–92. PMID: 15739134.

  17. Setyonugroho W, Kennedy KM, Kropmans TJB. Reliability and validity of OSCE checklists used to assess the communication skills of undergraduate medical students: a systematic review. Patient Educ Couns. 2015;98(12):1482–91. https://doi.org/10.1016/j.pec.2015.06.004.

  18. Blackmore A, Kasfiki EV, Purva M. Simulation-based education to improve communication skills: a systematic review and identification of current best practice. BMJ STEL. 2018;4(4):159–64. https://doi.org/10.1136/bmjstel-2017-000220.

  19. Kurtz S, Silverman J, Draper J. Teaching and learning communication skills in medicine. 2nd ed. Oxford: Radcliffe Publishing; 2005. p. 576.

  20. Rosenbaum ME. Dis-integration of communication in healthcare education: workplace learning challenges and opportunities. Patient Educ Couns. 2017;100:2054–61. https://doi.org/10.1016/j.pec.2017.05.035.

  21. Junod Perron N, Sommer J, Louis-Simonet M, Nendaz M. Teaching communication skills: beyond wishful thinking. Swiss Med Wkly. 2015;145:w14064. https://doi.org/10.4414/smw.2015.14064.

  22. van der Vleuten C, van den Eertwegh V, Giroldi E. Assessment of communication skills. Patient Educ Couns. 2019;102(11):2110–3. https://doi.org/10.1016/j.pec.2019.07.007.

  23. Biggs J. Constructive alignment in university teaching. HERDSA Rev High Educ. 2014;1:5–22.

  24. Napper VS. Alignment of learning, teaching, and assessment. In: Seel NM, editor. Encyclopedia of the sciences of learning. Boston: Springer US; 2012. pp. 200–2.

  25. King A, Hoppe RB. Best practice for patient-centered communication: a narrative review. J Grad Med Educ. 2013;5(3):385–93. https://doi.org/10.4300/JGME-D-13-00072.1.

  26. Fraser SW, Greenhalgh T. Coping with complexity: educating for capability. BMJ. 2001;323(7316):799–803. https://doi.org/10.1136/bmj.323.7316.799.

  27. World Federation of Medical Education. Basic medical education WFME global standards for quality improvement. The 2020 revision. Copenhagen, Denmark: WFME Office; 2020. p. 29.

  28. Dunham L, Dekhtyar M, Gruener G, Cichoski Kelly E, Deitz J, Elliott D et al. Medical student perceptions of the learning environment in medical school change as students transition to clinical training in undergraduate medical school. Teach Learn Med. 2017:1–9. https://doi.org/10.1080/10401334.2017.1297712.

  29. Godefrooij MB, Diemers AD, Scherpbier A. Students’ perceptions about the transition to the clinical phase of a medical curriculum with preclinical patient contacts: a focus group study. BMC Med Educ. 2010;10. https://doi.org/10.1186/1472-6920-10-28.

  30. O’Brien BC, Poncelet AN. Transition to clerkship courses: preparing students to enter the workplace. Acad Med. 2010;85. https://doi.org/10.1097/ACM.0b013e3181fa2353.

  31. Prince KJAH, van de Wiel MWJ, van der Vleuten CPM, Boshuizen HPA, Scherpbier AJJA. Junior doctors’ opinions about the transition from medical school to clinical practice: a change of environment. Educ Health. 2004;17(3):323–31. https://doi.org/10.1080/13576280400002510.

  32. Rosenbaum ME, Axelson R. Medical education: curricular disconnects in learning communication skills: what and how students learn about communication during clinical clerkships. Patient Educ Couns. 2013;91(1):85–90. https://doi.org/10.1016/j.pec.2012.10.011.

  33. Malhotra A, Gregory I, Darvill E, Goble E, Pryce-Roberts A, Lundberg K, et al. Mind the gap: learners’ perspectives on what they learn in communication compared to how they and others behave in the real world. Patient Educ Couns. 2009;76(3):385–90. https://doi.org/10.1016/j.pec.2009.07.024.

  34. Deveugele M. Communication training: skills and beyond. Patient Educ Couns. 2015;98(10):1287–91. https://doi.org/10.1016/j.pec.2015.08.011.

  35. Cho KK, Marjadi B, Langendyk V, Hu W. Medical student changes in self-regulated learning during the transition to the clinical environment. BMC Med Educ. 2017;17(1):59. https://doi.org/10.1186/s12909-017-0902-7.

  36. Dewi SP, Wilson A, Duvivier R, Kelly B, Gilligan C. Perceptions of medical students and their facilitators on clinical communication skills teaching, learning, and assessment. Front Public Health. 2023;11. https://doi.org/10.3389/fpubh.2023.1168332.

  37. Phillips EC, Smith SE, Hamilton AL, Kerins J, Clarke B, Tallentire VR. Assessing medical students’ nontechnical skills using Immersive Simulation: what are the essential components? Simul Healthc. 2021;16(2):98–104. https://doi.org/10.1097/SIH.0000000000000463.

  38. Cömert M, Zill JM, Christalle E, Dirmaier J, Härter M, Scholl I. Assessing communication skills of medical students in Objective Structured Clinical examinations (OSCE) - a systematic review of rating scales. PLoS ONE. 2016;11(3):e0152717. https://doi.org/10.1371/journal.pone.0152717.

  39. Heal C, D’Souza K, Banks J, Malau-Aduli BS, Turner R, Smith J, et al. A snapshot of current Objective Structured Clinical Examination (OSCE) practice at Australian medical schools. Med Teach. 2019;41(4):441–7. https://doi.org/10.1080/0142159X.2018.1487547.

  40. Castro FG, Kellison JG, Boyd SJ, Kopak A. A methodology for conducting integrative mixed methods research and data analyses. J Mix Methods Res. 2010;4(4):342–60. https://doi.org/10.1177/1558689810382916.

  41. Connor L, Treloar, Higginbotham N. How to perform transdisciplinary research: qualitative study design and methods. In: Higginbotham N, Connor L, Albrecht G, editors. Health social science: a transdisciplinary and complexity perspective. Melbourne: Oxford University Press; 2001. pp. 227–65.

  42. Parry R, Pino M, Faull C, Feathers L. Acceptability and design of video-based research on healthcare communication: evidence and recommendations. Patient Educ Couns. 2016;99(8):1271–84. https://doi.org/10.1016/j.pec.2016.03.013.

  43. Simmenroth-Nayda A, Heinemann S, Nolte C, Fischer T, Himmel W. Psychometric properties of the Calgary Cambridge guides to assess communication skills of undergraduate medical students. Int J Med Educ. 2014;5:212–8. https://doi.org/10.5116/ijme.5454.c665.

  44. Mini-clinical evaluation exercise [Internet]. The Royal Australasian College of Physicians. 2014 [cited 18 July 2017]. https://www.racp.edu.au/trainees/work-based-assessments/mini-clinical-evaluation-exercise.

  45. Gisev N, Bell JS, Chen TF. Interrater agreement and interrater reliability: key concepts, approaches, and applications. Res Social Adm Pharm. 2013;9(3):330–8. https://doi.org/10.1016/j.sapharm.2012.04.004.

  46. Yardley S, Irvine AW, Lefroy J. Minding the gap between communication skills simulation and authentic experience. Med Educ. 2013;47(5):495–510. https://doi.org/10.1111/medu.12146.

  47. Bokken L, Rethans JJ, Scherpbier AJ, van der Vleuten CP. Strengths and weaknesses of simulated and real patients in the teaching of skills to medical students: a review. Simul Healthc. 2008;3(3):161–9. https://doi.org/10.1097/SIH.0b013e318182fc56.

  48. Matthys J, Elwyn G, Van Nuland M, Van Maele G, De Sutter A, De Meyere M, et al. Patients’ ideas, concerns, and expectations (ICE) in general practice: impact on prescribing. Br J Gen Pract. 2009;59(558):29–36. https://doi.org/10.3399/bjgp09X394833.

  49. Schopper H, Rosenbaum M, Axelson R. ‘I wish someone watched me interview’: medical student insight into observation and feedback as a method for teaching communication skills during the clinical years. BMC Med Educ. 2016;16(1):1–8. https://doi.org/10.1186/s12909-016-0813-z.

  50. Steven K, Wenger E, Boshuizen H, Scherpbier A, Dornan T. How clerkship students learn from real patients in practice settings. Acad Med. 2014;89(3):469–76. https://doi.org/10.1097/ACM.0000000000000129.

  51. Curry RH. Meaningful roles for medical students in the provision of longitudinal patient care. JAMA. 2014;312(22):2335–6. https://doi.org/10.1001/jama.2014.16541.

  52. Kiessling C, Tsimtsiou Z, Essers G, van Nuland M, Anvik T, Bujnowska-Fedak MM, et al. General principles to consider when designing a clinical communication assessment program. Patient Educ Couns. 2017;100(9):1762–8. https://doi.org/10.1016/j.pec.2017.03.027.

  53. Bombeke K, Van Roosbroeck S, De Winter B, Debaene L, Schol S, Van Hal G, et al. Medical students trained in communication skills show a decline in patient-centred attitudes: an observational study comparing two cohorts during clinical clerkships. Patient Educ Couns. 2011;84(3):310–8. https://doi.org/10.1016/j.pec.2011.03.007.

  54. Wilcox MV, Orlando MS, Rand CS, Record J, Christmas C, Ziegelstein RC, et al. Medical students’ perceptions of the patient-centredness of the learning environment. Perspect Med Educ. 2017;6(1):44–50. https://doi.org/10.1007/s40037-016-0317-x.

  55. Alimoglu MK, Alparslan D, Daloglu M, Mamakli S, Ozgonul L. Does clinical training period support patient-centeredness perceptions of medical students? Med Educ Online. 2019;24(1):1603525. https://doi.org/10.1080/10872981.2019.1603525.

  56. Joynt GM, Wong W-T, Ling L, Lee A. Medical students and professionalism – do the hidden curriculum and current role models fail our future doctors? Med Teach. 2018;40(4):395–9. https://doi.org/10.1080/0142159X.2017.1408897.

  57. Karnieli-Miller O, Vu TR, Holtman MC, Clyman SG, Inui TS. Medical students’ professionalism narratives: a window on the informal and hidden curriculum. [Erratum appears in Acad Med. 2011;86(1):29]. Acad Med. 2010;85(1):124–33. https://doi.org/10.1097/ACM.0b013e3181c42896.

  58. Cushing A, Abbott S, Lothian D, Hall A, Westwood OM. Peer feedback as an aid to learning. What do we want? Feedback. When do we want it? Now! Med Teach. 2011;33(2):e105–12. https://doi.org/10.3109/0142159X.2011.542522.

  59. Engerer C, Berberat PO, Dinkel A, Rudolph B, Sattel H, Wuensch A. Specific feedback makes medical students better communicators. BMC Med Educ. 2019;19(1):51. https://doi.org/10.1186/s12909-019-1470-9.

  60. Branch WT, Paranjape A. Feedback and reflection: teaching methods for clinical settings. Acad Med. 2002;77(12):1185–8. https://doi.org/10.1097/00001888-200212000-00005.

  61. Burgess A, van Diggele C, Roberts C, Mellis C. Feedback in the clinical setting. BMC Med Educ. 2020;20(2):460. https://doi.org/10.1186/s12909-020-02280-5.

  62. Durning SJ, Artino AR. Situativity theory: a perspective on how participants and the environment can interact: AMEE Guide 52. Med Teach. 2011;33(3):188–99. https://doi.org/10.3109/0142159X.2011.550965.

  63. Lapping J, Duvivier R. Twelve tips for medical curriculum design from a cognitive load theory perspective. Med Teach. 2016;38:669–74. https://doi.org/10.3109/0142159X.2015.1132829.

  64. Young JQ, Van Merrienboer J, Durning S, Ten Cate O. Cognitive load theory: implications for medical education: AMEE Guide 86. Med Teach. 2014;36(5):371–84. https://doi.org/10.3109/0142159X.2014.889290.

  65. Salmon P, Young B. Creativity in clinical communication: from communication skills to skilled communication. Med Educ. 2011;45. https://doi.org/10.1111/j.1365-2923.2010.03801.x.

  66. Junod Perron N, Klockner Cronauer C, Hautz SC, Schnabel KP, Breckwoldt J, Monti M, et al. How do Swiss medical schools prepare their students to become good communicators in their future professional careers: a questionnaire and interview study involving medical graduates, teachers and curriculum coordinators. BMC Med Educ. 2018;18(1):285. https://doi.org/10.1186/s12909-018-1376-y.

  67. Laidlaw A, Salisbury H, Doherty EM, Wiskin C. National survey of clinical communication assessment in medical education in the United Kingdom. BMC Med Educ. 2014;14(1):10. https://doi.org/10.1186/1472-6920-14-10.

  68. Dewi S. Communication skills teaching and learning in undergraduate medical education: from classroom to bedside. Newcastle, NSW, Australia University of Newcastle; 2021.

Acknowledgements

The content of this manuscript was presented in part within the author’s PhD thesis (Dewi SP (2021). Communication skills teaching and learning in undergraduate medical education: from classroom to bedside [dissertation]. Newcastle (NSW): University of Newcastle, Australia). This study could not have been undertaken without the contribution and help of the students and facilitators of the Joint Medical Program (JMP), University of Newcastle, Australia.

Funding

This study was funded by the Higher Degree Research Fund from the University of Newcastle, Australia and Indonesia Endowment Fund for Education. The publication of this paper is supported by Universitas Padjadjaran, Indonesia.

Open access funding provided by Universitas Padjadjaran.

Author information

Authors and Affiliations

Authors

Contributions

SPD, CG and AW contributed to the conceptualization, methodology and supervision of the study. SPD and CG organized data curation and analysis. SPD wrote the original draft of the manuscript. All authors contributed to reviewing and editing the submitted version.

Corresponding author

Correspondence to Sari Puspa Dewi.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the University of Newcastle Human Research Ethics Committee (H-2018-0152) and the Hunter New England Human Research Ethics Committee (2018/PID00638). All participants signed informed consent to participate in this study.

Consent for publication

Not applicable.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article

Dewi, S.P., Wilson, A., Duvivier, R. et al. Do the teaching, practice and assessment of clinical communication skills align?. BMC Med Educ 24, 609 (2024). https://doi.org/10.1186/s12909-024-05596-8
