- Research article
- Open Access
Structured work-based learning in undergraduate clinical radiology immersion experience
BMC Medical Education volume 21, Article number: 167 (2021)
Practical courses in undergraduate medical training often lack a didactic concept. Active participation and learning success largely depend on chance. This study was initiated to evaluate a novel concept of structured work-based learning (WBL) in the course of students’ half-day radiology immersion experience (IE).
This prospective, single-centre cohort study included 228 third-year students of the 2019 summer semester who underwent the obligatory radiology IE at a university hospital. The course was based on a novel structured WBL concept that applied established didactic concepts including blended learning, the FAIR principles of feedback, activity, individualization, and relevance, and Peyton’s four-step approach. Primary outcomes of equal weight were student and supervisor satisfaction with the clinical radiology IE, assessed by paper-based and online surveys, respectively. The secondary outcome was achievement of intended learning outcomes, assessed by means of mini clinical evaluation exercises and personal interviews.
Satisfaction with structured WBL was high in 99.0% of students. Students’ expectations were exceeded, and they felt taken seriously at the professional level. Dissatisfaction was attributed to the quality of the learning videos (0.6%), limited support by supervisors (0.5%), or inadequate feedback (0.6%). Supervising resident physicians rated achievement of intended learning outcomes regarding cognitive and psychomotor competences as excellent for all students. Personal interviews revealed achievement of affective competence in some students. Twelve of 16 (75.0%) supervising physicians were satisfied with the focus on intended learning outcomes and student preparation for IE. Two of 15 (13.3%) supervisors were unsatisfied with the time spent, and 4 of 16 (25%) with the approach to assessment.
This study demonstrated that both students and supervisors were satisfied with the novel concept of structured WBL within the scope of clinical radiology IE. Achievement of intended learning outcomes was promising.
Work-based learning (WBL) in general refers to learning in a work setting as part of a formal education program. In Germany, work-based learning in clinical radiology is offered to third-year medical students as a mandatory half-day clinical radiology immersion experience (IE) and, as a matter of choice, as a 3-month part of the sub-internship. During this time, medical students should apply their theoretical knowledge and practical skills at real workplaces in the radiology department.
Although WBL is considered fundamentally useful, a lack of student and supervisor motivation can jeopardize its success. In our department of radiology, students complained that during their clinical radiology IE they primarily had to watch resident physicians’ professional activities, passively shadowing their supervisors throughout the workday. Students’ IE had been structured neither in terms of learning objectives and time nor in terms of support. No specific didactic concept was applied. Consequently, students’ satisfaction was low and depended strongly on the individual commitment of the supervising radiologist. Supervisors, on the other hand, reported that students had not been well prepared for the IE and that supervision had increased their workload to the extent that students might not benefit appropriately from the radiological IE.
To break this vicious circle and to increase both motivation and success, we developed and implemented a novel WBL concept for the half-day radiological IE that provided students with structured opportunities for supervised activities (e.g., interviewing patients, informing patients, reading images, and writing reports). The responsibilities of all participants were arranged within clear timelines. The structure of WBL was based on skill-centred intended learning outcomes (ILO) to be achieved in compliance with the established didactic concepts of blended learning [3, 4], the FAIR principles, and Peyton’s four-step approach. We incorporated blended learning in the form of learning videos for self-determined preparation. The FAIR principles were respected by providing feedback, facilitating students’ professional activities, assuring individualization through selectable workplaces, and communicating the relevance of ILOs. We used Peyton’s four-step approach to convey occupationally specific skills and to achieve lasting learning success. The purpose of our study was to evaluate satisfaction with structured WBL in the course of the half-day clinical radiology IE among both students and supervisors.
Our prospective, single-centre cohort study included third-year medical students of the 2019 summer semester who underwent the obligatory half-day clinical radiology IE at the department of radiology of a German university hospital. We designed our pre-experimental study for post-test-only evaluation (students) and pre-test–post-test evaluation (physicians), respectively. The radiology IE takes place in the first year of the clinical part of medical studies, after completion of the preclinical part. In parallel, students attend their first lectures on clinical radiology during this period. Senior and resident physicians participated in the study as supervisors according to their regular duty rosters. Students participated in the newly implemented concept of structured WBL, which was based on three established didactic approaches. First, we applied blended learning [3, 4] by providing students with learning videos to prepare themselves for the topics they had to pass through during the IE. Second, we incorporated the FAIR principles (feedback, activity, individualization, and relevance). And third, we implemented Peyton’s four-step approach (demonstration, deconstruction, comprehension, and execution).
Medical students prepared for the practical radiology IE in a self-determined manner by means of two learning videos (10 min each) on the obligatory topics of the practical course. During the four-hour IE at the department of radiology, two students were allocated to one responsible senior physician, and one student to one resident physician at each workplace. The actual practical training started with a 30-min guided tour of the department of radiology conducted by the responsible senior physician. Subsequently, students started their work at the first workplace. Here, supported by a resident physician, they had to read chest X-ray images and create a medical report. At the second workplace, students had to inform a patient about preparation for magnetic resonance imaging (MRI) or computed tomography (CT), including the administration of contrast agent, and to obtain the patient’s written informed consent (under the permanent control and responsibility of the supervising physician). Resident physicians had to assess students’ procedural skills at both obligatory workplaces separately by means of mini clinical evaluation exercises (mini-CEX) (Supplementary information: Additional file 2). Residents were encouraged to directly explain and discuss fulfilled and unfulfilled requirements with the students (optional feedback). Afterwards, students attended the third, elective workplace. Overall, they spent 1 h at each workplace.
After proceeding through the workplaces, and equipped with their medical report, the patient’s written informed consent, and the two completed mini-CEX forms, students underwent a 30-min debriefing including final feedback on their performance concerning the ILOs. Debriefing and feedback were conducted by the responsible senior physician (Fig. 1).
Immediately after the feedback, we asked all students to participate in a paper-based survey on seven items regarding their satisfaction with the new concept of the structured WBL experience. In addition, research assistants interviewed volunteer students about their experience. We invited all supervisors to participate in an online survey (LimeSurvey, an open-source survey tool; LimeSurvey GmbH, Hamburg, Germany). Questions corresponded to three- or five-point bipolar Likert scales. For evaluation, we anonymized all survey data, and thus the local ethics committee exempted our study from the obligation of approval.
Intended learning outcomes
Before the practical course took place, we described and communicated ILOs according to Biggs and Tang to the students:
I am able to apply a systematic approach to read a chest X-ray image and to prepare a radiological report.
I am able to inform a patient on preparation for MRI/CT.
I am able to reflect on the relevance of acquired knowledge and skills in clinical radiology.
ILOs were chosen for their relevance to students’ later professional practice. Their description was student-centred and thus emphasized the value the radiology IE should bring to the students (principle of “What’s in it for me?” [WIIFM]). During the IE, medical students should be engaged in the activities most appropriate to the ILOs. Already in the planning phase of the structured WBL concept, we aimed to establish constructive alignment [2, 9] across ILOs, students’ activities at the workplaces, and assessment of tasks. We aligned ILOs to real-world clinical situations.
Applied didactical concepts
In the course of the radiology IE, we mainly based structured WBL on the application of three didactic approaches: blended learning, the FAIR principles, and Peyton’s four-step approach (Fig. 1).
Blended learning [3, 4, 10, 11] in our study designates the combination of pre-IE online delivery of content via learning videos with face-to-face experience at the workplaces. Beforehand, the radiology faculty had prepared two experience-level-appropriate videos, 10 min in length each, on the two obligatory workplace topics. The videos were available to students at all times on the university content management system, and students were asked to watch them in preparation for the clinical radiology IE. Students were already familiar with the concept of blended learning because it is routinely applied in other radiology lectures, seminars, and courses.
Harden et al. introduced the FAIR principles of teaching to increase the effectiveness of learning. The principles include the recommendations to provide appropriate feedback to students, to encourage active learning, to align learning with individual needs, and to make learning relevant to intended outcomes. We incorporated the four principles into the structured WBL concept (Supplementary information: Additional file 1).
During their radiology IE, students received optional timely feedback from the resident physicians in the course of the two mini-CEX on the obligatory workplace topics. In addition, at the end of the radiology IE, the responsible supervisor provided closing feedback. We instructed supervisors to be explanatory and specific, and feedback was to refer to the ILOs.
Active engagement promotes lasting learning success. Thus, students were actively involved in the diagnostic clinical setting of the workplaces at the department of radiology.
Students could choose the radiologic topic of their third workplace that best suited their individual needs. Six options were available: ultrasound examination, interventional radiology, X-ray, magnetic resonance imaging, computed tomography, and mammography.
We applied relevance as the criterion for selecting the workplace topics and the corresponding assessment tasks. In the clinical context, students can realize the usefulness of the ILOs by putting theory into practice.
Peyton’s four-step approach
We integrated the task-centred Peyton’s four-step approach into the structured WBL concept. The first step, demonstration, included watching the learning videos and observing and listening to the resident physician at the workplace.
During the second step, deconstruction, the resident slowly repeated the respective activity and described each procedural step in detail. Students could ask questions. The third step, comprehension, allowed the student to guide the resident through the procedure. Finally, the fourth step, execution, consisted of the student independently performing the activity (Table 1).
Primary outcomes of equal weight were student and supervisor satisfaction with the clinical radiology IE according to the paper-based and online surveys, respectively. The secondary outcome was achievement of ILOs, assessed by mini-CEX and personal interviews.
Assessment of intended learning outcomes
Residents assessed the clinical performance of students regarding the first two ILOs at each obligatory workplace using the mini-CEX tool. They based their assessment on direct observation of the students’ skills in the clinical situation. Radiographic reports were assessed based on clinical impression, structure, style, form, and wording. Mini-CEX rating forms consisted of 11 domains for the first ILO (workplace 1) and 7 domains for the second ILO (workplace 2) (Supplementary information: Additional file 2). The domains of the mini-CEX included cognitive and psychomotor competencies. Cognitive domains covered factual knowledge (e.g., naming of pathologies), problem solving (e.g., identification of pathologies, organisation), and clinical decision making (e.g., creating a medical report). Psychomotor competencies included behavioural competencies (e.g., checking patient data) and skill competencies (e.g., visual: checking image quality; hearing: medical interviewing; speech: patient education). Residents rated each particular competence on a three-point scale (worthy of improvement, requirements met, excellent performance). The mini-CEX included optional brief oral and written feedback on the student’s strengths and suggestions for improvement. Students handed the completed mini-CEX forms to the responsible senior physician at debriefing, to be used in an effort to provide final formative and summative feedback [10, 13]. The third predefined ILO, which covers affective learning (whether students value the importance of learning), was assessed in personal interviews by research assistants.
Categorical data are presented as counts and percentages. Differences between senior and resident physicians were assessed with Fisher’s exact test. P-values refer to the comparison of proportions of satisfied and unsatisfied votes between senior and resident physicians, without consideration of neutral votes. A two-sided value of p < 0.05 was considered statistically significant. We performed analyses using XLSTAT (version 2015.6.01.24026, Addinsoft, Paris, France).
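For readers who wish to reproduce this kind of comparison of proportions, the two-sided Fisher's exact test on a 2 × 2 table can be sketched from first principles. The counts below are purely illustrative (they are not the study data), and the implementation assumes only the Python standard library:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric point probabilities of all tables with the
    same margins that are no more likely than the observed table.
    """
    r1, r2 = a + b, c + d          # row totals
    c1, n = a + c, a + b + c + d   # first column total, grand total

    def p_table(x):                # point probability of a table with cell (1,1) = x
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)

    p_obs = p_table(a)
    # Iterate over all feasible values of the top-left cell.
    return sum(
        p_table(x)
        for x in range(max(0, c1 - r2), min(r1, c1) + 1)
        if p_table(x) <= p_obs * (1 + 1e-9)  # tolerance for float comparison
    )

# Illustrative (hypothetical) counts: 2 of 10 senior vs. 0 of 6 resident
# physicians unsatisfied on some item.
p = fisher_exact_two_sided(2, 8, 0, 6)
print(p)  # 0.5
```

With such small per-group counts, the resulting p-values are far above 0.05, which is consistent with the study reporting no significant differences between senior and resident physicians.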
In the 2019 summer semester, we prospectively enrolled a total of 228 undergraduate medical students into our single-centre observational study on structured work-based learning related to the clinical radiology IE at the department of radiology of a German university hospital. A total of 85.5% of students (195 of 228) completed the paper-based student survey and personal interview, and 61.5% of physicians (16 of 26) completed the online supervisor survey (senior physicians 71.4% [10 of 14], resident physicians 50% [6 of 12]). Resident physicians rated the mini-CEX of all students as “excellent performance” in all domains.
In general, the majority of students were very satisfied (87.2%) or satisfied (11.8%) with the structured work-based learning they experienced during the clinical radiology IE. The proportion of unsatisfied students was low (0.5% regarding support by supervisors and 0.6% each regarding learning videos and feedback) (Fig. 2).
Personal interviews immediately following the radiological IE revealed that the preparatory learning videos were considered useful and attractive. Self-determined preparation increased students’ confidence at the workplaces. However, students criticised a partially poor implementation, such as lengthy coverage of too many details or insufficient video sound quality. They suggested providing handouts and additional videos on the elective workplaces.
Regarding the obligatory topics of chest X-ray and patient information, students were divided on their appropriateness. Some students would have preferred radiology-specific topics such as MRI and CT instead of patient information.
Support from supervisors was deemed best in chest X-ray image reading. Owing to the high workload at the CT workplace in the emergency department, some students experienced poor support at their elective workplace. Under conditions of visual and acoustic unease, some students would have preferred a separate room for patient information. The closing feedback was generally experienced as appreciative, and students felt taken seriously by supervisors. However, in the view of a few students, the feedback was redundant.
The timetable and the switching between workplaces were broadly accepted. In addition, students suggested taking greater account of the patient’s path through the diagnostic steps to gain an overview of the radiological pathway. Finally, students were positively impressed by the organisation of the radiological IE. Their expectations were exceeded. The IE not only gave insights and increased occupational identity but also got some of the participants strongly interested in the field of radiology (Table 2).
A total of 75% of supervisors (12 of 16) were satisfied with the preparation of students for the workplaces by means of the learning videos. However, they stated that some students had admitted not having watched the videos, and thus numbers might be underreported. Regarding the obligatory topics, 75% of supervisors were satisfied. One senior physician complained about the limited individual freedom to teach. Two senior physicians (13.3%) were unsatisfied with the time necessary for supervision, and two senior and two resident physicians (25% [4 of 16]) were unsatisfied with the mini-CEX evaluation questionnaire. The ratings of student preparation, obligatory topics, time spent, and mini-CEX did not differ significantly between senior and resident physicians. Fifty percent (5 of 10) of senior physicians rated the value of the final feedback as high and 20% (2 of 10) as low (Fig. 3a).
A majority (60% [9 of 15]) of the supervising physicians would offer a clinical radiology IE in accordance with the underlying concept of work-based learning once again (Fig. 3b). They acknowledged the clear guidance and the improved quality of structure and organisation.
Our single-centre, observational study evaluated satisfaction with structured WBL among students and supervisors in the course of a four-hour clinical radiology IE. The core of structured WBL was constructive alignment of competence- and skill-centred ILOs with the content of the preparatory learning videos, workplace activities, assessment, and feedback. The study demonstrated that the majority of students were satisfied with the structured manner of the IE. They felt professionally appreciated. A few shortcomings were reported regarding the technical and didactic quality of the learning videos and temporarily limited support by supervisors in case of high workload at the emergency department. Satisfaction of supervisors was somewhat lower, in particular regarding the assessment of ILOs by mini-CEX and the time spent on supervision.
At the workplaces, students were supported and challenged by resident physicians to participate and apply their competences. Applying clinical knowledge and skills with actual patients requires students to organize and recall information and to use heuristics (“mental shortcuts”), representing a high level of academic performance. Thus, we expected our students to achieve high-level ILOs. In accordance with Biggs and Tang, we assume that motivation and learning are enhanced not only by relevance but also by challenging ILOs. Timely communication of ILOs and constructive alignment across all steps of the radiology IE up to the final feedback were the backbone of structured WBL. In the future, a team of supervisors might be involved to a greater extent in the decision and revision process on ILOs.
Previous studies found learning videos to be as effective as live lectures in preparation for medical exams. Afzal and Masroor reported no difference in outcomes of the end-of-radiology-clerkship test regarding chest X-ray interpretation between students who underwent blended learning and those who received a traditional lecture. Videos allow students to determine their own pace and to replay the lecture at any time. In addition, preparation saves time and encourages student confidence at the workplaces. Thus, in our opinion, self-determined learning by watching videos on the ILOs served as appropriate preparation for the workplace activities. However, “demonstration” is only the first step of Peyton’s approach, and further steps, including active decision making and performance, are required for lasting success.
Self-determined pre-IE preparation and the one-to-one assignment between student and resident supervisor may have helped to increase students’ confidence and communication at the workplaces and, in turn, to improve motivation and performance regarding the ILOs. Throughout the IE, students were supported to change from observing to contributing participation. They had to step outside their comfort zone.
Peyton’s four-step approach, which progressively leads students to clinical performance, has already been proven superior to standard instruction, with Peyton’s step 3 (“trainee talks trainer through”) accounting for the biggest share of success regarding task and memory performance compared with steps 1 and 2. Observation of skills combined with (motor) imagery of performance is considered superior to observation alone.
Support for students was provided in the form of the organizational conditions of the radiology IE structure, including faculty resources, the timetable, and organized activities; in the form of formal didactic support by supervisors; and through affective support in the form of positive attitudes towards students. In addition, owing to the increased consideration of resources and time management, structured WBL might have given supervising residents greater backing for the challenge of managing student training, patient needs, and administrative duties.
A key objective of the assessment of clinical skills is to facilitate learning. The assessment tool of the mini-CEX was aligned with the ILOs, the activities at the workplaces, and the feedback. The assessment was intended to cover how well students achieved the ILOs, not how well they reported what supervisors told them. However, the resident physicians who conducted the formative assessment were potentially biased by their relationship with their students and their process of learning at the workplace. As a result, in our study, residents consistently judged students’ performance as excellent. Poor accuracy of faculty assessment has been described previously [22, 23]. Senior physicians had to provide a more objective debriefing with the help of active reflection by students. In some cases, this approach resulted in duplications and contradictions. In the future, the mini-CEX might serve more as a checklist of specific observed tasks rather than a rating tool and emphasize residents’ suggestions for improvement. In general, the major purpose of feedback is to reduce the discrepancy between current and desired practices or competences. Thus, the final debriefing with the senior physician may include a responsive feedback dialogue to provide a “plan of action” and advise students regarding available resources.
Our study has some limitations. First, it was a single-group, pre-experimental evaluation to document the proficiency and process of structured WBL in the clinical radiology IE. We did not compare participants’ satisfaction or outcomes of structured WBL with the traditional radiological IE. Therefore, accomplishments may be due to factors other than the novel structured WBL concept. In addition, the mini-CEX did not reflect the actual competence of students due to probable rater biases and thus did not provide a reliable assessment of achieved ILOs, but rather encouraged students on their path towards professional competence. Therefore, at this time, we cannot quantify the efficacy of structured WBL.
Structured WBL in the course of the clinical radiology IE was feasible and well accepted among participating students and supervisors. Students felt appreciated at the professional level, and supervisors were supported by the organizational conditions. Based on our study, we decided to continue offering structured WBL at the department of radiology. However, both the diversity and the quality of the learning videos should be improved. In addition, assessment and feedback should place particular emphasis on providing students with information on where and how to improve or broaden their competences rather than on judging their abilities. The efficacy of structured WBL regarding learning success and sustainability has still to be demonstrated.
Availability of data and materials
The datasets generated and analysed during the current study are not publicly available due to requirements of the ethics committee but are available from the corresponding author on reasonable request.
- FAIR principles: Principles of feedback, activity, individualization, and relevance
- ILO: Intended learning outcomes
- mini-CEX: Mini clinical evaluation exercises
1. Sajjad M, Mahboob U. Improving workplace-based learning for undergraduate medical students. Pak J Med Sci. 2015;31(5):1272–4.
2. Biggs J, Tang C. Teaching for quality learning at university. 4th ed. Maidenhead: Open University Press; 2011.
3. Kyaw BM, Posadzki P, Paddock S, Car J, Campbell J, Tudor CL. Effectiveness of digital education on communication skills among medical students: systematic review and meta-analysis by the digital health education collaboration. J Med Internet Res. 2019;21(8):e12967.
4. Vallée A, Blacher J, Cariou A, Sorbets E. Blended learning compared to traditional learning in medical education: systematic review and meta-analysis. J Med Internet Res. 2020;22(8):e16504.
5. Harden RM, Laidlaw JM. Be FAIR to students: four principles that lead to more effective learning. Med Teach. 2013;35(1):27–31.
6. Peyton JWR. Teaching and learning in medical practice. In: Peyton JWR, editor. The learning cycle. Rickmansworth: Manticore Europe Limited; 1998. p. 13–9.
7. Thomas PA, Kern DE, Hughes MT, Chen BY. Curriculum development for medical education: a six-step approach. 3rd ed. Baltimore: Johns Hopkins University Press; 2016.
8. Teichgräber U, Ingwersen M, Mentzel HJ, Aschenbach R, Neumann R, Franiel T, et al. Impact of a heutagogical, multimedia-based teaching concept to promote self-determined, cooperative student learning in clinical radiology. Rofo. 2020. https://doi.org/10.1055/a-1313-7924.
9. Biggs J. Enhancing teaching through constructive alignment. High Educ. 1996;32(3):347–64.
10. Persky AM, McLaughlin JE. The flipped classroom - from theory to practice in health professional education. Am J Pharm Educ. 2017;81(6):118.
11. Vavasseur A, Muscari F, Meyrignac O, Nodot M, Dedouit F, Revel-Mouroz P, et al. Blended learning of radiology improves medical students’ performance, satisfaction, and engagement. Insights Imaging. 2020;11(1):61.
12. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: a method for assessing clinical skills. Ann Intern Med. 2003;138(6):476–81.
13. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE guide no. 31. Med Teach. 2007;29(9):855–71.
14. Leeds FS, Atwa KM, Cook AM, Conway KA, Crawford TN. Teaching heuristics and mnemonics to improve generation of differential diagnoses. Med Educ Online. 2020;25(1):1742967.
15. Brockfeld T, Müller B, de Laffolie J. Video versus live lecture courses: a comparative evaluation of lecture types and results. Med Educ Online. 2018;23(1):1555434.
16. Afzal S, Masroor I, Afzal A, Babar S. Flipped classroom model for teaching undergraduate students in radiology making lectures memorable: a cognitive perspective. J Coll Physicians Surg Pak. 2019;29(11):1083–6.
17. Dornan T, Boshuizen H, King N, Scherpbier A. Experience-based learning: a model linking the processes and outcomes of medical students’ workplace learning. Med Educ. 2007;41(1):84–91.
18. Krautter M, Weyrich P, Schultz JH, Buss SJ, Maatouk I, Junger J, et al. Effects of Peyton’s four-step approach on objective performance measures in technical skills training: a controlled trial. Teach Learn Med. 2011;23(3):244–50.
19. Krautter M, Dittrich R, Safi A, Krautter J, Maatouk I, Moeltner A, et al. Peyton’s four-step approach: differential effects of single instructional steps on procedural and memory performance - a clarification study. Adv Med Educ Pract. 2015;6:399–406.
20. Dornan T, Conn R, Monaghan H, Kearney G, Gillespie H, Bennett D. Experience Based Learning (ExBL): clinical teaching for the twenty-first century. Med Teach. 2019;41(10):1098–105.
21. Shepard LA. The role of assessment in a learning culture. Educ Res. 2000;29(7):4–14.
22. Noel GL, Herbers JE Jr, Caplow MP, Cooper GS, Pangaro LN, Harvey J. How well do internal medicine faculty members evaluate the clinical skills of residents? Ann Intern Med. 1992;117(9):757–65.
23. Berendonk C, Rogausch A, Gemperli A, Himmel W. Variability and dimensionality of students’ and supervisors’ mini-CEX scores in undergraduate medical clerkships - a multilevel factor analysis. BMC Med Educ. 2018;18(1):100.
24. Hattie J, Timperley H. The power of feedback. Rev Educ Res. 2007;77(1):81–112.
25. Molloy E, Ajjawi R, Bearman M, Noble C, Rudland J, Ryan A. Challenging feedback myths: values, learner involvement and promoting effects beyond the immediate task. Med Educ. 2020;54(1):33–9.
The authors thank Laura Graziani for project administration and language editing and Andrea Scholz-Buchholz for student interviews and conducting surveys. Authors especially thank all supervising residents and senior radiologists who supported the students in the WBL radiology IE.
This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors. Open Access funding enabled and organized by Projekt DEAL.
Ethics approval and consent to participate
For evaluation, we anonymized all survey data, and thus, local ethics committee (Ethics commission of the Friedrich-Schiller-University Jena) exempted our study from the obligation of approval. The study was performed in accordance with the 1964 Helsinki declaration and its later amendments. Informed consent was obtained from all individual participants included in the study.
Competing interests
The authors declare that they have no competing interests.
Cite this article
Teichgräber, U., Ingwersen, M., Bürckenmeyer, F. et al. Structured work-based learning in undergraduate clinical radiology immersion experience. BMC Med Educ 21, 167 (2021). https://doi.org/10.1186/s12909-021-02592-0
- Distance education and online learning
- Immersion learning
- Undergraduate medical education
- Peyton’s four-step approach
- Workplace learning