Table 5 Principles and recommendations, from a systematic review and meta-analysis on undergraduate radiology education

From: What works in radiology education for medical students: a systematic review and meta-analysis

Study design

Principle: Comparison of the intervention with a control group.

Notes: Many excluded articles reported educational research without a control group. These ‘justification studies’ usually yield large effect sizes and are not necessarily informative on the efficacy of an intervention relative to an existing program [86].

Recommendation: Studies comparing educational interventions should include a direct comparison between a control and an intervention cohort. Ideally, pre- and post-tests should be conducted for both groups, and post-test results adjusted, or randomisation stratified, for baseline differences.

Principle: Quantitative assessment of knowledge / skills, rather than subjective perceived gains in knowledge.

Notes: Many excluded articles involved qualitative analysis without quantitative analysis of knowledge or skill acquisition. Perceptions can differ from objectively measured attainment of knowledge or skills [65, 66].

Recommendation: Where the research question includes assessment of knowledge / skills gains, quantitative analysis of knowledge and / or skill acquisition should be conducted rather than assessment of participants’ opinions of the knowledge / skills they gained.

Principle: Definitions of experimental and control group educational interventions.

Notes: Ambiguity in descriptions of educational interventions can limit accurate comparison and reproducibility. Control groups were often reported in less detail.

Recommendation: Detailed descriptions of both the educational intervention and the control treatment should be reported. As a minimum, this should include the student cohort, topics studied, methods of delivery and teaching time.

Principle: Immediate post-intervention versus delayed testing.

Notes: Knowledge fades over time [87,88,89], and delayed testing could inform on the degree of reinforcement required to maintain knowledge for future clinical practice. Only a small number of studies described delayed testing.

Recommendation: Studies using both short- and long-term knowledge retention testing should be conducted when evaluating medical student radiology education programs.

Principle: Reporting of studies with negative results (publication bias).

Notes: There was evidence of publication bias, with small studies showing relatively large effect sizes.

Recommendation: Methodologically sound research should be published regardless of whether the outcome is positive or negative. Alternatively, researchers should consider using state-of-the-art, statistically principled bias correction methods [90].

Design of educational interventions

Principle: Heterogeneity in radiology education interventions and examinations.

Notes: Exposure to radiology teaching in medical schools, and medical students’ subsequent imaging knowledge, varies. Methods of assessment also vary, as demonstrated in this study. This heterogeneity could confound study results.

Recommendation: Suggested radiology curricula exist [91, 92], and greater adoption of these could reduce heterogeneity for future studies. Adoption of standardised medical student radiology examinations with validated questions could also help drive more uniform curriculum development and its evaluation [93].

Principle: Thematic analysis suggests synergies exist between radiology and anatomy education.

Notes: Cross-sectional imaging, including CT and ultrasound, has been used to teach anatomy. 3D representations, where possible, may be superior to 2D stacks of images.

Recommendation: Anatomy and imaging education can be synergistic. However, the method of displaying anatomy may affect educational effectiveness; more studies are needed to investigate this.

Principle: Thematic analysis suggests active learning could produce superior gains in knowledge or skill acquisition compared with passive learning.

Notes: All studies directly comparing knowledge or skill acquisition in active versus passive learning had effect sizes favouring active learning. The interactivity of active learning was also associated with greater student satisfaction.

Recommendation: Passive or didactic learning could be used to introduce theory. Active learning should be used to revise and apply theory; examples include imaging selection or interpretation exercises.

Principle: eLearning is equivalent to traditional face-to-face education.

Notes: eLearning can be synergistic with traditional teaching. However, its effectiveness varies with instructional design: thematic analysis suggests that active learning, or methods utilising guided teaching, is associated with higher effectiveness.

Recommendation: eLearning should contain an interactive component and be produced in keeping with the best principles of instructional design; examples include ‘worked examples’ or practice questions with feedback.