
Comparing computer-assisted learning activities for learning clinical neuroscience: a randomized control trial

Abstract

Background

Computer-assisted learning (CAL) has been suggested to improve enjoyment and learning efficacy in medical education and, more specifically, in neuroscience. CAL resources range from text-based websites to interactive electronic modules (eModules), and it remains uncertain how they can best be implemented. To assess the effects of interactivity on learning perceptions and efficacy, we compared the utility of an eModule using virtual clinical cases and graphics against a Wikipedia-like page of matching content for teaching clinical neuroscience: the fundamentals of stroke and cerebrovascular anatomy.

Methods

A randomized controlled trial comparing an interactive eModule with a non-interactive Wikipedia-like page was performed. Participants remotely accessed their allocated learning activity once, for approximately 30 min. The primary outcome was the difference in perceptions of enjoyability, engagement and usefulness. The secondary outcome was the difference in learning efficacy between the two learning activities. These outcomes were assessed using a Likert-scale survey and two knowledge quizzes: one immediately after the learning activity and one repeated eight weeks later. Assessments were analysed using Mann–Whitney U tests and t-tests, respectively.

Results

Thirty-two medical students participated and were allocated evenly between the two groups through randomisation. The eModule was perceived as significantly more engaging (p = 0.0005), useful (p = 0.01) and enjoyable (p = 0.001) by students, with interactivity and clinical cases being the main contributing factors. There was a significant decrease between the first and second quiz scores for both the eModule group (-16%, p = 0.001) and the Wikipedia group (-17%, p = 0.003). There was no significant difference in quiz scores between the eModule and Wikipedia groups immediately after the learning activity (86% vs 85%, p = 0.8) or after eight weeks (71% vs 68%, p = 0.7).

Conclusion

Our study shows that the increased student satisfaction associated with interactive computer-assisted learning in the form of an eModule does not translate into increased learning efficacy compared with a Wikipedia-like webpage. This suggests the matched content of the passive webpage provides similar learning efficacy. Still, eModules can help motivate self-directed learners and overcome the perceived difficulty associated with neuroscience. As computer-assisted learning continues to expand rapidly among medical schools, we suggest educators critically evaluate the usage and cost–benefit of eModules.


Background

Digitalization of medical education has driven the development of different forms of computer-assisted learning (CAL), defined as the use of any computer software to deliver or facilitate a learning experience [1]. One of the main proposed benefits of CAL is its flexibility and convenience [2]. It plays an important role in medical education [2, 3], has previously been implemented in subjects such as pharmacology, rheumatology, surgery and radiology [4,5,6,7,8,9], and could be well placed to assist in studying neuroscience and neuroanatomy [10,11,12].

Electronic modules (eModules) are a form of CAL: digital learning packages that can integrate written subject content with multimedia graphics, interactive questions, tailored feedback and clinical cases [13]. In neuroscience education, previous studies have shown that these features increase user enjoyment, motivation and test performance compared to traditional forms of teaching [14,15,16].

Another form of CAL is the use of online websites with medical information as an educational source, consisting of user-generated content, traditional peer-reviewed content or a mixture of both [17]. One of the most commonly used is the user-generated content website Wikipedia (Wikimedia Foundation, Inc., San Francisco, California, United States), which is used by up to 94% of medical students at certain institutions [18, 19]. Its main benefits are its accessibility and the range of topics available [17], but it is distinguished from eModules by its lack of interactivity, complex 3D graphics, virtual cases and reliable peer review.

‘Neurophobia’, the fear of neuroscience, is common among both healthcare students and professionals [20, 21]. A large-scale survey of UK medical students in 2014 found that neurology was rated significantly more difficult to learn than other specialties [22]. Contributing to neurophobia is the perception that neuroscience and neuroanatomy are challenging to understand and difficult to teach [10, 23, 24]. It remains uncertain how neuroscience can best be taught and learnt [25]. The strengths of CAL, such as interactive graphical representations of complex anatomy and flexible access, have been suggested to be of benefit in teaching neurology [26].

Computer-assisted learning has been implemented across medical education, but it remains uncertain which specific methods improve learning outcomes in neuroscience and help students overcome neurophobia. No studies to date have compared different types of CAL resources in neuroscience. This study presents a novel, interactive eModule aimed at medical students that integrates virtual clinical cases to enhance learning of fundamental neuroscience concepts, and reports a randomized controlled trial to determine its utility compared with a Wikipedia-like source. The primary outcome was the difference in perceptions of engagement, usefulness and enjoyment between the two learning activities; the secondary outcome was the difference in learning efficacy.

Methods

Participants

Participants were recruited from medical schools across London using administrative email lists and student social media groups. Recruitment spanned 7 months, from March 2019 until September 2019. The inclusion criterion was enrolment, at the time of recruitment, in a U.K. medicine course with ongoing clinical placements. Healthcare professionals, medical graduates and students from disciplines other than medicine were excluded. Baseline characteristics were collected on the device used to access the allocated learning activity and on previous exposure to clinical neuroscience. While all participants were medical students, some had greater exposure to neuroscience than others, for example through a previous neuroscience degree completed before their medical studies. Students were therefore categorised by their highest prior neuroscience education: those with previous graduate-level experience as 'neuroscience postgraduate', those with undergraduate-level experience as 'neuroscience undergraduate', those with no prior higher neuroscience education as 'medical undergraduate', and those who had not yet undertaken a neuroscience placement as a medical student as 'secondary school'.

eModule design

The eModule on stroke and cerebrovascular anatomy was part of a series of neuroscience modules designed and developed by medical students over the span of 6 months with multidisciplinary input from a senior neurology professor and an educational specialist, supervised by an academic neurosurgical fellow [27]. It took an estimated 60 h to fully develop the module. The learning objectives were to recognise common presentations of stroke, know the anatomy of the cerebral arterial system, understand the relationship between the arterial system and clinical presentation, and understand the core clinical management of stroke. A case-based approach with clinically relevant neuroimaging was used to teach cerebrovascular pathology and the cerebral arterial system, and the module was designed to take approximately 20 to 30 min to complete. The computer software Storyline 360 (Articulate Global, Inc., New York, NY) enabled interactive features such as drag-and-drop, multiple-choice and click-and-point questions, and the module was accessible on Android, iOS and all Flash-supporting web browsers. Storyline 360 is part of an Articulate 360 subscription and costs $649 per academic user per annum [28]. All images used in the module were obtained from Creative Commons licensed sources. Figure 1 shows a screenshot of the eModule; a weblink and additional screenshots are provided in the Supplemental Material.

Fig. 1 Screenshot of the eModule. An example of the drag-and-drop questions regarding the anatomy of the circle of Willis. The schematic representation of the circle of Willis was made by Rhcastilhos [29] and released into the public domain

Wiki design

As part of the study, a Wikipedia-style page was created as a 'control' CAL activity using Wikidot.com (Wikidot Inc., Torun, Poland). The content of the Wiki page was identical to that of the eModule, including the same images, but without interactivity or clinical cases. The design of the Wiki page was based on the structure used in medical articles on Wikipedia. Creating the Wiki page took an estimated 10 h. Figure 2 provides screenshots of the Wiki page; a weblink is provided in the Supplemental Material.

Fig. 2 Screenshots of the Wiki page, highlighting the factual Wikipedia-like delivery and structure of the information on stroke and the circle of Willis. The photo of the circle of Willis is published under a Creative Commons Attribution-Share Alike 3.0 Unported licence [30]

Study protocol

The aim of this study was to compare the efficacy and enjoyability of an interactive eModule against a 'passive' Wikipedia-like webpage. To test the additional features of the eModule described above, participants were randomized at the point of enrolment to either the eModule or the Wiki group using an allocation sequence administered only by the first author. After randomization, participants were asked to complete their assigned intervention, immediately followed by a survey and quiz. After anonymized data extraction, the study authors were blinded to group allocation. Both groups were asked to complete their activity on a device of their choice at a time convenient for them. The learning portion for each group was estimated to take 20 to 30 min. Time spent on either learning activity was self-reported by participants, as the software used did not allow objective time measurement. All participants were asked to complete a second quiz after 8 weeks to assess retention of knowledge.
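
The allocation procedure can be illustrated with a short sketch. The study does not report the exact method used to generate the sequence, so the balanced 1:1 shuffle and fixed seed below are assumptions for illustration only:

```python
# Hypothetical sketch of a balanced 1:1 allocation sequence for 32
# participants; the study does not report its exact generation method.
import random

def allocation_sequence(n_per_group: int = 16, seed: int = 42) -> list[str]:
    """Return a shuffled list containing equal numbers of each group label."""
    sequence = ["eModule"] * n_per_group + ["Wiki"] * n_per_group
    random.Random(seed).shuffle(sequence)  # fixed seed: reproducible sequence
    return sequence

# The enrolling investigator assigns each new participant the next label.
print(allocation_sequence()[:5])
```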

Study outcomes

The primary outcome measured in this study was the participants' perception of their completed intervention, assessed via an online survey. A five-point Likert scale, ranging from strongly disagree to strongly agree, was used to assess how (1) enjoyable, (2) engaging, (3) recommendable and (4) useful the learning activity was perceived to be. Additionally, participants were asked to state which aspects of the intervention they found aided their learning, in order to assess whether the matched content was found similarly useful between the two learning activities and to measure the proportion of the eModule group finding its unique features beneficial. The survey design was based on feedback forms used previously on the learning platform of the eModule's design team, and was reviewed by the educational specialist as part of the eModule's implementation. To test whether the two learning activities were accessed similarly, participants were asked which device was used to complete the intervention and how much time was spent on it.

The secondary outcome was the efficacy of the learning activity, assessed using an electronic quiz completed immediately after the intervention. This quiz consisted of ten multiple-choice questions with a single best answer based on the content taught in the module, resembling the written exams commonly used in medical courses. The questions and answers were designed by the same team that developed the eModule, including a consultant neurologist and a professor of neurology (both of whom have previously been part of institutional faculty setting formal examination questions). The aim was to assess the stated learning objectives of the learning activity. The same combination of question types was used in each quiz: several involving simple recall, some short case-based questions requiring two-stage recall, and a few harder questions requiring recall from multiple sources within the learning activities and weighing up of this information to determine an answer. A pre-specified answer grid of correct responses was created for each quiz. No negative marking was used, and the score is presented as the proportion of questions answered correctly. A second quiz was sent after 8 weeks to assess retention; it consisted of questions assessing the same topics as the first quiz but with changes made to the details of the questions. The quiz was validated during a preliminary study of the eModule with 14 medical students and junior doctors, which showed knowledge improvement after completion of the activity compared to before. Participants of the preliminary study were excluded from the study reported in this paper.
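
For illustration, marking under this scheme reduces to comparing each response with the pre-specified answer grid; the grid and responses below are hypothetical, not taken from the study:

```python
# Hypothetical sketch of scoring a ten-item single-best-answer quiz
# against a pre-specified answer grid, with no negative marking.
def score_quiz(responses: list[str], answer_grid: list[str]) -> float:
    """Return the score as the proportion of questions answered correctly."""
    assert len(responses) == len(answer_grid)
    correct = sum(r == a for r, a in zip(responses, answer_grid))
    return correct / len(answer_grid)

answer_grid = list("BADCEABDCB")           # hypothetical correct options
responses = list("BADCEABDAB")             # one participant's answers
print(score_quiz(responses, answer_grid))  # 0.9, reported as 90%
```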

Statistical analysis

Fisher's exact test was used to assess statistical differences in demographic information. The statistical difference in Likert-scale survey scores was assessed using the Mann–Whitney U test, treating the data as ordinal. Reliability of the survey was assessed by calculating Cronbach's alpha, with a coefficient of 0.7 or higher considered acceptable internal consistency [31]. An unpaired two-tailed t-test was used to assess the statistical difference in quiz results between the two groups at each time point. Additionally, a paired two-tailed t-test was used to assess the statistical difference in score between the first and second quiz within each group. Statistical significance was defined as a p-value < 0.05. A sample size of 15 per group was calculated to be required to detect a large difference in knowledge retention between the first and second quiz, with a power of 80% and assuming an effect size of 0.8 or higher [32]. To adjust for potential confounders, namely previous neuroscience experience and the device used to access the learning activity, an analysis of covariance (ANCOVA) was performed. This tested the effect of the learning activity on the results of the first quiz, the second quiz and the difference between the two quizzes while accounting for these variables.
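
As a concrete illustration of this analysis plan, the sketch below reruns each of the named tests on invented data using scipy and statsmodels; none of the numbers or variable names come from the study itself:

```python
# Illustrative sketch (not the authors' code) of the tests named above,
# run on invented data. Requires numpy, pandas, scipy and statsmodels.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm
import statsmodels.formula.api as smf
from statsmodels.stats.power import TTestPower

rng = np.random.default_rng(0)

# Hypothetical Likert responses (1-5) for one survey item, per group.
emodule_likert = [5, 4, 5, 4, 5, 4, 4, 5, 5, 4, 3, 5, 4, 5, 4, 5]
wiki_likert = [3, 4, 3, 2, 4, 3, 3, 4, 2, 3, 4, 3, 3, 2, 4, 3]
u, p_likert = stats.mannwhitneyu(emodule_likert, wiki_likert,
                                 alternative="two-sided")  # ordinal data

# Hypothetical quiz scores (proportion correct) per group and time point.
emodule_q1 = rng.normal(0.86, 0.08, 16).clip(0, 1)
wiki_q1 = rng.normal(0.85, 0.08, 16).clip(0, 1)
emodule_q2 = rng.normal(0.71, 0.10, 16).clip(0, 1)

# Unpaired two-tailed t-test: between-group difference at one time point.
t_between, p_between = stats.ttest_ind(emodule_q1, wiki_q1)

# Paired two-tailed t-test: within-group change from first to second quiz.
t_within, p_within = stats.ttest_rel(emodule_q1, emodule_q2)

# Cronbach's alpha for survey reliability (rows: respondents, cols: items).
def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

items = rng.integers(3, 6, size=(16, 4)).astype(float)  # 4 Likert items
print(round(cronbach_alpha(items), 2))

# Sample size for a paired t-test: large effect (d = 0.8), 80% power.
n = TTestPower().solve_power(effect_size=0.8, alpha=0.05, power=0.8)
print(round(n))  # ~15, matching the calculation reported above

# ANCOVA via OLS: quiz score ~ group, adjusting for the two covariates.
df = pd.DataFrame({
    "score": np.concatenate([emodule_q1, wiki_q1]),
    "group": ["eModule"] * 16 + ["Wiki"] * 16,
    "device": rng.choice(["laptop", "phone"], 32),
    "experience": rng.choice(["UG", "PG"], 32),
})
model = smf.ols("score ~ C(group) + C(device) + C(experience)", df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F-test for the group effect
```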

Ethics

Ethical approval for this study was registered by King's College London as minimal risk (MRS-18/19–8122). Participation was voluntary, and none of the researchers were in a position of power over, or involved in the medical education of, any participant. Informed consent was obtained through email from all participants. All data, including the quiz results, were shared only within the research team and not used for any purpose other than this study.

Results

Thirty-two medical students participated in the study, with 16 students in the eModule group and 16 in the Wiki group. Twelve volunteers were excluded for not being medical students. See Fig. 3 for the participant flow diagram. The electronic device used did not significantly differ between the two groups (p = 0.5), nor did previous neuroscience exposure (p = 1), as detailed in Table 1. The eModule group spent a mean of 26 min (95% CI 16 to 36) on their activity compared to 17 min (95% CI 13 to 22) in the Wiki group, although this difference was non-significant (p = 0.1).

Fig. 3 Flow diagram of the participants enrolled in the study

Table 1 Characteristics of the eModule and Wiki group

In the eModule group, most participants found that the clinical case studies and interactivity aided their learning, and a smaller majority found the graphics, content and structure useful. In the Wiki group, most participants found that the graphics aided their learning, while again a smaller majority found the content and structure useful. There was no significant difference between the eModule and Wiki groups in finding that the graphics, content and structure aided their learning (Table 2). Cronbach's alpha was 0.81 for the survey, indicating acceptable internal consistency.

Table 2 Aspects aiding learning according to participants

The eModule group found their learning activity significantly more enjoyable (p = 0.001), engaging (p = 0.0005), recommendable (p = 0.002) and useful for their studies (p = 0.01) compared to the Wiki group. These results are shown in Fig. 4.

Fig. 4 Likert-scale survey responses of the eModule and Wiki groups to the statements on enjoyment, engagement, recommendability and usefulness. Proportions are given per response option

The mean score on the first quiz was 86% (95% CI 82% to 91%) for the eModule group and 85% (95% CI 76% to 94%) for the Wiki group, with no evidence of a statistical difference (p = 0.8). Fourteen participants in each group returned the second quiz, after a mean of 59 days. The four participants who did not return the second quiz did not significantly differ in previous neuroscience experience (p = 0.6).

There was a significant mean decline in second quiz scores for both the eModule group (-16%, 95% CI -8% to -23%, p = 0.001) and the Wiki group (-17%, 95% CI -8% to -26%, p = 0.003). The difference between the mean second quiz scores of the eModule group (71%, 95% CI 61% to 80%) and the Wiki group (68%, 95% CI 60% to 77%) was non-significant (p = 0.7). The mean quiz scores are shown in Fig. 5. After adjusting for device used and previous neuroscience experience, there remained no significant effect of the learning activity on the score of the first quiz (F(1, 24) = 0.42, p = 0.5), the second quiz (F(1, 20) = 0.78, p = 0.4) or the difference between the two quizzes (F(1, 20) = 0.01, p = 0.9).

Fig. 5 Quiz scores of the eModule and Wiki groups. Mean proportional quiz scores (95% CI) and statistical differences between the eModule and Wiki groups for the first and second quiz. The first quiz was taken immediately after the learning activity; the second quiz was sent 8 weeks later

Discussion

The aim of this study was to investigate whether the perceptions and effectiveness of a neuroscience electronic learning module differed from those of a Wikipedia-style webpage. Students perceived the eModule as more engaging, useful and enjoyable, and the majority found that the interactivity and clinical cases aided their learning. In contrast, no difference was found in immediate or long-term knowledge retention between the eModule and a Wiki page with the same subject matter.

The learning experience using the eModule was more positively perceived than studying with the Wiki-style webpage. Motivation and pedagogic guidance are required for students to engage in optimal self-directed learning in medicine [33, 34], and eModules are well placed to facilitate these qualities. The structure of an eModule allows educators to focus students' learning and provide real-time feedback. Its interactivity is important in recruiting active participation, depth of information processing and cognitive engagement [35]. Enjoyment in e-learning has previously been positively associated with deep learning [12], although this does not always translate to greater participant usage and uptake [36, 37]. Barriers to engagement with CAL include misaligned expectations, overwhelming volume of content and perceptions of being a passive recipient [38]. Greater motivation through the increased enjoyment and engagement of eModules may facilitate further learning and knowledge-seeking behavior, but this overall effect is hard to assess.

In our study, the eModule group found their learning activity more engaging, useful and enjoyable, but there were no significant group-wise differences in the perception of content, graphics or structure as aids to learning. This suggests that the unique eModule features, namely clinical case studies and interactivity (the differentiating aspects between the two CAL methods), contributed the most toward a positive learning experience. Case-based teaching, which integrates theoretical neuroscience with 'real' clinical neurology, has previously been well documented to alleviate neurophobia [39]. Similarly, the interactivity of CAL through multiple-choice questions and clickable graphics has been found to correlate with satisfaction and student engagement [40, 41], and increased user engagement can be associated with increased knowledge scores [42]. Indeed, it has been argued that interactive features and the aesthetic medium of a CAL platform can facilitate the understanding of difficult neuroscience and neuroanatomical concepts [10, 24]. In this study, it is likely that the eModule's calibrated structure and use of interactivity sustained user attention, making it better suited to self-directed learning than passive resources. However, while these features were self-reported as helpful, they did not translate into higher test scores.

That similar quiz scores were achieved in both groups could be attributable to the quality of the subject content. The text and images, identical in both learning activities, may have been of sufficient value to convey the learning aims without the interactivity and case studies adding further benefit. One study, comparing three CAL modules on the anatomy and physiology of the liver and biliary system with the same content but different levels of interactivity, found that students using the most passive and least interactive medium scored higher in their test than the other groups [43]. Similarly, the addition of complex psychosocial clinical cases to web-based modules on ambulatory medicine did not increase knowledge test scores for internal medicine residents, despite residents finding the cases valuable [44]. A meta-analysis by Cook et al. [42] of five studies comparing different modes of CAL found that increased user interactivity led to longer participant engagement time [44,45,46,47,48], but this additional time was not associated with improved test scores. These studies demonstrate, at least in part, that the interactivity and medium of a learning activity have less influence on learning efficacy than expected. It is possible that student engagement with CAL does not always involve active attention to the subject matter: loading software elements, passively watching videos or clicking to progress through the modules are examples where participants might switch to 'autopilot' [49, 50]. Although participants in our study stated that the interactive elements aided their learning experience, ill-implemented elements in eModules could distract some users and offset any gained learning potential.

Despite this, some individual randomized controlled trials have identified specific forms of interactivity that were, to a certain extent, associated with improved test outcomes. For example, the addition of case-based multiple-choice questions with feedback to an internet-based module for internal medicine residents was associated with significantly longer engagement time and higher test scores (78.9% ± 1.0 vs 76.2% ± 1.0, p = 0.006) compared to the same modules without them [45]. Similarly, a neuropharmacology CAL module that used interactive assessment questions and pop-ups resulted in higher exam scores for pharmacology students than an online-accessible text document with the same content [51]. Specifically, both the duration of access and the number of distinct times the CAL module was accessed were positively associated with higher test scores. Both studies suggest the benefit of well-designed interactive features is increased engagement resulting in higher test scores. Although eModule users in this study tended to spend longer on their activity, neither this difference nor the difference in quiz scores was significant.

In general, the results of this study align more closely with a larger meta-analysis by Cook et al. [52], which found a very small and inconsistent positive effect of internet-based CAL on knowledge outcomes compared with non-internet methods, suggesting that content is more important than delivery. Further research should investigate, and describe more specifically, what type of interactivity in CAL increases meaningful engagement and knowledge outcomes.

Limitations

This study had a number of limitations. Firstly, group sizes were relatively small and two participants dropped out of each group. Although the groups were large enough to assess statistical differences in the perceptions of the learning activity and in knowledge retention within each group, the low sample size could have meant the study was underpowered to detect a small true difference in quiz scores between groups. However, participants not returning their second quiz showed no significant difference in previous neuroscience experience or device used, suggesting no bias from loss to follow-up. Secondly, the quizzes were based on the multiple-choice questions commonly used in medical written exams. These test knowledge retention but do not assess critical thinking, decision making or dimensional understanding of neuroanatomy in detail. Higher levels of learning could be tested with more comprehensive assessments. Lastly, only one time point, after eight weeks, was used to test long-term retention of knowledge. The rate of knowledge attrition may differ between the learning activities; in the future this could be assessed using additional time points.

Implications

Modern medical students and doctors are required to be self-directed, life-long learners [53]. In practice, many medical students and doctors use Wikipedia and other websites as complementary learning resources [17]. The user-generated nature of Wikipedia raises questions of accuracy [17], but its easy accessibility, user-friendliness and vast amount of content already attract the majority of students [18, 19]. With CAL, students can choose the content, time, place and pace of their learning [54]. Conversely, medical educators can use specific CAL packages to deliver standardized and accurate teaching to students and trainees across different hospital placements or even universities. This is particularly relevant in teaching clinical neuroscience, as one third of medical schools in the United Kingdom are unable to guarantee teaching from a certified neurologist [55]. Developing interactive learning modules comes with opportunity costs, including development, delivery and maintenance [54]. As this study shows that learning efficacy does not differ, medical educators should evaluate whether the costs of developing and utilizing eModules are justified. Alternatives include curating and vetting existing web-based learning resources; indeed, students and junior doctors report difficulties finding reliable websites among countless online medical resources [17, 56]. The results of this study suggest that if easy access is combined with high-quality, professionally reviewed content, a text- and graphic-based Wikipedia-style website might be as effective as more sophisticated and expensive interactive CAL modules.

Computer-assisted learning is increasingly being implemented in clinical neuroscience education [26]. The principal design aim of this eModule was to tackle neurophobia by integrating basic neuroscience and neuroanatomy with relevant interactive clinical cases. The difficulty of understanding basic neuroscience and integrating it with clinical neurology has been reported by students and doctors as a major contributor to neurophobia [57]. Indeed, a study of Irish medical students found that less than one percent reported learning the most from online resources, compared to over seventy percent from bedside tutorials [58]. At other medical schools, students found web-based multimedia a useful addition to traditional lecture-based neuroscience teaching [59, 60]. Our study demonstrates that interactive elements and a case-based approach can aid student learning. Examples of interactivity here included, but were not limited to, the drag-and-drop image of the circle of Willis and dynamic highlighting of salient findings on CT head scans, i.e. contexts where correlation between basic science, patient and clinical information is pivotal for understanding. In this way, Storyline 360 and similar interactive learning software are well placed to assist in learning clinical neuroscience. Other topics in this domain that would benefit from these features include lesions of the peripheral nervous system and spinal cord, together with their neurological sequelae and the investigations that correlate with them.

Perhaps the most successful approach, in terms of both learning efficacy and satisfaction, would be a blended model that mixes CAL with traditional neuroscience learning methods [14, 61, 62]. As this study indicates, while different forms of CAL can be equally effective, their costs and preparation differ substantially. This can help medical educators choose which method of CAL to use in their curriculum according to the resources (both financial and human) available and whether to target efficacy or student enjoyability. Further research should be conducted to gain a more in-depth understanding of (i) how current medical students view and access available forms of CAL; (ii) which specific elements of CAL they find helpful; and (iii) how these can be improved. Through qualitative research, future implementations of CAL can be developed to better fit the needs of students.

Conclusion

This study shows that interactive, virtual case-based computer-assisted learning in the form of an eModule on stroke and cerebrovascular anatomy is perceived as more engaging and useful than a Wikipedia-style webpage with matching content. There was a significant decline in knowledge retention after both learning activities; however, their effectiveness in both short- and long-term learning did not appear to differ. As the trend in medical schools continues toward e-learning, these results help in understanding where such software is best placed in curricula. They suggest that, as a teaching supplement, a webpage with similar content can be as effective as more sophisticated modules. On the other hand, more enjoyable learning modules could motivate more students to become active self-directed learners. Educators should weigh up whether these modules are cost-beneficial.

Availability of data and materials

The dataset used and analysed during the current study is available from the corresponding author on reasonable request.

Abbreviations

CAL:

Computer-assisted learning

eModule(s):

Electronic module(s)

References

  1. Butterfield A, Gerard EN, Kerr A. A Dictionary of Computer Science. 7th ed. Oxford, United Kingdom: Oxford University Press; 2016.

  2. Greenhalgh T. Computer assisted learning in undergraduate medical education. BMJ. 2001;322(7277):40–4.

  3. Shaikh F, Inayat F, Awan O, Santos MD, Choudhry AM, Waheed A, et al. Computer-Assisted Learning Applications in Health Educational Informatics: A Review. Cureus. 2017;9(8):e1559.

  4. John LJ. A review of computer assisted learning in medical undergraduates. J Pharmacol Pharmacother. 2013;4(2):86–90.

  5. Baby L, Kavalakkat J, Abraham S, Sathianarayanan S. CAL: A modern tool for Pharmacology. Internet J of Medical Simulation. 2009;2(2):e160921190441.

  6. Jaffe CC, Lynch PJ. Computer-aided instruction in radiology: opportunities for more effective learning. AJR Am J Roentgenol. 1995;164(2):463–7.

  7. Amesse LS, Callendar E, Pfaff-Amesse T, Duke J, Herbert WN. Evaluation of computer-aided strategies for teaching medical students prenatal ultrasound diagnostic skills. Med Educ Online. 2008;13(1):4482.

  8. Haq I, Dacre J. Computer-assisted learning in undergraduate and postgraduate rheumatology education. Rheumatology. 2003;42(2):367–70.

  9. Gorman PJ, Meier AH, Krummel TM. Computer-assisted training and learning in surgery. Comput Aided Surg. 2000;5(2):120–30.

  10. Svirko E, Mellanby J. Teaching neuroanatomy using computer-aided learning: What makes for successful outcomes? Anat Sci Educ. 2017;10(6):560–9.

  11. McKeough DM, Mattern-Baxter K, Barakatt E. Effectiveness of a computer-aided neuroanatomy program for entry-level physical therapy students: anatomy and clinical examination of the dorsal column–medial lemniscal system. J Allied Health. 2010;39(3):156–64.

  12. Svirko E, Mellanby J. Attitudes to e-learning, learning style and achievement in learning neuroanatomy by medical students. Med Teach. 2008;30(9–10):e219–27.

  13. Berman NB, Fall LH, Maloney CG, Levine DA. Computer-Assisted Instruction in Clinical Education: a Roadmap to Increasing CAI Implementation. Can J Neurol Sci. 2008;13(3):373–83.

  14. Lewis EC, Strike M, Doja A, Ni A, Weber J, Wiper-Bergeron N, et al. Web-based software to assist in the localization of neuroanatomical lesions. Can J Neurol Sci. 2011;38(2):251–5.

  15. Weverling GJ, Stam J, ten Cate TJ, van Crevel H. Computer-assisted education in problem-solving in neurology; a randomized educational study. Ned Tijdschr Geneeskd. 1996;140(8):440–3.

  16. Elizondo-Omaña RE, Morales-Gómez JA, Guzmán SL, Hernández IL, Ibarra RP, Vilchez FC. Traditional teaching supported by computer-assisted learning for macroscopic anatomy. Anat Rec B New Anat. 2004;278(1):18–22.

  17. Hughes B, Joshi I, Lemonde H, Wareham J. Junior physician’s use of Web 2.0 for information seeking and medical education: a qualitative study. Int J Med Inf. 2009;78(10):645–55.

  18. Allahwala UK, Nadkarni A, Sebaratnam DF. Wikipedia use amongst medical students–new insights into the digital revolution. Med Teach. 2013;35(4):337.

  19. Back DA, Behringer F, Haberstroh N, Ehlers JP, Sostmann K, Peters H. Learning management system and e-learning tools: an experience of medical students’ usage and expectations. Int J Med Educ. 2016;20(7):267–73.

  20. Jozefowicz RF. Neurophobia: the fear of neurology among medical students. Arch Neurol. 1994;51(4):328–9.

  21. Burford C, Alexander E, Sloper W, Huett M. Factors influencing interest in the brain-related sciences in a UK cohort. J Neurol Sci. 2017;15(377):77–8.

  22. Pakpoor J, Handel AE, Disanto G, Davenport RJ, Giovannoni G, Ramagopalan SV. National survey of UK medical students on the perception of neurology. BMC Med Educ. 2014;14(1):225.

  23. Javaid MA, Chakraborty S, Cryan JF, Schellekens H, Toulouse A. Understanding neurophobia: Reasons behind impaired understanding and learning of neuroanatomy in cross-disciplinary healthcare students. Anat Sci Educ. 2018;11(1):81–93.

  24. Pani JR, Chariker JH, Naaz F, Mattingly W, Roberts J, Sephton SE. Learning with interactive computer graphics in the undergraduate neuroscience classroom. Adv Health Sci Educ Theory Pract. 2014;19(4):507–28.

  25. McColgan P, McKeown P, Selai C, Doherty-Allan R, McCarron M. Educational interventions in neurology: a comprehensive systematic review. Eur J Neurol. 2013;20(7):1006–16.

  26. Chhetri SK. E-learning in neurology education: Principles, opportunities and challenges in combating neurophobia. J Clin Neurosci. 2017;2017(44):80–3.

  27. Burford C, Guni A, Rajan K, Hanrahan J, Armitage M, Driscoll A, et al. Designing undergraduate neurosurgical e-learning: medical students’ perspective. Br J Neurosurg. 2019;33(1):79.

  28. Articulate Global Inc. Academic Pricing. 2020; Available at: https://articulate.com/pricing/academic. Accessed 09 Sept 2020.

  29. Rhcastilhos. Circle of Willis. 2014; Available at: https://en.wikipedia.org/wiki/File:Circle_of_Willis_en.svg. Accessed 17 Dec 2021.

  30. Anatomist90. Circle of Willis. 2011; Available at: https://commons.wikimedia.org/wiki/File:Circle_of_Willis_5.jpg Licence:https://creativecommons.org/licenses/by-sa/3.0/deed.en. Accessed 17 Dec 2021.

  31. Bland JM, Altman DG. Cronbach’s alpha. BMJ. 1997;314(7080):572–572.

  32. Statistics Kingdom. Normal, T - Sample size calculator. 2017; Available at: https://www.statskingdom.com/sample_size_t_z.html. Accessed 25 Feb 2022.

  33. Fox RD, Harvill LM. Self-assessments of need, relevance and motivation to learn as indicators of participation in continuing medical education. Med Educ. 1984;18(4):275–81.

  34. Dornan T, Hadfield J, Brown M, Boshuizen H, Scherpbier A. How can medical students learn in a self-directed way in the clinical environment? Design-based research. Med Educ. 2005;39(4):356–64.

  35. Fredricks JA, Filsecker M, Lawson MA. Student engagement, context, and adjustment: Addressing definitional, measurement, and methodological issues. Learn Instr. 2016;2016(43):1–4.

  36. Caris MG, Sikkens JJ, Kusurkar RA, van Agtmael MA. E-learning on antibiotic prescribing—the role of autonomous motivation in participation: a prospective cohort study. J Antimicrob Chemother. 2018;73(8):2247–51.

  37. Paul C, Rose S, Hensley M, Pretto J, Hardy M, Henskens F, et al. Examining uptake of online education on obstructive sleep apnoea in general practitioners: a randomised trial. BMC Res Notes. 2016;9(1):350.

  38. Reid HJ, Thomson C, McGlade KJ. Content and discontent: a qualitative exploration of obstacles to elearning engagement in medical students. BMC Med Educ. 2016;16(1):188.

  39. Hudson JN. Linking neuroscience theory to practice to help overcome student fear of neurology. Med Teach. 2006;28(7):651–3.

  40. Kanthan R, Senger J. The impact of specially designed digital games-based learning in undergraduate pathology and medical education. Arch Pathol Lab Med. 2011;135(1):135–42.

  41. Wong G, Greenhalgh T, Pawson R. Internet-based medical education: a realist review of what works, for whom and in what circumstances. BMC Med Educ. 2010;10(1):12.

  42. Cook DA, Levinson AJ, Garside S. Time and learning efficiency in Internet-based learning: a systematic review and meta-analysis. Adv Health Sci Educ Theory Pract. 2010;15(5):755–70.

  43. Devitt P, Palmer E. Computer-aided learning: an overvalued educational resource? Med Educ. 1999;33(2):136–9.

  44. Cook DA, Beckman TJ, Thomas KG, Thompson WG. Introducing resident doctors to complexity in ambulatory medicine. Med Educ. 2008;42(8):838–48.

  45. Cook DA, Thompson WG, Thomas KG, Thomas MR, Pankratz VS. Impact of self-assessment questions and learning styles in web-based learning: a randomized, controlled, crossover trial. Acad Med. 2006;81(3):231–8.

  46. Kopp V, Stark R, Fischer MR. Fostering diagnostic knowledge through computer-supported, case-based worked examples: effects of erroneous examples and feedback. Med Educ. 2008;42(8):823–9.

  47. Friedl R, Höppler H, Ecard K, Scholz W, Hannekum A, Oechsner W, et al. Comparative Evaluation of Multimedia Driven, Interactive, and Case-Based Teaching in Heart Surgery. Ann Thorac Surg. 2006;82(5):1790–5.

  48. Mattheos N, Nattestad A, Christersson C, Jansson H, Attström R. The effects of an interactive software application on the self-assessment ability of dental students. Eur J Dent Educ. 2004;8(3):97–104.

  49. Reich J. Rebooting MOOC Research. Science. 2015;347(6217):34.

  50. Cotton DRE, Gresty KA. The rhetoric and reality of e-learning: using the think-aloud method to evaluate an online resource. Assess Eval High Educ. 2007;32(5):583–600.

  51. McLaughlin JE, Rhoney DH. Comparison of an interactive e-learning preparatory tool and a conventional downloadable handout used within a flipped neurologic pharmacotherapy lecture. Curr Pharm Teach Learn. 2015;7(1):12–9.

  52. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA. 2008;300(10):1181–96.

  53. Kurup V. The New Learners—Millennials!! Int Anesthesiol Clin. 2010;48(3):13–25.

  54. Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. 2006;81(3):207–12.

  55. Sharma K, Athauda D, Robbins E. A survey of undergraduate teaching of clinical neurology in the United Kingdom 2012. J Neurol Neurosurg Psychiatry. 2013;84(11):e2.

  56. DiCarlo SE. Too much content, not enough thinking, and too little FUN! Adv Physiol Educ. 2009;33(4):257–64.

  57. Schon F, Hart P, Fernandez C. Is clinical neurology really so difficult? J Neurol Neurosurg Psychiatry. 2002;72(5):557.

  58. Flanagan E, Walsh C, Tubridy N. ‘Neurophobia’– attitudes of medical students and doctors in Ireland to neurological teaching. Eur J Neurol. 2007;14(10):1109–12.

  59. Brueckner JK, Traurig H. Students’ responses to the introduction of a digital laboratory guide in medical neuroscience. Med Teach. 2003;25(6):643–8.

  60. Marker DR, Juluru K, Long C, Magid D. Strategic improvements for gross anatomy web-based teaching. Anat Res Int. 2012;2012:146262.

  61. Bye AME, Connolly AM, Farrar M, Lawson JA, Lonergan A. Teaching paediatric epilepsy to medical students: A randomised crossover trial. J Paediatr Child Health. 2009;45(12):727–30.

  62. Lim ECH, Ong BKC, Seet RCS. Using videotaped vignettes to teach medical students to perform the neurologic examination. J Gen Intern Med. 2006;21(1):101–101.


Acknowledgements

The authors wish to thank Dr Ahmad Guni, Dr Annabel Driscoll and Dr Charlotte Burford for their assistance in the development of the electronic module.

Funding

This work was supported by King’s Health Partners, London, United Kingdom. The funding source for this study did not contribute to the design, data collection or analysis of the study. The manuscript was not reviewed by the funding source.

Author information


Contributions

KR developed the webpage learning activity and conducted the data collection. KR and AP designed the study, developed the interactive learning activity, analysed the data and wrote the manuscript. KR and AP contributed to the discussion and approved the manuscript.

Authors’ information

KR is a current Academic Foundation Programme doctor at the Bristol Royal Infirmary in Bristol, United Kingdom, and a graduate from King’s College London. His research interests are medical education, neuroscience and critical care. AP is a current Academic Clinical Fellow and Specialist Registrar in Neurosurgery in the North Thames Deanery, London, United Kingdom, with research interests in neuroimaging, bioinformatics and surgical education.

Corresponding author

Correspondence to Kiran Kasper Rajan.

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethics approval and consent to participate

Ethical approval for this study was registered by King’s College London as minimal risk (MRS-18/19–8122). All methods were carried out in accordance with relevant guidelines and regulations. Participation was voluntary, and none of the researchers were in a position of power over, or involved in the medical education of, any participant. Informed consent was obtained through email from all participants. All data, including the quiz results, were shared only within the research team and not used for any purpose other than this study.

Consent for publication

Not applicable.


Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1. Weblinks to the eModule and Wikipedia-like page. Screenshots of the eModule.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Rajan, K.K., Pandit, A.S. Comparing computer-assisted learning activities for learning clinical neuroscience: a randomized control trial. BMC Med Educ 22, 522 (2022). https://doi.org/10.1186/s12909-022-03578-2
