
Online training in evidence-based medicine and research methods for GP registrars: a mixed-methods evaluation of engagement and impact



Evidence-based medicine (EBM) is a core skillset for enhancing the quality and safety of patient care. Online EBM education could improve clinicians’ skills in EBM, particularly when it is conducted during vocational training. There are limited studies on the impact of online EBM training on clinical practice among general practitioner (GP) registrars (trainees in specialist general practice). We aimed to describe and evaluate the acceptability, utility, satisfaction and applicability of the online course from the GP registrars’ perspective. The course was developed by content experts working with educational designers to incorporate effective teaching methods (e.g. it was interactive and used multiple teaching methods).


Mixed-method data collection was conducted after individual registrars’ completion of the course. The course comprised six modules that aimed to increase knowledge of research methods and application of EBM skills to everyday practice. GP registrars who completed the online course during 2016–2020 were invited to complete an online survey about their experience and satisfaction with the course. Those who completed the course within the six months prior to data collection were invited to participate in semi-structured phone interviews about their experience with the course and the impact of the course on clinical practice. A thematic analysis approach was used to analyse the data from qualitative interviews.


The data showed the registrars were generally positive towards the course and the concept of EBM. They stated that the course improved their confidence, knowledge, and skills and consequently impacted their practice. Registrars perceived that the course increased their understanding of EBM, with a very large effect size (Cohen’s d = 1.6). Registrars identified factors that influenced the impact of the course. Of those, some were GP-related, including their perception of EBM and being comfortable with what they had already learnt; some were work-place related, such as time, the influence of supervisors and access to resources; and one was related to patient preferences.


This study showed that GP registrars who attended the online course reported that it improved their knowledge, confidence, skills and practice of EBM over a period of three months. The study highlights the supervisor’s role in GP registrars’ ability to translate the EBM skills learnt into practice, and suggests exploring the effect of EBM training for supervisors.

Peer Review reports


Evidence-based medicine (EBM) is the integration of best available clinical evidence with clinical expertise and patient values to inform a clinical decision [1]. It is recognised as a core skillset for improving the quality and safety of health care [2]. This is acknowledged by the inclusion of EBM competencies in the curriculum for general practice education throughout the training continuum - from undergraduate learners to continuing professional development programs for established general practitioners (GPs) [3, 4].

Despite some evidence that GPs generally acknowledge the importance of EBM [5], it is not always practiced in routine care [6]. For instance, Australian GPs and GP registrars (trainees in specialist general practice) have been observed to prescribe antibiotics for respiratory tract infections in a non-evidence-based manner [7, 8].

There are a range of influences on the use of EBM in practice, many of which are shared between GPs and GP registrars. Some of these barriers are at the GP level, such as doubt about the applicability of research evidence to practice [9, 10] or a lack of skills in finding, appraising and applying evidence [5, 6, 11,12,13]. Some are workplace-related, such as the influence of previous practice or peers [9] and resource constraints (e.g. time pressure, access to resources such as reliable internet or subscriptions to sources) [5, 6, 11,12,13,14]. Patient factors also influence clinical practice. While patient-centred care is an important feature of high-quality care [15], navigating between EBM and patient preferences and beliefs for treatments that are unsupported by evidence can be challenging [5, 6, 11,12,13]. In addition, there are some barriers specific to GP registrars: for example, GP registrars often seek answers for their clinical questions, especially for more complex ones [16], by consulting supervisors and colleagues [17]. This could be a barrier or a facilitator to the practice of EBM, depending upon the supervisor [18, 19]. Some studies also reported that the atmosphere established by supervisors (authoritarian vs collaborative) could be a barrier for GP registrars [20].

Medical education has a major role in preparing a workforce skilled in EBM, but its translation from learning to practice is challenging [21, 22]. Specific EBM teaching strategies such as clinically integrated training, problem-based learning, and e-learning can improve knowledge, attitudes, and skills in undergraduate medical students [21]. However, there is limited evidence that these strategies directly influence clinical practice [21].

Internet-based learning activities offer a range of advantages, such as ease of access from various settings and for a large audience [23]. Online training can be efficient compared to traditional face-to-face methods: it allows distance learning when local training opportunities and resources are limited; provides convenient and flexible learning; and reduces travel time and expenses for learners [24]. It can be beneficial and time-efficient when trainees have diverse ranges of background knowledge and customised course content is needed [25]. These features make online training a pragmatic method for training medical professionals in full-time clinical practice. A recent review of 14 studies concluded that online training can be as effective as alternative methods for increasing knowledge and improving clinical practice of medical professionals [23].

While online learning has potential benefits, it must engage and empower learners for the learning to be applied in the real world. Boettcher’s review of pedagogical theory and research identified the following core principles for effective teaching (traditional and technology-enhanced): teaching should be interactive, customised to learners’ background knowledge, deliver information in organised chunks, and provide opportunities for learners to write, explain and analyse [26]. EBM educational interventions that use a variety of learning methods (e.g. video, written materials) are more likely to have a significant effect on learning EBM than those that use a single method [27, 28]. More broadly, training should be relevant to practice [29], case-based [30, 31] and implemented when learners are exposed to clinical cases in which to practise the skills learnt [29].

While there is abundant evidence for the efficacy of online learning, the teaching of EBM is a distinctive area. There is a good deal of theoretical understanding to be imparted, but also a complex application of that theory to practical use (our course was designed to equip and encourage our registrars to apply EBM techniques to their clinical practice). There is a modest amount of evidence for online EBM courses [21, 32,33,34,35]. We are not aware of studies of online EBM training for GPs. Teaching EBM to GPs and GP vocational trainees takes place in a unique and challenging environment. GPs must function with much greater diagnostic uncertainty than in medical specialty practice, which creates particular difficulties in applying evidence to practice. Also, much of the evidence on which GPs must rely in practising EBM has been derived in non-GP populations (specialist or hospital practice); interpreting such evidence and applying it to general practice adds another layer of difficulty to teaching EBM to GPs and GP registrars. Finally, there is still a sizable minority of GP registrars who are indifferent, if not resistant, to EBM. A preference for personal experience or the opinion of superiors over research-derived evidence is quite common and may be fostered in some environments within the apprenticeship-model training system of Australian GPs. Our course addressed these issues: the importance and relevance of practising EBM in general practice, how EBM can be practised in an environment of heightened diagnostic uncertainty, and how evidence should be critically appraised for its relevance to a particular general practice patient.

The study objectives

This research explored GP registrars’ experience with an online interactive training course that aimed to increase their knowledge of research methods and, ultimately, to produce practitioners who can use evidence in practice. The findings of this study can be used not only to improve the current training course but also to inform the development of similar future courses by identifying its efficacy, strengths and limitations. The study’s research questions were:

  1. What were the GP registrars’ experiences with the course (including how engaged they were with the course)?

  2. What were the impacts of the course on GP registrars’ attitudes, knowledge, skills and clinical practice?

  3. What factors influenced the impact of the course on the GP registrars’ clinical practice?



The evaluation was a one-group, post-test-only study conducted after completion of the course, using a triangulation mixed-methods design.

Recruitment and sample

Participants were GP registrars who completed the online course. GP registrars are required to complete the course by the end of term two of their training. GP registrars who completed the course during Nov 2016–Feb 2020 (n = 1142) were invited to fill in an online survey via a link within the course.

All GP registrars who completed the course within the six months prior to data collection (2019–2020) were invited to participate in semi-structured interviews. Registrars were invited to express interest in being interviewed for the study via:

  a. An invitation within the online survey used for course satisfaction data collection.

  b. An invitation from GP Synergy, sent to registrars on completion of the course and to registrars who had completed the course since 1st January 2019.

  c. An invitation within the online course.

Sampling was purposive, based on potential participants’ characteristics of medical training (Australia vs overseas) and GP training pathway (general vs rural).


An online education course (“the course”) was designed and built for GP registrars to complete as a compulsory component of their vocational specialist training. This was commissioned by GP Synergy Ltd., the government-funded vocational GP regional training organisation in New South Wales and the Australian Capital Territory, Australia. The reason for developing the course was that GP Synergy needed a scalable method of teaching GP registrars (annual intake of 500 registrars) spread over the whole of New South Wales and the Australian Capital Territory (an area of over 800,000 km²).

The course was developed by content and education experts from UNSW Australia (including academic GPs) and an interactive software developer (Smart Sparrow Pty Ltd) to ensure modules were educationally sound.

The course was designed to provide a grounding in research (quantitative and qualitative), epidemiology and critical evaluation, and in how evidence from these sources is incorporated into clinical practice: that is, the course sought to present the knowledge required to practise as an evidence-based practitioner. The overarching aim of the course was “to have registrars using evidence and practising EBM in their everyday practice” [36]. It comprised six modules: critical thinking and ethical principles in human research; critical appraisal skills; exploring the evidence on a clinical question, including systematic review and meta-analysis; clinical epidemiology; quantitative research methods; and qualitative research methods. Each module was divided into three-to-six lessons with lesson-specific learning objectives (see Additional file 1, Table S1). The course aimed to enable participants to:

  1. identify links between research and practice

  2. apply the research findings to examples from practice

  3. describe the principles of research design

  4. participate in research.

The modules included text with visual learning methods such as graphs and images, video lectures, links to publicly accessible video resources, interactive exercises (e.g. multiple choice), clinical scenarios and further reading. To relate the course content to GP registrars’ daily practice, clinical scenarios were included in the lessons. Each module was expected to take about an hour to complete, not counting additional reading. Fig. S1 illustrates examples of the course screens, showing how interactivity and engaging elements were delivered.

Instruments and data collection

The online survey asked participants to indicate the degree to which they agreed with 11 statements that described their experience with the course using a 6-point rating scale; rate their understanding of the topic before the course and after completing the course on a scale from 0 to 10; and answer two open-ended questions: 1) what they liked most about the course and 2) what they would like to see changed. Participants self-completed the online survey via the link provided in the course.

The semi-structured interview schedule (Table 1) contained two sections:

  1. Experience with the course

  2. Impact of the course on clinical practice.

Table 1 Interview questions

Interviews were conducted by telephone in one or two sessions, depending on the course completion time:

  1. Two sessions: if the registrar had completed the course less than one month before recruitment, the first interview was restricted to section one of the interview schedule. A second interview, covering section two of the interview schedule, was conducted three months later.

  2. One session: if the registrar had completed the course at least three months before recruitment, both sections of the interview schedule were completed in the one interview.

The timing was designed to allow participants the opportunity to apply the knowledge and skills gained during the course before answering the questions in section two about how the course influenced clinical practice.

The interviews were conducted by a trained Research Officer [HR-A] who had not been involved in the course development. The average times to complete the interviews were 23 min (first interviews) and 19 min (second interviews).

Data analysis

Descriptive statistics were used to present quantitative data from the online survey. The difference between the self-reported level of understanding of the topic before and after the course was tested using a paired t-test, and the effect size was estimated using Cohen’s d. Data analysis was performed using SPSS (version 23). The answers to open-ended questions were coded and summarised. These were then discussed between authors [CS, HR-A] and grouped into broader categories and sub-categories based on their similarities and differences.
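The paired comparison described above can be sketched in a few lines. Note that the data below are hypothetical illustrative ratings (not the study's data), and this sketch computes Cohen's d as the mean difference divided by the standard deviation of the paired differences (sometimes called d_z); the paper does not report which variant of Cohen's d was used, so this is an illustrative sketch rather than the authors' exact procedure.

```python
import math
from statistics import mean, stdev

def paired_analysis(before, after):
    """Paired t statistic and Cohen's d (d_z variant) for pre/post ratings."""
    assert len(before) == len(after) and len(before) > 1
    diffs = [a - b for a, b in zip(after, before)]  # post minus pre
    n = len(diffs)
    mean_diff = mean(diffs)
    sd_diff = stdev(diffs)                     # sample SD of paired differences
    t = mean_diff / (sd_diff / math.sqrt(n))   # paired t statistic, df = n - 1
    cohens_d = mean_diff / sd_diff             # effect size for paired data
    return t, cohens_d

# Hypothetical 0-10 "understanding of the topic" ratings for eight registrars
before = [4, 5, 4, 3, 5, 4, 6, 4]
after = [7, 8, 7, 6, 7, 7, 8, 7]
t_stat, d = paired_analysis(before, after)
```

For a paired design the t statistic is simply d_z scaled by the square root of the sample size, which is why even a moderate per-person gain yields a large t when ratings move consistently in one direction.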

Data collection and analysis were concurrent and iterative for interviews. The interviews were transcribed verbatim and coded using the software program NVivo (Version 12 Pro). A thematic analysis approach was used [37] to analyse the data from the qualitative interviews. The initial few transcripts of both the first and second interviews were open coded by pairs of researchers, and the codes were compared and discussed among five authors [CS, MH, HR-A, CT, PM]. Codes were then updated based on the discussion, and the rest of the transcripts were coded by one of the authors [HR-A]. At all steps of coding, the analyst constantly refined the earlier codes in the light of newly emerging codes. When all data were coded, the generated themes and codes were again discussed among all authors until consensus was reached. We aimed to achieve thematic saturation, but data collection ceased before saturation was complete because of COVID-19: as GPs, participants’ capacity to find free time for an interview was affected by the high demand for health professionals, as well as the stress and upheaval of the education program moving suddenly online and their practices’ structural responses to the pandemic. There was still good saturation in most of the themes.

To ensure trustworthiness, an audit trail of all the steps and decisions made during the research process was recorded [38]. The researcher triangulation method [38] was used, involving multiple analysts to bring different perspectives into the findings. Verbatim quotations from various participants are provided to support the study findings. Considering reflexivity, most of the authors (NZ, MH, CT, PM, TL) were academic GPs in current clinical practice, and all authors except one (HR-A) were experienced in activities relevant to teaching EBM to GPs. All authors had prior experience in analysing qualitative data. The main analyst [HR-A] kept a reflexivity journal during analysis, which was discussed with all authors.

When data analyses were completed, qualitative and quantitative findings were synthesised to answer the study questions. Codes were reviewed for patterns, then further revised and developed into themes to answer the research questions. The final themes are presented in Fig. 3 in the results section.



Of the 1142 GP registrars who were invited to take part in the online survey, 391 (34%) completed the survey between Nov 2016 and Feb 2020. Twelve GP registrars completed interviews. Details of the recruitment process and data collection are provided in Fig. 1.

Fig. 1
figure 1

Recruitment and interview process

The interview sample comprised nine women and five men; two registrars were enrolled in the rural pathway, and only one registrar had completed medical training overseas. The interviewees had enrolled in the course during the second or final year of their three-year (full-time equivalent) program. Participants reported spending an average of nine hours completing the whole course.

Results are presented to answer each research question.

RQ1. Participant experience with the course

Registrars’ responses to 11 questions on satisfaction with the course are presented in Fig. 2.

Fig. 2
figure 2

Participants’ ratings of the course on a 6-point rating scale (n = 391)

Data from the online survey and the interviews indicated that participants’ course experience was generally positive. As illustrated in Table 2, the triangulated methods provided some unique and some common feedback.

Table 2 Participant experience feedback from online survey (n = 391) and interviews (n = 14)

The responses to online questions on registrar satisfaction were mostly in the upper half of the rating scale, suggesting general satisfaction (Fig. 2). Consistent with the qualitative data, the participants highly rated the ease of navigation (Q.6), the flexibility of the online delivery (Q.5) and the feedback (Q.11).

In contrast to the positive responses to Q.6, the open-ended question about what participants would like changed (Table S2) identified that many participants had problems with technical issues, and some thought the interface could have been more user friendly.

Data from the open-ended questions (Additional File 1, Tables S2 and S3) and interviews (Table 2) provided additional feedback on the course content: that it was easy to understand, provided relevant examples, it was relevant to practice, and it provided useful resources for future use.

RQ2. Impacts of the course on attitudes, knowledge, skills and clinical practice

Data from the survey and the interviews indicated that the participants thought the course increased their confidence, knowledge and skills in EBM. Interview data (presented below) suggested the course changed participants’ clinical practice to better incorporate EBM.


Most participants said they were interested only in the application of research to practice. Only a few participants said that their interest in doing or participating in research was influenced by taking the course.

Participants mostly found themselves more confident in understanding, interpreting and appraising research evidence after completing the course.

I feel more confident, and I think that I would be able to if I needed to look through a study and make some comments [#08-female].

However, they acknowledged that they might not be able to conduct research, or that there was room for improvement.

I probably feel about the same as I did before. I don’t know that I feel any more or less confident [#10-female].


An increase in knowledge was identified in the interviews and the online survey. Based on the survey findings, participants’ self-reported understanding of the topic substantially improved, from a mean of 4.4 (out of 10) ± 0.1 (SD) before the course to 7.2 ± 0.1 after completing the course (n = 320, p < 0.0001), which represents a very large effect size (d = 1.6). Similar findings emerged from the interviews.

It’s a good idea I think to have an understanding of research when we’re going out into clinical practice [#05-female].

It [the course] makes me more aware to incorporate research into my practice [#07-male].

In addition, participants acknowledged that the course led them to start questioning practice in which clinicians follow others’ experience or opinions without thinking critically.

The way it’s [the course] influenced my management is just always being aware that just because it’s been done as usual practice doesn’t necessarily mean like it’s evidence-based and doesn’t mean that it’s necessarily proven to be effective [#01-male].

One participant highlighted an improvement in her understanding of the importance of contextual and environmental factors, including patient preferences in applying evidence.

It’s not just whether or not they’re sick, it’s also about how they approach health providers, … So, I just keep it in the back of my mind that we need to be aware of the other socioeconomic and environmental factors which a lot of that qualitative research helps us consider as well [#14-female].


Some participants said they were now able to critique research evidence and interpret research studies.

I think it’s just helpful to …being able to interpret and synthesise how that can apply [research] to your clinical practice [#03-female].

Some participants reported that they learnt how to frame their clinical questions and find an evidence-based answer for it.

Instead of thinking that you don’t know something, and letting that overwhelm you, you become a little bit better at devising a clinical question and knowing where to answer is [#13_2-male].

Clinical practice

Some participants reported that the skills they learnt had led to some changes in their practice. They used the skills learnt and research evidence to investigate answers to their clinical questions, particularly when dealing with uncertainty, such as a complex case or a patient asking about a treatment they had not heard of. They acknowledged using skills such as critical appraisal, particularly the levels of evidence [#3,4,7,14], to interpret study findings, findings that drug representatives presented to them, and guidelines.

Looking at evidence and working out how reliable this is, and then using that to, you know, guide treatment or, you know, not needing to always rely on guidelines to make a better judgment, depending on the clinical situation after reviewing what evidence is available [#08-female].

A few participants indicated that the skills they learnt in the course reaffirmed the importance of communication skills with patients. Their improved understanding of research findings, and their ability to explain the difference between high-quality and low-quality evidence, reportedly improved their communication with patients, particularly when a patient had a query regarding different treatments and medications.

I do bring it back to research and how just because one thing works for someone doesn’t mean that it works with you and just break it down that way, and also talking like, high-quality studies versus poor quality as evidence [03-female].

RQ3: Mediators of impacts (reported barriers and facilitators)

Responses from the qualitative interviews indicated that barriers and facilitators to practising EBM related to the GP (GPs’ perceptions of EBM, comfort and priority); the work-place (time, the influence of supervisors, the impact of the system and access to resources); and patients (treatment expectations differing from the evidence).

GP factors

Most of the participants had positive attitudes towards EBM and acknowledged the value of EBM.

Obviously, we have to practice evidence-based medicine, so in order to do so, we need to be able to understand and interpret and incorporate research into our practice [#07-male].

Almost all participants reported they needed to seek information to inform their decisions on a daily basis. They reported that they tried to choose EBM resources that were recent and relevant to the Australian context and had confidence in the quality of the information.

All of them [the guidelines that I use] are sort of peer-reviewed and accepted by the wider community as factually correct [#11_2-male].

A few participants expressed a different opinion and described research evidence as neither relevant nor transferable into clinical practice.

I think they’re [research evidence] just answering questions that are quite different from the questions that we get in general practice [#06-female].

However, a few participants described an approach based upon trust in the credentials of the source of information:

I do feel if it’s [a research findings] published in a reputable source, I tend to leave it without thinking too critically… I trust my supervisor and feel that they are quite competent [#05-female].

One reported barrier was that, although participants were willing to change, they felt more comfortable with what they had already learnt and become accustomed to than with using new evidence-based resources.

Often you’re introduced to something like an UpToDate [a resource for supporting clinical decision] quite early, so you get good at searching and using it, that you know what sort of services are on it [#13_2-male].

Work-place factors

Time was identified by almost all participants as one of the main barriers to accessing and using research evidence in practice. The time-consuming nature of using research evidence was attributed to the way it was accessed (for example, an initial need to log in to the webpages of some organisations, such as the RACGP), the overwhelming amount of information identified by searches, and the time needed to critically appraise the findings.

If I were to go through Cochrane and look up a whole bunch of different articles which would take a lot longer [#05-female].

For almost all registrars, pre-appraised resources that are brief and ready to use, such as guidelines, were preferred over primary research evidence.

I would tend to use resources that are incorporated study findings into a summary like eTG [online Therapeutic Guidelines], I don’t read the specific articles and therefore analyse the data’ [#5-female].

Participants’ responses indicated that using research evidence might not be their priority; work and exams were specified as activities that they prioritised.

Because I do have exams coming up I haven’t been able to do – look into research papers [#11_2-male].

Participants reported that supervisors could be role models, influencing GP registrars’ beliefs about evidence-based medicine by encouraging and guiding GP registrars to practise it. Participants who said their supervisors encouraged them to use research evidence had a stronger belief than other participants in the applicability of research to practice.

Well, I think evidence-based research should inform good clinical practice, and it should always be the starting point for good management… Well, my supervisors have mostly been very evidence-based as well, so I’ve actually just learnt a lot from how they appraise studies [#03-female].

Some participants also indicated that some supervisors expect GP registrars to follow their advice and treatment approach without critically evaluating the relevance of the advice to the clinical situation.

[one of my supervisors is a] real old-school doctor, right, so they are less likely to change or read the literature or – this is the way we’ve always done it, so that’s the way they’ll always do it. So that’s the way they want me to do it as well. Which may not be – it’s not dangerous, but it may not be an optimal solution [#11_2-male].

Patient factors

Patient expectations of particular treatment options that were not supported by evidence were reported to be a barrier to practising EBM.

That’s mainly patient preference, so potentially they don’t want to go with the evidence-based therapy [#04-male].


Figure 3 illustrates the main themes identified to answer the study questions and their interactions. The interactions were interpreted from the qualitative data, considering whether the specified changes and influences were related to the course. In summary, participants had a positive experience of the course and stated that taking it improved their confidence, knowledge and skills.

Fig. 3
figure 3

Participants’ experience with the course, perceived course impacts and factors that influenced the outcomes

Participants specified that they used the skills learnt in the course for interpreting evidence and investigating their clinical questions, and that the course led to better communication with patients in their clinical practice. The factors that influenced the course’s impact related to the GP, the work-place, and patients.


This study evaluated GP registrars’ experience with an online training course that aimed to increase their knowledge of research methods and the application of critical appraisal. The course, a compulsory component of their vocational specialist training, was developed by content and educational-design experts. The data showed the registrars were generally positive towards the course and the concept of EBM. They stated that the course improved their confidence, knowledge and skills, and consequently how they practise. In particular, participants specified that they used the skills learnt in the course for investigating their clinical questions, interpreting evidence, and communicating with patients in clinical practice. Self-reported understanding of the topic improved with a very large effect size (d = 1.6).

Participants in the online survey and qualitative interviews reported that they expected a course to be interactive, relevant to practice and multi-media. For example, participants appreciated the course videos and acknowledged that they made the course engaging. This aligns with research findings on EBM education [27,28,29,30,31] and principles of effective teaching [26] that support interactive and case-based learning. For example, Ilic et al. 2015 reported that participants preferred the YouTube videos to the lectures delivered for EBM teaching and found them engaging [39]. Similarly, Cranney 2001 reported that GPs who participated in an educational program for evidence-based hypertension management preferred topics that were relevant to their daily practice [11].

Registrars identified multiple barriers to and facilitators of implementing EBM. Of those, some were GP-related, including GPs’ perceptions of EBM and their comfort with what they already knew; some were workplace- and system-related, including the influence of supervisors, time, and access to resources; and one was related to patient preferences. The factors identified in this study were similar to those influencing EBM practice reported in previous research [5, 6, 11, 12]. In this study, only one participant identified the Pharmaceutical Benefits Scheme (PBS) as a system barrier to prescribing new evidence-based medicines; this barrier has not been identified in earlier studies. Below we discuss the factors that might be addressed by educational interventions.

Given that GPs’ practice behaviours are established early [40] and remain stable over time, ensuring a supportive work environment for practicing EBM is likely to be important for translating learning from the course into practice. The influence of supervisors could be harnessed to promote EBM in practice in future educational interventions. In the current study, participants who were positive about EBM stated that their supervisors had advised using research evidence in their practice. One of the ways GP registrars learn about prescribing is through teaching sessions and discussions with supervisors [41]. Previous studies have also shown that GP registrars’ practices are likely to be influenced by the decisions of senior supervisors [18], and that their practicing behaviours are likely to resemble those of their supervisors [19]. In addition, the relationship between supervisors and GP registrars can play a role in the process of practicing EBM. For example, in the current study, some registrars stated that supervisors wanted them to follow their advice without critically evaluating it. This can be a challenging situation for GP registrars because of the differences in power and experience between registrars and supervisors. Consistent with this finding, a study of barriers to practicing EBM among GP registrars found that registrars may lose their motivation to seek answers to their clinical questions if the supervisor does not grant them sufficient autonomy [20].

Thus, further research might examine the effect of courses that incorporate discussion of what GP registrars might face in practice and how to respond. Research on interventions for GP supervisors to facilitate and encourage the practice of EBM by their registrars could also be beneficial. Given that discussion sessions between registrars and supervisors are an opportunity for registrars to learn, collaborative learning discussions could be used to develop and translate EBM skills into practice while creating a safe, equal and collaborative environment between registrars and supervisors [42]. A previous study reported that this approach was useful for promoting EBM [42].

The effectiveness of future EBM training courses might be improved by providing strategies for responding to patient preferences that are at odds with research evidence. For example, a non-randomised controlled trial showed that GP registrars, who took part in online training on improving communication with patients, prescribed significantly fewer antibiotics compared to the control group [43].

Relevance of theory to the study findings

As the objective of the course was to change registrars’ behaviours, we considered our results in relation to two frameworks developed for behaviour change. We first compared our findings with the theoretical domains proposed to identify the processes involved in changing health professionals’ behaviour toward practicing EBM [44]. This framework identified specific domains directly relevant to the implementation of EBM training. We then applied an overarching framework, the behaviour change wheel (BCW), which was developed to improve the design and implementation of behaviour change interventions in general [45].

Michie et al. (2005) reviewed the evidence on motivation theories (e.g. Social Cognitive Theory), action theories (e.g. Learning Theory) and organisation theories (e.g. Goal Theory) for explaining the process of behaviour change [44]. Twelve domains were identified, nine of which were relevant to our findings. Table 3 illustrates our study findings across these nine domains.

Table 3 Theoretical domains underpinning the study findings

In addition, relating the behaviour change wheel (BCW) framework [45] to our findings suggests that the course addressed two of the three essential conditions for behaviour change: ‘capability’, ‘opportunity’ and ‘motivation’. By providing two intervention functions (education and training), the course increased GP registrars’ knowledge, understanding and skills in EBM, and consequently their ‘capability’ (physical and psychological) to practice EBM. It also provided an ‘opportunity’ (physical) for registrars to practice what they had learnt, because it was implemented when learners were exposed to clinical cases (environmental restructuring). However, the conditions of ‘motivation’ (both automatic and reflective) and ‘psychological opportunity’ were not identified as part of the course impact. According to the BCW framework, the two intervention functions of modelling and persuasion (using communication to induce positive or negative feelings) can be used to increase automatic and reflective motivation [45]. Considering the influence of supervisors on registrars [19], supervisors could serve as role models to deliver the modelling function. In addition, by reducing barriers and increasing means (e.g. behavioural support), future courses could increase opportunity and reflective motivation [45].


The current study suggests that an engaging online course that is relevant to GPs’ daily practice can provide appropriate EBM training for GP registrars. Supervisors could have a considerable impact on registrars’ motivation and ability to translate EBM skills into practice by encouraging registrars and actively engaging in equal, interactive discussion around EBM. Future studies might explore the addition of simultaneous supervisor training. In addition, future EBM courses might consider training registrars on how to respond to patients whose preferences are at odds with research evidence. Our findings were consistent with nine of the domains Michie et al. proposed for behaviour change among health professionals. This suggests that an online course may be effective if it improves knowledge, skills, confidence and beliefs about consequences. Some factors may act as barriers or enablers, such as how GPs see themselves in their professional role, memory, routine, social influences, and environmental context and resources.


There are several limitations to this study that should be acknowledged. The study only provided data on learner perspectives after completion of the course. Objective measures of knowledge, attitudes and behaviours were not collected from a control group that received no training or training via a different educational modality. Data were also not collected before the intervention to measure changes over time. Consequently, the study is vulnerable to social desirability bias (telling us what we want to hear) and recall bias.

The online survey achieved a response rate of only 34%, which raises the possibility of responder bias. This is, however, a reasonable response rate for a survey of GPs without incentives [46].

The telephone interview sample was homogeneous and lacked a full spectrum of variation. In particular, only one participant was trained overseas, and the experiences of overseas-trained registrars could differ from those of locally trained registrars. Since data collection ceased before data saturation, we may have missed some information; however, there was good saturation for most themes. Further, the study used triangulation to compare results from the survey and qualitative interviews and found them generally consistent.


This study showed that an online, interactive, multi-media training course that aimed to enable GP registrars to use research evidence in their practice improved registrars’ EBM knowledge, confidence, skills and behaviour. The findings of this study can help make future educational interventions effective. We conclude that an interactive and engaging online course with relevant practical scenarios can be effective in teaching EBM. Further, this study highlights the importance of the supervisor’s role in GP registrars’ ability to translate the EBM skills they have learnt into practice. Thus, studies might explore how educating supervisors can facilitate their role in supporting registrars to practice EBM. Incorporating training on how registrars can engage with patients regarding evidence might also be considered in future EBM course development. In addition, consideration of theoretical frameworks for developing interventions suggests that investigating the role of motivation, emotion and behaviour change techniques in practicing EBM might be beneficial.

Availability of data and materials

The datasets generated and analysed during the current study are not publicly available because consent was not obtained from study participants for data to be made public but are available from the corresponding author on reasonable request subject to approval from the Human Research Ethics Committee at UNSW Sydney.



Abbreviations

EBM: Evidence-based medicine

GP: General practitioner

PBS: Pharmaceutical Benefits Scheme


  1. Sackett DL, Rosenberg WMC, Gray JAM, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn’t. BMJ. 1996;312(7023):71–2.

  2. Albarqouni L, Hoffmann T, Straus S, Olsen NR, Young T, Ilic D, et al. Core competencies in evidence-based practice for health professionals: consensus statement based on a systematic review and Delphi survey. JAMA Netw Open. 2018;1(2):e180281.

  3. Training curriculum [Internet]. Australian College of Rural and Remote Medicine. [cited 2020 Oct 1].

  4. RACGP 2016 Curriculum [Internet]. RACGP. [cited 2020 Oct 1].

  5. Young JM, Ward JE. Evidence-based medicine in general practice: beliefs and barriers among Australian GPs. J Eval Clin Pract. 2001;7(2):201–10.

  6. Zwolsman S, te Pas E, Hooft L, Wieringa-de Waard M, van Dijk N. Barriers to GPs’ use of evidence-based medicine: a systematic review. Br J Gen Pract. 2012;62(600):e511–21.

  7. Dallas A, van Driel M, Morgan S, Tapley A, Henderson K, Oldmeadow C, et al. Antibiotic prescribing for acute otitis media and acute sinusitis: a cross-sectional analysis of the ReCEnT study exploring the habits of early career doctors in family practice. Fam Pract. 2017;34(2):180–7.

  8. Magin PJ, Morgan S, Tapley A, Davis JS, McArthur L, Henderson KM, et al. Reducing general practice trainees’ antibiotic prescribing for respiratory tract infections: an evaluation of a combined face-to-face workshop and online educational intervention. Educ Prim Care. 2016;27(2):98–105.

  9. McColl A, Smith H, White P, Field J. General practitioners’ perceptions of the route to evidence based medicine: a questionnaire survey. BMJ. 1998;316(7128):361–5.

  10. Te Pas E, van Dijk N, Bartelink MEL, Wieringa-De WM. Factors influencing the EBM behaviour of GP trainers: a mixed method study. Med Teach. 2013;35(3):e990–7.

  11. Cranney M, Warren E, Barton S, Gardner K, Walley T. Why do GPs not implement evidence-based guidelines? A descriptive study. Fam Pract. 2001 Aug;18(4):359–63.

  12. Taylor J, Wilkinson D, Blue IA, Dollard JT. Evidence-based rural general practice: barriers and solutions in South Australia. Rural Remote Health. 2002;2(1):116.

  13. van Dijk N, Hooft L, Wieringa-de WM. What are the barriers to residents’ practicing evidence-based medicine? A systematic review. Acad Med. 2010 Jul;85(7):1163–70.

  14. Mahmoud MA, Laws S, Kamel A, Al Mohanadi D, Al Mohammed A, Mahfoud ZR. Examining aptitude and barriers to evidence-based medicine among trainees at an ACGME-I accredited program. BMC Med Educ. 2020;20(1):414.

  15. Australian Commission on Safety and Quality in Health Care. Patient-centred care: improving quality and safety through partnerships with patients and consumers [Internet]. Sydney: ACSQHC; 2011. [cited 2020 Sep 3].

  16. Magin P, Morgan S, Wearne S, Tapley A, Henderson K, Oldmeadow C, et al. GP trainees’ in-consultation information-seeking: associations with human, paper and electronic sources. Fam Pract. 2015 Oct 1;32(5):525–32.

  17. Kortekaas MF, Bartelink MLEL, Boelman L, Hoes AW, de Wit NJ. General practice trainees’ information searching strategies for clinical queries encountered in daily practice. Fam Pract. 2015 Oct;32(5):533–7.

  18. Dallas A, van Driel M, van de Mortel T, Magin P. Antibiotic prescribing for the future: exploring the attitudes of trainees in general practice. Br J Gen Pract. 2014 Sep;64(626):e561–7.

  19. Deckx L, Anthierens S, Magin PJ, Morgan S, McArthur L, Yardley L, et al. Focus on early-career GPs: qualitative evaluation of a multi-faceted educational intervention to improve antibiotic prescribing. Fam Pract. 2018;35(1):99–104.

  20. Green ML, Ruff TR. Why do residents fail to answer their clinical questions? A qualitative study of barriers to practicing evidence-based medicine. Acad Med. 2005 Feb;80(2):176–82.

  21. Ahmadi SF, Baradaran HR, Ahmadi E. Effectiveness of teaching evidence-based medicine to undergraduate medical students: a BEME systematic review. Med Teach. 2015 Jan;37(1):21–30.

  22. Horsley T, Hyde C, Santesso N, Parkes J, Milne R, Stewart R. Teaching critical appraisal skills in healthcare settings. Cochrane Database Syst Rev. 2011;(11):CD001270.

  23. Richmond H, Copsey B, Hall AM, Davies D, Lamb SE. A systematic review and meta-analysis of online versus alternative methods for training licensed health care professionals to deliver clinical interventions. BMC Med Educ. 2017 Nov 23;17(1):227.

  24. Curran V, Lockyer J, Sargeant J, Fleet L. Evaluation of Learning Outcomes in Web-Based Continuing Medical Education. Acad Med. 2006;81(Suppl):S30–4.

  25. Cook DA, Beckman TJ, Thomas KG, Thompson WG. Adapting web-based instruction to residents’ knowledge improves learning efficiency: a randomized controlled trial. J Gen Intern Med. 2008 Jul;23(7):985–90.

  26. Boettcher JV. Ten core principles for designing effective learning environments: insights from brain research and pedagogical theory. Innovate: Journal of Online Education. 2007;3(3):Article 2. [cited 2020 Jul 15].

  27. Hatala R, Guyatt G. Evaluating the teaching of evidence-based medicine. JAMA. 2002 Sep 4;288(9):1110–2.

  28. Hatala R. Is evidence-based medicine a teachable skill? Ann Emerg Med. 1999 Aug 1;34(2):226–8.

  29. Rolfe IE, Sanson-Fisher RW. Translating learning principles into practice: a new strategy for learning clinical skills. Med Educ. 2002;36(4):345–52.

  30. McLean SF. Case-Based Learning and its Application in Medical and Health-Care Fields: A Review of Worldwide Literature. J Med Educ Curr Dev. 2016;3:JMECD.S20377.

  31. Thistlethwaite JE, Davies D, Ekeocha S, Kidd JM, MacDougall C, Matthews P, et al. The effectiveness of case-based learning in health professional education. A BEME systematic review: BEME guide no. 23. Med Teach. 2012;34(6):e421–44.

  32. Fieschi M, Soula G, Giorgi R, Gouvernet J, Fieschi D, Botti G, et al. Experimenting with new paradigms for medical education and the emergence of a distance learning degree using the internet: teaching evidence-based medicine. Med Inform Internet Med. 2002;27(1):1–11.

  33. Kulier R, Hadley J, Weinbrenner S, Meyerrose B, Decsi T, Horvath AR, et al. Harmonising evidence-based medicine teaching: a study of the outcomes of e-learning in five European countries. BMC Med Educ. 2008 Apr 29;8(1):27.

  34. Huang H-L, Chou C-P, Leu S, You H-L, Tiao M-M, Chen C-H. Effects of a quasi-experimental study of using flipped classroom approach to teach evidence-based medicine to medical technology students. BMC Med Educ. 2020 Jan 31;20(1):31.

  35. Kulier R, Gülmezoglu AM, Zamora J, Plana MN, Carroli G, Cecatti JG, et al. Effectiveness of a clinically integrated e-learning course in evidence-based medicine for reproductive health training: a randomized trial. JAMA. 2012 Dec 5;308(21):2218–25.

  36. GP Synergy Research and Evaluation Strategic Plan [Internet]. Sydney: GP Synergy; 2017. [cited 2020 Sep 3].

  37. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.

  38. Liamputtong P. Rigour and ethics in qualitative research. In: Qualitative research methods. 3rd ed. South Melbourne: Oxford University Press; 2009. p. 19–40.

  39. Ilic D, Bin Nordin R, Glasziou P, Tilson JK, Villanueva E. A randomised controlled trial of a blended learning education intervention for teaching evidence-based medicine. BMC Med Educ. 2015;15:39.

  40. Björnsdóttir I, Kristinsson KG, Hansen EH. Diagnosing infections: a qualitative view on prescription decisions in general practice over time. Pharm World Sci. 2010 Dec;32(6):805–14.

  41. Ajjawi R, Thistlethwaite JE, Aslani P, Cooling NB. What are the perceived learning needs of Australian general practice registrars for quality prescribing? BMC Med Educ. 2010 Dec;10(1):92.

  42. Welink LS, de Groot E, Pype P, Van Roy K, van den Wittenboer ID, Bartelink M-LEL, et al. GP trainees’ perceptions on learning EBM using conversations in the workplace: a video-stimulated interview study. BMC Med Educ. 2020;20(1):139. [cited 2021 Jan 13].

  43. Magin P, Tapley A, Morgan S, Davis JS, McElduff P, Yardley L, et al. Reducing early career general practitioners’ antibiotic prescribing for respiratory tract infections: a pragmatic prospective non-randomised controlled trial. Fam Pract. 2018;35(1):53–60.

  44. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A, et al. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14(1):26–33.

  45. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6(1):42.

  46. Brtnikova M, Crane LA, Allison MA, Hurley LP, Beaty BL, Kempe A. A method for achieving high response rates in national surveys of U.S. primary care physicians. PLOS ONE. 2018;13(8):e0202755.


The research team would like to thank the GP registrars who participated in this study and shared their time and experience for the purpose of this research.


The project was conducted with funding from GP Synergy Ltd.

Author information

Authors and Affiliations



NZ, CS, MH and STL contributed to the conception and design of the project. HR-A collected the qualitative data and conducted the data analysis. CS, CT, MH, PM and STL contributed to the data analysis. CS provided study and research supervision. HR-A and CS drafted the manuscript, and all authors contributed to revising it and providing feedback. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Catherine Spooner.

Ethics declarations

Ethics approval and consent to participate

Ethical approval was obtained from the Human Research Ethics Committee at UNSW Sydney (Approval number HC16483), and all methods were performed in accordance with the relevant guidelines and regulations. All participants provided informed consent prior to participation in the study.

Consent for publication

Not applicable.

Competing interests

GP Synergy Ltd. contracted UNSW, Sydney to develop the online course that is the subject of this paper. Specifically, CS, MH, CWMT and NZ were involved in the development of the modules. PM is employed by GP Synergy Ltd. No authors have a financial interest in the online course.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1 Table S1

Topics, lessons and learning objectives of the six modules. Fig. S1 Examples of the course screens. Table S2 Response to an open-ended question on what participants liked most about the course (n = 160). Table S3 Responses to an open-ended question on what participants would like to see changed in the course (n = 98).

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit The Creative Commons Public Domain Dedication waiver ( applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article

Rahimi-Ardabili, H., Spooner, C., Harris, M.F. et al. Online training in evidence-based medicine and research methods for GP registrars: a mixed-methods evaluation of engagement and impact. BMC Med Educ 21, 492 (2021).
