
The implementation of a quality system in the Dutch GP specialty training: barriers and facilitators; a qualitative study



Quality assurance programs in medical education are introduced to gain insight into the quality of such programs and to trigger improvements. Although of utmost importance, research on the implementation of such programs is scarce. The Dutch General Practice (GP) specialty training institutes used an implementation strategy to implement a quality system (QS), and we aimed to study the success of this strategy and to learn about additional facilitators and barriers.


Seventeen semi-structured interviews were conducted with the directors and quality coordinators (QCs) of the eight Dutch GP training institutes. A five-stage process model of implementation was used to structure the interviews and the analysis. Two researchers analyzed the data using a framework approach.


The strategy supported the institutes in implementing the QS. However, after the introduction of the QS, staff experienced the QS as demanding, although they noticed almost no concrete short-term results. Moreover, they experienced difficulties in integrating the QS into their local situation. Collectively working with the QS and following common deadlines did create a sense of commitment towards each other that appeared to be a true stimulus to the introduction of the QS.


The implementation strategy focused mainly on the introduction of the QS in the GP specialty training, and it was, as such, rather successful. An important barrier concerned the acceptance of the QS and the integration of the QS into local structures, which suggests that there is a need for guidance on the translation of the QS to local contexts. All in all, we recommend more focus on the benefits of a QS.



Quality assurance and improvement in medical education are of paramount importance, not only for the benefit of medical students and future doctors, but above all for patients [1]. Quality assurance in education involves evaluating and improving activities and processes of both teaching and learning [2, 3], and there are differences in the way organizations manage and assess the quality of their education [4, 5]. To create more uniformity internationally and to stimulate improvement and assure minimum quality standards, the World Federation for Medical Education (WFME) has developed global standards [5, 6]. Organizations worldwide use this framework as a model, for example to establish national and regional accreditation objectives [5]. The framework covers all relevant aspects of basic medical education [7], post-graduate medical education [8] and continuing professional development [9].

Quality standards, such as those of the WFME, can serve as a basis for quality systems (QSs). Organizations use QSs to manage quality in a systematic manner. However, the actual implementation is of crucial importance to the success of a QS: innovation failures are often due to unsuccessful implementation rather than perceived ineffectiveness of the innovation [10]. Although many studies have described the implementation of innovations in health care [11], little is known about the implementation processes of QSs in the field of medical education. Because quality management in medical education is growing, there is a need for understanding the implementation process of such systems to identify the barriers to and enablers of improvement [12,13,14].

The implementation of a QS starts with the decision to adopt an intervention [15]. Based on theories that consider implementation to be a step-by-step process, Grol and Wensing (2011) proposed a five-stage model to support the design of an implementation strategy [16]. These stages are known as (1) orientation, (2) insight, (3) acceptance, (4) change, and (5) consolidating change. Ideally, the stages proceed as follows: in the first stage, the people involved receive information about the innovation and become aware of, and get interested in, the innovation, in this case, the new QS. The second stage will help them understand the benefits and get prepared, for example by following training to learn more about the innovation. In the third stage, people will develop a positive attitude, get motivated, and will want to get started: they accept the innovation. People move on to the fourth stage when they start to work with the innovation and experience its advantages. In the final stage, they integrate the innovation into their daily work.

The Dutch General Practice (GP) specialty training institutes (Fig. 1) developed and implemented a QS, named GEAR, to stimulate quality assurance, quality improvement and collaboration between institutes. Based on the wishes of the institutes, GEAR aimed to combine the virtues of two earlier initiatives in order to provide clear and shared assessment criteria. GEAR would also act as an advisory tool to stimulate the quality of the training [17]. A project team was commissioned by the GP specialty training to develop the new QS. This team consisted of four experts in the fields of GP care and quality of care. They were advised by a sounding board, made up of a director of an institute, GP trainers, a trainee, an expert in quality management, and six representatives of professional associations. Additionally, the directors of the eight institutes were closely involved in the development of the new system [17].

Fig. 1

Dutch GP Specialty Training

After listing possible barriers and facilitators, the project team developed an implementation strategy that included four components: (1) the involvement of the directors of training institutes in the development of the new QS (content and process); (2) a web-based, professional, supportive data-entry system including a comprehensive manual; (3) a training program; and (4) a national quality coordinator to support the institutes with improvement activities. Table 1 shows the implementation strategy used for the new QS and its intended effect on the different stages of the model of Grol and Wensing [16]. The aim of this paper is to gain insight into the effect of the strategy and to learn about the additional factors that affect the implementation of a QS in a postgraduate medical specialty training.

Table 1 Components of the implementation strategy used for the QS and their intended effects



The Dutch GP specialty training is a 3-year postgraduate training (Fig. 1). There was a need for one collective and structured system that would serve multiple purposes: quality assurance, quality improvement, and the enhancement of cooperation between the institutes. For this reason, the institutes jointly developed a new QS in 2011. The new QS encompassed self-evaluation, benchmarking among institutes, audits, exchange of good practices, and improvement plans; in addition, a quality coordinator was involved to stimulate the exchange [17] (Fig. 2).

Fig. 2

GEAR figure. GEAR assesses the institutes in seven domains. The domains correspond with the WFME standards, but they have been adapted to the GP specialty training. All domains are assessed once every five years. Quantitative and qualitative assessment methods are used. The introduction of the system starts with self-evaluation and involves deadlines to ensure that all institutes take the different steps at the same time. Semi-annual meetings take place to exchange Good Practices. After the measurement round, institutes design and implement improvement plans


To explore the effects of the preparatory strategy, the directors of all GP specialty training institutes were interviewed, as well as one former director (who was involved in the development and the implementation of the system) and eight quality coordinators (QCs) (n = 17). The directors were involved in the development of the new QS and tasked with the roll-out of the QS at their institutes. The QCs were tasked with coordinating the collection of data at the institutes and were trained to work with the (web-based) QS. Participants were informed that participation was voluntary and that the transcribed interviews would be coded to prevent responses being traceable to individual participants or departments. All participants agreed with this and gave written informed consent.

Data collection

Semi-structured face-to-face interviews were conducted individually at the interviewee’s training institute from May to August 2013, after the first audit round. One researcher (NB) conducted all the interviews. They were audio-recorded and transcribed verbatim.


The interviews were conducted using a semi-structured questionnaire (Table 2). The questionnaire was pilot-tested on two members of one of the GP specialty training institutes, which led to a few adjustments. The resulting questionnaire included a topic list that consisted of six categories: (1) general questions, (2) system-specific questions, (3) questions about the implementation process, (4) questions about the results, (5) evaluation questions, and (6) questions about the support. These six categories were divided into sub-categories, which included keywords (for example: provision of information, relevance, acceptance, skills) that referred to Grol and Wensing's five-stage model ((1) orientation, (2) insight, (3) acceptance, (4) change, and (5) consolidating change) [16]. To get an overall picture, questions about the experiences of staff in the organization were added. In the interviews with the QCs, two more questions were added about how they had been informed and instructed.

Table 2 Structure of interviews

Data analysis

A ‘framework’ approach was used to analyze the data [18]. Upon completion of all interviews, the relevant interview excerpts were identified (NB, SvR) and divided into categories that referred to the theoretical model of implementation [16]. Outcomes were discussed until consensus was reached. Subsequently, the relevant text fragments were electronically coded using MAXqda software to further refine the analysis by grouping similar fragments together and selecting key categories within each stage. The data was also scanned for patterns of similarity or dissimilarity across and within departments and for differences between directors and QCs within the departments.


We interviewed eight directors (four male and four female), one former director (male) and eight QCs (three male and five female). On average, the interviews lasted 80 min, ranging from 45 to 110 min.

Five-stage model

This section describes the findings from the interviews grouped per stage of the model of Grol and Wensing [16]. Table 3 gives an overview of the strategies, barriers, and illustrative quotes.

Table 3 Strategies, barriers and quotes in each stage


All participants confirmed that they were aware that the QS would be introduced, either because there had been a clear announcement or because they had been involved in the development of the QS, or both. The information meeting in which the system was introduced to the staff was evaluated positively, and some participants said that the meeting had motivated them to use the QS. One QC, however, was not motivated by the information meeting: this QC considered a common QS to be of little use, given the differences between the institutes. The QC would have preferred a separate QS for each individual institute, albeit with national support.

The perceptions concerning the relevance of the QS differed. Some participants mentioned that the QS could be a tool for mutual benefit, that it would enhance uniformity between institutes, and that it would show to the outside world that the GP training institutes were quality-driven. Other participants were more focused on internal aspects. For them, the QS helped to focus on quality in a structured way and served as a mirror to gain insight into processes within their own institute.

Although the institutes prepared for the start of the QS, not all institutes felt ready for it. Some participants mentioned that their institute was not able to invest the time and resources they expected would be necessary. Participants said that both they and their staff members therefore felt resistance to the QS. For some of the institutes, the information meeting proved to be helpful in taking away this resistance: “One of the developers of the QS came and spoke. She explained and illustrated the system to all the staff, after which the mood became more positive. It was a good decision to introduce the system in this way.” (QC5)

In sum, the strategies proved helpful for some institutes in motivating staff and reducing resistance. During the orientation stage, we also observed two main barriers: (1) not all the participants saw the relevance of the QS, and (2) not all institutes felt ready to implement the new QS.


All QCs received professional training, and participants confirmed they gained a better understanding of the QS because of the training. They evaluated the training positively, partly because it offered them an opportunity to meet the QCs of other institutes, which created a sense of togetherness. During this training, the QCs were given a comprehensive and informative manual to prepare themselves and others in the institutes, which they said was helpful.

Participants mentioned that the QS was an elegant and well-designed system, that the manual was informative and looked professional, and that the support was provided by a team of professional and experienced advisers. They also indicated that the goal, the meaning and the global processes of the QS were fully clarified to them. Still, participants sometimes had to deal with uncertainties (Table 3: “quotes”). This was the case among the directors more often than among the QCs. Additionally, most participants had low expectations of the system's benefits for their respective institutes. Some of the participants revealed that they did not need the QS to gain insight into the strengths and weaknesses of their own institute because, for example, they already had some kind of quality system. Some directors had negative or neutral attitudes towards the importance of the QS. Nevertheless, the new QS was experienced, by and large, as much better than the previous systems.

The training and the professional system (components 2 and 3 of the implementation strategy) helped the participants to work with the QS: they gained a better understanding of it. However, there were still uncertainties, and the need for a shared QS was not always evident to the individual institutes. The barriers we observed were: (1) participants had a passive attitude with low expectations, (2) not everything was clear to them, and (3) some participants did not see the new QS as an improvement on the prior situation.


Most participants agreed on the goals and assumptions, and on the premise that the GP specialty training institutes, as professional organizations, had to have a collective system. They also accepted the framework of the system, but they did not accept the realization of the system in its entirety. The main criticism of the QS was that it seemed to assess the preconditions for quality rather than quality itself. Participants also doubted the credibility of the benchmarking and the feasibility of exchanging good practices; in addition, some doubted the audit because it was only a snapshot of the institute and therefore, in their view, not reliable.

Some participants mentioned that the system did not suit their institute: “Our institute works with signals coming from the staff; however, the system works top down.” (QC7) In contrast to most directors, who decided to introduce the new QS and were involved in its development, QCs experienced the system as being imposed on them: “Is it a system we introduce because we want to improve ourselves? Or is it a system that is imposed centrally?” (QC3) Therefore, not all QCs accepted the system; they regarded it as something with which they had to comply.

The strategies could not prevent the following bottlenecks at this stage: (1) participants had doubts about the credibility of the QS, (2) the approach did not fit every institute, and (3) the QCs felt the QS was imposed on them. These factors made it difficult for the system to be accepted in its entirety.


The introduction of the system involved many deadlines, to ensure that all institutes would take the different steps at the same time. While the QCs were prepared to work with the QS and felt supported, most also experienced time pressure to meet the deadlines and mentioned the many tasks that had to be performed. QCs felt there was not much room to deviate from the schedule and stated that it was a stressful period. Most directors experienced this more positively.

The participants experienced talking and thinking about quality, both nationwide and at individual institutes, as positive and stimulating. Participants mentioned the following results of the QS: introducing the system provided an opportunity to discuss quality, to look at the organization from a wider perspective, and to tighten local policy. Notwithstanding these benefits, most participants mentioned that their institute invested more in the system than it gained from it. They also stated that concrete results were lacking: “The audit hasn’t brought us new insights, although the investments were very large; we expected to benefit more from it.” (D3)

The barriers we found in the fourth stage were: (1) a lack of flexibility, (2) time pressure, especially for the QCs, and (3) the feeling that the institutes had invested considerably while, so far, gaining few new insights and concrete results.

Consolidating change

After the firm deadlines had passed, attention to the QS faded. At some institutes, the director and the QC had a clear and shared vision of the position of the system in the organization. This was helpful for keeping the system alive and for integrating the QS. However, most participants agreed that it was hard to integrate the QS into the organization. The national QC (component 4 of the implementation strategy) could play an important role in supporting the institutes in developing and implementing the improvement plans after the audit. Although participants mentioned that they had faith in, and high expectations of, the national QC, so far she had not been actively involved.


This study aimed to gain more insight into the effect of the implementation strategy of a QS in a postgraduate medical specialty training. The four main components of the implementation strategy were (1) involvement of the directors of training institutes in the development of the whole system (content and process), (2) a web-based professional supportive system including a comprehensive manual, (3) a coordinated training programme, and (4) a national quality coordinator to support the institutes.

The results indicate that the implementation strategy was successful in preparing the institutes, helping the participants understand the potential benefits of the QS, completing necessary data collection in time, and creating a sense of togetherness in this process. Introducing common deadlines for data collection for all institutes enhanced peer pressure, and the participants indicated they found it stimulating to do this collectively. Our results, therefore, confirm previous findings that peer pressure and a sense of togetherness can contribute to an effective implementation: working together can enhance confidence and motivation, and prevent isolation [12, 15, 19].

Accepting the QS after the introduction, however, appeared to be difficult. This might be due to two factors: the perceived credibility of the new QS and the way the QS suited the local situation of the participating institutes. During the development of this system [17], we already observed that stakeholders doubted that the system was appropriate for measuring quality, and in this study we again observed that the participants were not convinced of this point. However, the literature reports that the people involved in a change-process have to feel the innovation is needed and appropriate [12, 16]. Consequently, we suggest that paying attention to the appropriateness and benefits of a QS for individual institutes and local contexts is important for the introduction of a shared QS.

Acceptance of the QS may also have been difficult because quality coordinators (QCs) felt that the QS was imposed on them and that their investment in the system outweighed the benefits. We suggest it might be advisable to involve QCs and other staff - not just staff representatives - in the development of a QS. This may help build a broader base of support, which is likely to affect the acceptance of the QS positively. The literature confirms that a lack of ownership among staff is one of the biggest challenges in implementing an innovation [12]. Previous studies have also shown that a lack of concrete results is often a reason for an unsuccessful implementation [16, 20]. Our results suggest that the participants experienced the investments in the QS as much larger than the benefits. However, quality improvement has been shown to take between 5 and 10 years to achieve breakthroughs in continuous improvement in an organization's culture [21]. Therefore, it can be helpful to be transparent about the anticipated absence of short-term effects so that staff members can adjust their expectations.

Institutes agreed that integrating the QS was difficult. It seems that they needed more support in using the QS at their own institute. The chosen strategies, however, did not address the integration of the QS with local activities, cultures, and structures. More attention to the translation of the QS to practice therefore seems advisable. The literature also recognizes the importance of the translation of innovation to practice [22] and emphasizes the importance of the adaptability of an intervention to local circumstances. It appears to be difficult to keep the balance between interventions and local needs [15]. We suggest that institutes could benefit more from the QS if they discuss what they need before launch and develop an action/implementation plan accordingly. An implementation plan is helpful in managing the process [21].

Our study has both strengths and limitations. One strength was the use of the theoretical model/framework [16], which was helpful for detecting facilitators and barriers in each of the different stages. A limitation was that we focused only on the directors and the QCs of the training institutes: we asked them about the perspectives of other staff, but we did not approach other staff directly. A second limitation is that we collected data early in the process, which provides only limited insight into the last stage of the implementation process. However, had we delayed the data collection, people might have forgotten their early experiences with the system, and these experiences played an important role in the implementation process.


In summary, this study shows the complexities of implementing a joint QS in the eight postgraduate medical specialty training institutes of the Netherlands, and it reveals several barriers to a successful implementation of such a QS. Practice points distilled from this study can be found in Table 4. More research on the implementation of QSs would be valuable, as more knowledge of the effects of QSs can help convince staff and enhance the acceptance of a QS [12, 16]. Our predetermined implementation strategy focused on the preparation phase and gave little attention to the execution phase. The barriers we found mostly concerned the execution phase, in particular the connection with the local context. More focus on the context in which the institutes operate might have helped the integration of the QS at the individual institutes, and convergence with the local context may also enhance the sustainability of the QS.

Table 4 Practice points





GEAR: Dutch acronym for Combined Evaluation Audit Round

GP: General Practitioner

QC: Quality Coordinator

QS: Quality System

WFME: World Federation for Medical Education


  1. Da Dalt L, et al. A model of quality assurance and quality improvement for post-graduate medical education in Europe. Med Teach. 2010;32(2):e57–64.


  2. Grant D, Mergen E, Widrick S. A comparative analysis of quality management in US and international universities. Total Qual Manag Bus Excell. 2004;15(4):423–38.


  3. Kleijnen J, et al. Does internal quality management contribute to more control or to improvement of higher education?: a survey on faculty’s perceptions. Qual Assur Educ. 2011;19(2):141–55.


  4. van Zanten M, et al. Overview of accreditation of undergraduate medical education programmes worldwide. Med Educ. 2008;42(9):930–7.


  5. Karle H. Global standards and accreditation in medical education: a view from the WFME. Acad Med. 2006;81(12 Suppl):S43–8.


  6. van Niekerk JP. WFME global standards receive ringing endorsement. Med Educ. 2003;37(7):585–6.


  7. WFME. Basic Medical Education WFME Global Standards for Quality Improvement. 2015 [cited 2016 12 May] ; Available from:

  8. WFME. Postgraduate Medical Education WFME global standards for quality improvement 2015 [cited 2016 12 May]; Available from:

  9. WFME. Continuing professional development (CPD) of medical doctors WFME global standards for quality improvement. 2015 [cited 2016 12 May]; Available from:

  10. Klein KJ, Sorra JS. The challenge of innovation implementation. Acad Manag Rev. 1996;21(4):1055–80.


  11. Grimshaw J, et al. Toward evidence-based quality improvement. Evidence (and its limitations) of the effectiveness of guideline dissemination and implementation strategies 1966–1998. J Gen Intern Med. 2006;21(Suppl 2):S14–20.


  12. Dixon-Woods M, McNicol S, Martin G. Ten challenges in improving quality in healthcare: lessons from the Health Foundation’s programme evaluations and relevant literature. BMJ Qual Saf. 2012;21(10):876–84.


  13. Kaplan H, et al. The influence of context on quality improvement success in health care: a systematic review of the literature. Milbank Q. 2010;88(4):500–59.


  14. Lomas J. Using research to inform healthcare managers’ and policy makers’ questions: from summative to interpretive synthesis. Healthc Policy. 2005;1(1):55–71.


  15. Damschroder LJ, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(50):1–15.

  16. Grol R, Wensing M. Implementatie: effectieve verbetering van de patiëntenzorg. Amsterdam: Reed Business; 2011.


  17. Buwalda N, et al. Developing a collective quality system: challenges and lessons learned. BMC Med Educ. In press 2017.

  18. Pope C, Ziebland S, Mays N. Qualitative research in health care. Analysing qualitative data. BMJ. 2000;320(7227):114–6.


  19. Rafferty AE, Jimmieson NL, Armenakis AA. Change readiness: a multilevel review. J Manag. 2012;39(1):110–35.


  20. Seymour D. TQM on campus: what the pioneers are finding. AAHE Bull. 1991;44:10–8.


  21. Leebov W, Jean C, Ersozs CJ. The health care manager’s guide to continuous quality improvement. Lincoln: iUniverse; 2003.


  22. Fokkema JP. Innovating the practice of medical speciality training. Perspect Med Educ. 2016;5(1):48–50.




The authors are sincerely grateful to Prof. Dr. Wieringa-de Waard, one of the developers of GEAR, who provided expertise and insights that greatly assisted the research, and to Simon Muskitta for the design of the GEAR figure. Furthermore, we would like to thank all Dutch GP training institutes for their hospitality and their co-operation.


Stichting BeroepsOpleiding Huisartsen (SBOH).

Availability of data and materials

The datasets generated and/or analyzed during the current study are not publicly available due to individual privacy issues, but they are available from the corresponding author on reasonable request.

Author information

Authors and Affiliations



The first author (NB) gathered and analyzed the data and wrote the manuscript. The second author (JB) supervised the analysis and edited the manuscript. The third author (SvR) analyzed data and edited the manuscript. The fourth author (NvD) supervised the project and edited the manuscript. The fifth author (MV) supervised the analysis and the project and edited the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Nienke Buwalda.

Ethics declarations

Ethics approval and consent to participate

Participants gave informed consent. No further ethical approval was needed according to the Dutch law on medical scientific research (WMO; Wet Medisch-wetenschappelijk Onderzoek).

Consent for publication

Not applicable.

Competing interests

Dr. M.R.M. Visser works at the Dutch General Practitioner (GP) specialty training institute as a quality coordinator. In addition, she is a member of the national GEAR committee that is tasked with further developing and updating the GEAR system.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Buwalda, N., Braspenning, J., van Roosmalen, S. et al. The implementation of a quality system in the Dutch GP specialty training: barriers and facilitators; a qualitative study. BMC Med Educ 17, 127 (2017).
