
The effect of implementing undergraduate competency-based medical education on students’ knowledge acquisition, clinical performance and perceived preparedness for practice: a comparative study

Abstract

Background

Little is known about the gains and losses associated with the implementation of undergraduate competency-based medical education. Therefore, we compared knowledge acquisition, clinical performance and perceived preparedness for practice of students from a competency-based active learning (CBAL) curriculum and a prior active learning (AL) curriculum.

Methods

We included two cohorts from the AL curriculum (n = 453) and two cohorts from the CBAL curriculum (n = 372). Knowledge acquisition was determined by benchmarking each cohort on 24 interuniversity progress tests against parallel cohorts of two other medical schools. Differences in knowledge acquisition were determined by comparing the number of times CBAL and AL cohorts scored significantly higher or lower on progress tests. Clinical performance was operationalized as students’ mean clerkship grade. Perceived preparedness for practice was assessed using a survey.

Results

The CBAL cohorts demonstrated relatively lower knowledge acquisition than the AL cohorts during the first study years, but not at the end of their studies. We found no significant differences in clinical performance. Concerning perceived preparedness for practice, we found no significant differences, except that students from the CBAL curriculum felt better prepared for ‘putting a patient problem in a broad context of political, sociological, cultural and economic factors’ than students from the AL curriculum.

Conclusions

Our data do not support the assumption that competency-based education results in graduates who are better prepared for medical practice. More research is needed before we can draw generalizable conclusions on the potential of undergraduate competency-based medical education.


Background

In response to societal concerns about the role of doctors in contemporary healthcare, competency-based medical education is receiving increasing attention worldwide [1–9]. Its underlying assumption is that competency-based medical education results in doctors who are better prepared for medical practice [10]. In Canada and the United States, the national accreditation councils have implemented competency-based criteria for postgraduate medical education [1, 11]. Additionally, a competency framework has been proposed and guidelines have been developed for undergraduate competency-based medical education [5, 12, 13]. In the European Union, as part of the Bologna process, all medical schools are required to base their undergraduate curricula on a clear and well-defined set of competencies [14]. A major focus of competency-based curricula is to facilitate students’ development of competencies, that is, demonstrable abilities consisting of knowledge, skills and professional behaviour. Consequently, when implementing competency-based medical education, curriculum time has to be reserved for students’ competency development [2, 15]. This leaves less time available for activities carried over from the preceding curriculum, so such a reallocation of time may not only facilitate competency development but may also impair students’ development in other areas. To our knowledge, the gains and losses associated with implementing undergraduate competency-based curricula are still unknown. Therefore, we examined undergraduate medical students’ knowledge acquisition, clinical performance and perceived preparedness for medical practice for two curricula: a competency-based active learning (CBAL) curriculum and its predecessor, a regular active learning (AL) curriculum.

Undergraduate medical curricula usually have a set duration. When implementing competency-based education, curriculum time has to be reserved so students can develop their competencies. The time reserved for activities aimed at competency development will usually come at the expense of time previously reserved for knowledge acquisition [15]. This reallocation of time may negatively affect students’ knowledge acquisition in a competency-based curriculum. Although medical students’ knowledge has not been found to be an immediate predictor of clinical performance, it does impact clinical performance indirectly [16]. To allow for well-informed decisions about curriculum innovations, it should be clear whether implementing undergraduate competency-based medical education leads to knowledge loss among medical students.

One of the key forces behind competency-based medical education is the public call for medical curricula to reflect the needs of contemporary medical practice [1, 15, 17, 18]. Competency frameworks therefore comprehensively reflect what a competent doctor should be able to demonstrate in practice [19], and should benefit students’ preparation for medical practice. Throughout competency-based curricula, the relevant competencies and their relation to practice are continuously emphasized, which helps students to understand what is expected of them during medical training and in medical practice [3, 12]. Consequently, students should feel better prepared for practice, which, in turn, is a prerequisite for self-efficacy: the extent to which a person believes that he or she can successfully fulfil a specific task in a specific context. Self-efficacy is of key importance for developing competence and autonomy in practice [20, 21]. We expected students from a CBAL curriculum to feel better prepared for medical practice and to perform better during clerkships than students from an AL curriculum, in which an underlying competency framework is less explicit.

The possible gains and losses associated with the implementation of an undergraduate competency-based medical curriculum provide valuable information for future curriculum development and add to the theory of competency-based medical education. Therefore, we examined the influence of implementing a competency-based curriculum on medical students’ knowledge acquisition, clinical performance and perceived preparedness for medical practice.

Methods

Context

The AL and the CBAL curriculum were developed and implemented at the University of Groningen, The Netherlands. Characteristics of both curricula are presented in Table 1. The CBAL curriculum was implemented in September 2003 and focuses on seven areas of competence: communication, clinical problem-solving, using basic knowledge and science, patient investigation, patient management, social and community contexts of health care and reflection [22].

Table 1 Characteristics of the Active Learning and Competency-Based Active Learning curriculum at the UMCG

In both curricula, active learning principles are applied to facilitate knowledge acquisition. Students learn in small groups, collaborate with their peers and engage in self-directed learning. Teachers and tutors fulfil a coaching and facilitating role [23].

Learning methods and the amount of time reserved for skills training are similar in both curricula. However, in the AL curriculum skills training is divided over smaller courses throughout the preclinical phase, whereas skills training in the CBAL curriculum is concentrated in the first year of the clinical phase. During this year, five-week periods of skills training in the clinical training centre are alternated with five-week clerkship rotations. The purpose of this alternation is to ease the transition from the preclinical to the clinical phase by helping students develop their skills, just in time, to apply them in practice and to further integrate them with knowledge and professional behaviour [24].

The main difference between the two curricula lies in the emphasis on competency development. In the CBAL curriculum, the link between the purpose of each course and the relevant competencies is clearly communicated throughout the course. This is not the case in the AL curriculum. Furthermore, 15% of the total CBAL curriculum time is reserved specifically for small group sessions aimed at competency development. Time for these sessions was created by reducing the number of small group sessions originally aimed at knowledge acquisition in the AL curriculum. The total curriculum time remains the same.

Throughout the preclinical phase of the CBAL curriculum, small group sessions for competency development are based on students’ experiences in practice and assignments related to each area of competence. An example of such an assignment is that first-year students, unfamiliar with medical practice, have to describe the qualities of a good doctor. In their third study year the students repeat this assignment and reflect on what they have learnt and experienced in the meantime. Other assignments are related to activities in intramural or extramural practice, for example an internship in a nursing home or consecutive interviews with a chronically ill patient. The competency development sessions are facilitated by a senior faculty member and are scheduled six to eight times a year. Additionally, students have to collect their assignments in a portfolio, on which they receive feedback twice a year.

During the clinical phase, sessions aimed at competency development are scheduled 24 times a year. During these sessions students discuss their own experiences and certain themes in relation to their development (for example cultural diversity or dealing with death). In addition to assignments related to these meetings, students have to keep track of a personal development plan in their portfolio in which they formulate learning goals based on the areas of competence. During the clinical phase the portfolio is evaluated twice a year in an interview with a senior staff member.

The curriculum time reserved for clerkships is 80 weeks in both curricula. Students in the CBAL curriculum rotate through fewer disciplines than students in the AL curriculum. In the AL curriculum, clerkship duration varies between one and eight weeks and students rotate through 22 disciplines. When designing the CBAL curriculum, we felt that the aim of clerkships had shifted from experiencing as many disciplines as possible towards a balance between diversity and a stable environment that supports students’ competency development. Consequently, in the CBAL curriculum the minimum duration of a clerkship rotation was extended to four weeks, to allow students sufficient time to work on their competencies, and the number of clerkship rotations was reduced to 15. Furthermore, the final clerkship rotation is a clinical elective, the duration of which was increased from 13 weeks in the AL curriculum to 20 weeks in the CBAL curriculum.

Participants

Undergraduate medical education in The Netherlands lasts 6 years. We included students who graduated within 7 years from the start of the last 2 cohorts of the AL curriculum (2001/2002 and 2002/2003; N = 453) and the first 2 cohorts of the CBAL curriculum (2003/2004 and 2004/2005; N = 372).

Ethical statement

Data were gathered at a time when, under Dutch law, educational studies were exempt from Institutional Review Board approval. At that time, no ethical review board for medical education research existed in the Netherlands. However, data gathering was carried out in accordance with established ethical standards and the Declaration of Helsinki [25–27]. The privacy policy of the University of Groningen states that student records can be used for research purposes, as long as reports cannot be traced back to individual students [28]. In accordance with this privacy policy, anonymized data were obtained from the university administration.

Instruments

Knowledge acquisition was assessed by benchmarking our cohorts’ scores on the Dutch interuniversity progress test (IPT) against those of parallel cohorts from two other Dutch medical schools with similar cohort sizes (approximately 250 students per cohort). All cohorts sat the IPT four times per year at the same time, i.e. 24 tests per cohort. The IPT is based on the Dutch National Blueprint for the Medical Curriculum, and is designed to assess “the end objectives of undergraduate medical training as far as knowledge is concerned” [29, 30]. Each progress test contains 200 multiple choice questions and is constructed to reflect the entire domain of medical knowledge. The IPT is not related to the curriculum of one particular institution [30]. The reason for benchmarking against two other medical schools was that all students sat exactly the same tests at the same point in their education. IPT benchmarking is especially suitable for analysing effects of curriculum changes because, at the time of our study, admittance to medical schools in the Netherlands was still primarily determined by a national lottery system [31]. This system guarantees an intake of first-year students which is very similar across medical schools with regard to past performance, age, gender and motivation to study medicine [32]. Over the period of our study the medical schools used for comparison had not changed their curricula.

Clinical performance was operationalized as students’ average clerkship grade. In both curricula clinical assessment was identical: each clerkship grade was based on several mini-CEX scores. Mini-CEX scores are sufficiently reliable to estimate clinical competence [33]. In both curricula, grades were given on a 10-point scale.

To measure perceived preparedness for medical practice, we used data from an internal quality control survey, measuring how prepared students feel in each area of competence. Perceived preparedness was measured for 33 competencies (Table 2), using a 5-point scale (1 = ‘insufficiently prepared’, 5 = ‘excellently prepared’). Our medical school considers a mean score between 4 and 5 as excellently prepared, between 3 and 4 as well-prepared and below 3 as insufficiently prepared.

Table 2 Means, standard deviations and t-statistics for perceived preparedness of graduates from two curricula
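To make the scoring bands described above concrete, here is a minimal Python sketch (ours, not part of the study) that maps a mean item score on the 5-point scale to the school’s interpretation categories. The handling of scores of exactly 3.0 or 4.0 is an assumption, and the two example items simply reuse CBAL means reported in the Results for illustration.

```python
# Minimal sketch (not the authors' code): apply the school's interpretation
# bands to mean preparedness scores on the 5-point scale described above.
# Assumption: a score of exactly 4.0 or 3.0 falls into the higher band.

def preparedness_band(mean_score: float) -> str:
    """Map a mean score (1-5 scale) to the interpretation band used by the school."""
    if mean_score >= 4.0:
        return "excellently prepared"
    if mean_score >= 3.0:
        return "well prepared"
    return "insufficiently prepared"

if __name__ == "__main__":
    # Example items with CBAL means reported in the Results section.
    examples = {
        "treat a patient with respect and confidentiality": 4.62,
        "following relevant legal regulations": 3.37,
    }
    for item, mean in examples.items():
        print(f"{item}: {mean:.2f} -> {preparedness_band(mean)}")
```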

Analysis

To analyse students’ knowledge acquisition, we used a method based on the first steps of the longitudinal benchmarking methods described by Muijtjens et al. [34]. We compared our students’ average score on each test to that of the students from the other medical schools, using t-tests. The 24 means were plotted in a graph for each cohort. When our students scored significantly higher or lower, an upward (↑) or downward (↓) arrow, respectively, was drawn in the graph. A Bonferroni correction was applied to compensate for the high number of tests, and effect sizes were calculated.
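As an illustration of this per-test benchmarking step, the sketch below (Python, synthetic scores; not the authors’ analysis code) runs an independent-samples t-test for a single progress test, applies a Bonferroni-corrected threshold across the 24 tests of a cohort, and computes Cohen’s d. The base significance level of 0.05, the score distributions and the cohort sizes are assumptions made purely for illustration.

```python
# Illustrative sketch of the per-test benchmarking described above; synthetic data.
import numpy as np
from scipy import stats

N_TESTS = 24                 # progress tests per cohort
ALPHA = 0.05 / N_TESTS       # Bonferroni-corrected threshold (base alpha of 0.05 assumed)

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Effect size based on the pooled standard deviation."""
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                        / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled_sd

def benchmark(umcg: np.ndarray, other: np.ndarray) -> str:
    """Return '↑', '↓' or '=' for one progress test, as marked in Figure 1."""
    _, p = stats.ttest_ind(umcg, other)
    if p >= ALPHA:
        return "="
    return "↑" if umcg.mean() > other.mean() else "↓"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    umcg = rng.normal(loc=32.0, scale=10.0, size=200)    # hypothetical UMCG cohort
    other = rng.normal(loc=30.0, scale=10.0, size=500)   # pooled comparison cohorts
    print(benchmark(umcg, other), f"d = {cohens_d(umcg, other):.2f}")
```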

We compared clinical performance and perceived preparedness for medical practice in the CBAL and AL curriculum using independent sample t-tests. With regard to perceived preparedness for medical practice, we first calculated the internal consistency of the scales using Cronbach’s α. Subsequently, curricula were compared on the mean scores for both items and scales using an α of 0.01 and effect sizes were calculated.
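The following sketch, again with invented data, shows one way to carry out this step: Cronbach’s α per scale, an independent-samples t-test on the scale means at α = 0.01, and Cohen’s d as the effect size. The item counts, response distributions and variable names are assumptions for illustration only.

```python
# Illustrative sketch of the scale-level comparison described above; synthetic data.
import numpy as np
from scipy import stats

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of one scale; rows = respondents, columns = items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Effect size based on the pooled standard deviation."""
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                        / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled_sd

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Hypothetical 4-item scale, 5-point responses: 177 CBAL and 172 AL respondents.
    cbal = rng.integers(2, 6, size=(177, 4)).astype(float)
    al = rng.integers(2, 6, size=(172, 4)).astype(float)
    print(f"Cronbach's alpha (CBAL): {cronbach_alpha(cbal):.2f}")
    t, p = stats.ttest_ind(cbal.mean(axis=1), al.mean(axis=1))
    print(f"t = {t:.2f}, p = {p:.3f}, significant at 0.01: {p < 0.01}, "
          f"d = {cohens_d(cbal.mean(axis=1), al.mean(axis=1)):.2f}")
```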

Results

Knowledge acquisition

The AL cohorts scored significantly higher on 10 (2001–2002; ES 0.30–0.57) and 14 progress tests (2002–2003; ES 0.27–0.66) and significantly lower on 1 progress test (2001–2002; ES 0.31) than cohorts from the other two medical schools. The CBAL cohorts scored significantly higher on 2 progress tests (2003–2004; ES 0.30 and 0.34) and significantly lower on 2 (2003–2004; ES 0.24 and 0.27) and 4 progress tests (2004–2005; ES 0.23–0.44) than cohorts from the other two medical schools (Figure 1). None of the 4 cohorts scored significantly different on the last three tests of the final year.

Figure 1 Mean progress test scores of UMCG cohorts compared to those from two other medical schools. Mean scores (Y-axis) of the UMCG cohorts (solid line) from the AL curriculum (2001/2002 and 2002/2003) and the CBAL curriculum (2003/2004 and 2004/2005) compared to the combined mean scores of the cohorts from two other medical schools (dashed line) on 24 progress tests (X-axis). A downward arrow (↓) or an upward arrow (↑) marks the UMCG scoring significantly lower or higher than the other two schools, respectively.

Clinical performance

We did not find a significant difference between the clinical performance of students from the CBAL curriculum (Mean = 7.91; SD = 0.28) and the AL curriculum (Mean = 7.87; SD = 0.35; t(823) = −1.540; p = 0.124).

Perceived preparedness for medical practice

Of the students in the CBAL and AL curricula, 177 (48%) and 172 (46%) completed the survey, respectively. Respondents and non-respondents were similar in gender distribution (74% and 70% female, respectively) and mean clinical performance (Mean = 7.89; SD = 0.29 and Mean = 7.88; SD = 0.33, respectively). The internal consistency of the scales ranged from 0.70 to 0.86 (Table 2). Graduates from the CBAL curriculum felt excellently prepared for 10 and well prepared for 23 competencies. Graduates from the AL curriculum felt excellently prepared for 11 and well prepared for 22 competencies. Students from both curricula felt best prepared to treat a patient with respect and confidentiality (MeanAL = 4.51; MeanCBAL = 4.62) and worst prepared for following relevant legal regulations (MeanAL = 3.33; MeanCBAL = 3.37). At scale level, students felt excellently prepared for communication and well prepared in the other areas of competence; we found no significant differences at scale level. At item level, students from the CBAL curriculum felt significantly better prepared than students from the AL curriculum for putting a patient problem in a broad context of political, sociological, cultural and economic factors (t(347) = −2.90; p = 0.004; ES = 0.31).

Discussion

The aim of our study was to analyse the effects of the implementation of a competency-based active learning (CBAL) curriculum as compared to the previous active learning (AL) curriculum. Using progress test results, we found relatively less knowledge acquisition in the first years of the CBAL curriculum than in the first years of the AL curriculum. However, we did not find such a difference in the final year. Graduates who had been trained in the CBAL curriculum did not score higher on clinical performance, nor did they feel better prepared for medical practice.

Implementing competency-based education requires that curriculum time is reserved for activities that facilitate competency development. As more time is allocated to the development of competencies, less time will be devoted to other curricular activities. In undergraduate curricula these activities usually involve knowledge acquisition. As a consequence, implementing a CBAL curriculum bears the risk of knowledge loss. We analysed students’ knowledge acquisition by comparing the scores of CBAL and AL cohorts on 48 progress tests to those of parallel cohorts from two other medical schools, which had not changed their curricula during the time of our study. Our assumption was that if our students’ relative position remained unchanged, there would have been no knowledge loss. In comparison to the cohorts of the other medical schools, our AL cohorts scored significantly higher on 50% of the progress tests (24 out of 48), whereas our CBAL cohorts scored significantly higher on only 4% of the tests (2 out of 48). However, at the end of undergraduate education the CBAL and the AL cohorts demonstrated similar knowledge acquisition. The effect sizes of the differences were small to medium. As we interpret the outcomes of the progress tests as trends per cohort rather than results per test, we feel the effect sizes are large enough to conclude that students in the AL curriculum showed higher knowledge acquisition than students in the CBAL curriculum in the first years of their undergraduate education. Reserving time for competency development at the expense of time reserved for knowledge acquisition seems to lead to lower knowledge acquisition in the short term, but not in the long term.

Throughout the medical curriculum, knowledge plays an important part in expertise development [16, 35, 36]. As the CBAL cohorts seldom scored lower than the comparison cohorts and no long-term differences were found, we consider a permanent negative impact of implementing competency-based education on student learning and expertise development unlikely. An explanation for this finding might be that the clinical environment encourages students to regulate their own learning [37]. During clerkships students are repeatedly stimulated to remedy deficiencies in medical knowledge. Undergraduate students’ prior knowledge deficiencies appear to be overcome during their clerkships.

We expected CBAL students to perform better in clinical practice than AL students. However, we did not find a significant difference, which may indicate that implementation of competency-based education has no effect on clinical performance. A possible explanation for this finding may be that all students must be competent to work with real patients at the start of their clerkships, which restricts differentiation among students [38]. This homogeneity among clerks may explain why our clerks were mainly scored at the high end of the scale by their supervisors. Thus, we may have found no difference between the CBAL and the AL curriculum due to a restriction of range, caused by the requirements for entering the clinical phase.

We expected the CBAL students to feel better prepared for medical practice. To analyse students’ perceived preparedness, we used survey data collected at graduation. The only difference we found between the two curricula was related to one of the core aims of competency-based medical education. Students from the CBAL curriculum felt better prepared to put a patient problem in a broad context of political, sociological, cultural and economic factors, which is in line with the aim to educate medical professionals who are sufficiently responsive to societal needs [1, 15, 17, 18]. It is also in line with the focus of competency-based medical education on the development of professionals in a societal context [2, 3, 5, 12, 19]. However, we were unable to demonstrate any other effects of the implementation of competency-based education on students’ perceived preparedness.

The fact that we did not find a general increase in students’ perceived preparedness for medical practice may be related to the educational tools we implemented to facilitate competency development: portfolio use and explicit communication of competencies and their underlying framework. A recent study by Sargeant et al. revealed that explicit communication of competencies and the use of portfolios help students to achieve informed self-assessment [39]. Students in the CBAL curriculum are frequently informed of what is expected of them and they are explicitly stimulated to reflect on their performance, to remedy their deficiencies and to formulate points of improvement. The awareness that follows from these activities may help students to become increasingly conscious of their deficiencies. Possibly, CBAL students were more aware of their competencies and incompetencies than AL students, which is an important step in the development of competence [40]. Consequently, the CBAL students may have underestimated their preparedness for practice as compared to AL students. Further research is needed to analyse the influence of implementing a CBAL curriculum on students’ reflectiveness and, subsequently, on their self-assessment.

A possible limitation of our study is that it is a single-site study, which affects the generalizability of our results. However, comparing curricula from the same institution has the advantage that most variables can be controlled. When the CBAL curriculum was introduced, teaching staff and learning methods remained largely unchanged. Consequently, our data were gathered in the same context, which increases the likelihood that possible effects can be attributed to the implementation of the CBAL curriculum. Nevertheless, more studies are needed before generalizable conclusions can be drawn. Furthermore, our measurement of perceived preparedness had a limited response rate of 47%. However, respondents and non-respondents were similar in gender distribution and clinical performance, which suggests that the sample was representative of the overall population.

Another limitation of our study might be that the measures we used – knowledge acquisition, clinical performance and perceived preparedness – are not specific to competency-based education. One could argue that for studying the effectiveness of competency-based education, measures are needed that fit conceptually. In our curricula, clinical competence was mainly assessed using global judgements. For research purposes, specific judgements may do more justice to the complexity of competencies. However, in this study such information was not available.

Finally, our study was limited to measurements during the course of undergraduate medical training and at graduation. Possibly, effects of competency-based education will become more apparent after graduation, in actual practice. Further research is needed to determine the long-term effects of implementing competency-based education at the undergraduate level. Despite the limitations of our study, we consider our outcome measures relevant because of their relation to performance in actual medical practice [16, 21]. Irrespective of the curriculum, medical graduates are expected to have sufficient knowledge and skills to practice professionally. Therefore, our study yields valuable information on the effect of implementing undergraduate competency-based education.

Conclusion

Implementing competency-based education in our undergraduate medical curriculum resulted neither in clerks who scored higher on clinical performance nor in graduates who felt better prepared for practice at the end of their training. Our study shows that there is some knowledge loss in the first study years of a CBAL curriculum as compared to the previous curriculum. Our study does not support the assumption that competency-based curricula result in graduates who are better prepared for medical practice. However, since this is one of the first studies in the field, it is too early to draw generalizable conclusions. More research is needed before we can conclude whether or not competency-based education meets the high expectations associated with its widespread implementation.

Authors’ information

Wouter Kerdijk, MSc, is a Psychologist and Researcher in Medical Education at the Center for Research and Innovation in Medical Education, University of Groningen and University Medical Center Groningen, Groningen, The Netherlands.

Jos W. Snoek, MD, PhD, neurologist, is professor in Clinical Education and Director of the Master of Medical Science Program at the Institute for Medical Education, University of Groningen and University Medical Center Groningen, Groningen, The Netherlands.

Elisabeth A. Van Hell, PhD, is an Educationalist at the Institute for Medical Education, University of Groningen and University Medical Center Groningen, Groningen, The Netherlands.

Janke Cohen-Schotanus, PhD, is professor in Research in Medical Education and Head of the Center for Research and Innovation in Medical Education, University of Groningen and University Medical Center Groningen, Groningen, The Netherlands.

References

1. Frank JR, Danoff D: The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach. 2007, 29: 642-647. 10.1080/01421590701746983.
2. Frank JR, Snell LS, Ten Cate O, Holmboe ES, Carraccio C, Swing SR, Harris P, Glasgow NJ, Campbell C, Dath D, Harden RM, Iobst W, Long DM, Mungroo R, Richardson DL, Sherbino J, Silver I, Taber S, Talbot M, Harris KA: Competency-based medical education: theory to practice. Med Teach. 2010, 32: 638-645. 10.3109/0142159X.2010.501190.
3. Harden RM: Outcome-based education: the future is today. Med Teach. 2007, 29: 625-629. 10.1080/01421590701729930.
4. Simpson JG, Furnace J, Crosby J, Cumming AD, Evans PA, David MF, Harden RM, Lloyd D, McKenzie H, McLachlan JC, McPhate GF, Percy-Robb I, MacPherson SG: The Scottish doctor – learning outcomes for the medical undergraduate in Scotland: a foundation for competent and reflective practitioners. Med Teach. 2002, 24: 136-143. 10.1080/01421590220120713.
5. Smith SR, Dollase R: AMEE guide No. 14: outcome-based education: Part 2 - planning, implementing and evaluating a competency-based curriculum. Med Teach. 1999, 21: 15-22. 10.1080/01421599979978.
6. Moon Y: Education reform and competency-based education. Asia Pac Educ Rev. 2007, 8 (2): 337-341. 10.1007/BF03029267.
7. Albanese MA, Mejicano G, Anderson WM, Gruppen L: Building a competency-based curriculum: the agony and the ecstasy. Adv Health Sci Educ. 2010, 15: 439-454. 10.1007/s10459-008-9118-2.
8. Ten Cate O: Medical education in the Netherlands. Med Teach. 2007, 29: 752-757. 10.1080/01421590701724741.
9. Scheele F, Teunissen P, Luijk SV, Heineman E, Fluit L, Mulder H, Meininger A, Wijnen-Meijer M, Glas G, Sluiter H, Hummel T: Introducing competency-based postgraduate medical education in the Netherlands. Med Teach. 2008, 30: 248-253. 10.1080/01421590801993022.
10. Frank JR: The CanMEDS 2005 Physician Competency Framework. Better Standards. Better Physicians. Better Care. 2005, Ottawa, ON: Royal College of Physicians and Surgeons of Canada.
11. Accreditation Council for Graduate Medical Education (ACGME): Common Program Requirements: General Competencies. http://www.acgme.org/acgmeweb/tabid/83/ProgramandInstitutionalGuidelines.aspx.
12. Albanese MA, Mejicano G, Mullan P, Kokotailo P, Gruppen L: Defining characteristics of educational competencies. Med Educ. 2008, 42 (3): 248-255. 10.1111/j.1365-2923.2007.02996.x.
13. The Medical School Objectives Writing Group: Learning objectives for medical student education - guidelines for medical schools: report I of the medical school objectives project. Acad Med. 1999, 74: 13-18.
14. González J, Wagenaar R: Tuning Educational Structures in Europe II: Universities’ Contribution to the Bologna Process – Phase 2. 2005, Bilbao & Groningen: University of Deusto & University of Groningen.
15. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C: Shifting paradigms: from Flexner to competencies. Acad Med. 2002, 77: 361-367. 10.1097/00001888-200205000-00003.
16. Norman GR, Tugwell P, Feightner JW, Muzzin LJ, Jacoby LL: Knowledge and clinical problem-solving. Med Educ. 1985, 19: 344-356. 10.1111/j.1365-2923.1985.tb01336.x.
17. Murray TJ: Medical education and society. CMAJ. 1995, 153: 1433-1436.
18. Lee AG: The new competencies and their impact on resident training in ophthalmology. Surv Ophthalmol. 2003, 48: 651-662. 10.1016/j.survophthal.2003.08.009.
19. Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T: Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach. 2010, 32: 631-637. 10.3109/0142159X.2010.500898.
20. Bandura A: Social Foundations of Thought and Action: A Social Cognitive Theory. 1986, Englewood Cliffs, NJ: Lawrence Erlbaum.
21. Mann KV: Thinking about learning: implications for principle-based professional education. J Contin Educ Health Prof. 2002, 22: 69-76. 10.1002/chp.1340220202.
22. Heineman MJ, Schuling J, Wiersma H, Briët JW, Cohen-Schotanus J, Hiemstra RJ, Karg A, Kroese FGM, Post D, Snoek JW, Strating WJ: Blueprint G2010: Revised Medical Curriculum: Competencies, Assessment, Differentiation. 2004, Groningen: Institute for Medical Education, University of Groningen.
23. Van Rossum HJM, Cohen-Schotanus J, Hulstaert C, Mantingh A, Poppema S, Roerdink F, Zwierstra R: Curriculum 2000: The Patient-Oriented Curriculum at the University of Groningen. 2000, Groningen: University of Groningen, Faculty of Medical Sciences.
24. Van Hell EA, Kuks JBM, Borleffs JCC, Cohen-Schotanus J: Alternating skills training and clerkships to ease the transition from preclinical to clinical training. Med Teach. 2011, 33: e689-e696. 10.3109/0142159X.2011.611837.
25. Ten Cate O: Why the ethics of medical education research differs from that of medical research. Med Educ. 2009, 43: 608-610. 10.1111/j.1365-2923.2009.03385.x.
26. Eva KW: Research ethics requirements for medical education. Med Educ. 2009, 43: 194-195. 10.1111/j.1365-2923.2008.03285.x.
27. World Medical Association (WMA): World Medical Association Declaration of Helsinki: ethical principles for medical research involving human subjects. http://www.wma.net/en/30publications/10policies/b3/17c.pdf.
28. University of Groningen: Regeling bescherming persoonsgegevens studenten en personeel (privacy policy for students and personnel). http://www.rug.nl/bureau/expertisecentra/abjz/producten/pdf/regelingBeschermingPersoonsgegevens.pdf.
29. Van Herwaarden CLA, Laan RFJM, Leunissen RRM: The 2009 Framework for Undergraduate Medical Education in the Netherlands. 2009, Utrecht: Dutch Federation of University Medical Centres.
30. Van der Vleuten CPM, Schuwirth LWT, Muijtjens AMM, Thoben AJNM, Cohen-Schotanus J, Van Boven CPA: Cross institutional collaboration in assessment: a case on progress testing. Med Teach. 2004, 26: 719-725. 10.1080/01421590400016464.
31. Schuwirth L, Bosman G, Henning RH, Rinkel R, Wenink ACG: Collaboration on progress testing in medical schools in the Netherlands. Med Teach. 2010, 32: 476-479. 10.3109/0142159X.2010.485658.
32. Schmidt HG, Cohen-Schotanus J, Arends LR: Impact of problem-based, active learning on graduation rates for 10 generations of Dutch medical students. Med Educ. 2009, 43: 211-218. 10.1111/j.1365-2923.2008.03287.x.
33. Wimmers PF, Schmidt HG, Splinter TAW: Influence of clerkship experiences on clinical competence. Med Educ. 2006, 40: 450-458. 10.1111/j.1365-2929.2006.02447.x.
34. Muijtjens AMM, Schuwirth LWT, Cohen-Schotanus J, Thoben AJNM, van der Vleuten CPM: Benchmarking by cross-institutional comparison of student achievement in a progress test. Med Educ. 2008, 42 (1): 82-88.
35. Schmidt HG, Norman GR, Boshuizen HPA: A cognitive perspective on medical expertise: theory and implications. Acad Med. 1990, 65: 611-621. 10.1097/00001888-199010000-00001.
36. Neville AJ, Norman GR: PBL in the undergraduate MD Program at McMaster University: three iterations in three decades. Acad Med. 2007, 82: 370-374. 10.1097/ACM.0b013e318033385d.
37. Gordon J, Hazlett C, ten Cate O, Mann K, Kilminster S, Prince K, O’Driscoll E, Snell L, Newble D: Strategic planning in medical education: enhancing the learning environment for students in clinical settings. Med Educ. 2000, 34: 841-850. 10.1046/j.1365-2923.2000.00759.x.
38. Van Hell EA, Kuks JBM, Schönrock-Adema J, van Lohuizen MT, Cohen-Schotanus J: Transition to clinical training: influence of pre-clinical knowledge and skills, and consequences for clinical performance. Med Educ. 2008, 42: 830-837. 10.1111/j.1365-2923.2008.03106.x.
39. Sargeant J, Eva KW, Armson H, Chesluk B, Dornan T, Holmboe E, Lockyer JM, Loney E, Mann KV, van der Vleuten CPM: Features of assessment learners use to make informed self-assessments of clinical performance. Med Educ. 2011, 45: 636-647. 10.1111/j.1365-2923.2010.03888.x.
40. Schwenk TL, Whitman N: The Physician as Teacher. 1987, Baltimore: Williams & Wilkins.


Acknowledgements

We are grateful to Mrs. J. Bouwkamp-Timmer for her critical and constructive comments on several drafts of the manuscript and for her editorial help.

Author information

Corresponding author

Correspondence to Wouter Kerdijk.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

All authors were involved in the conception and design of this study. WK gathered and analysed the data. All authors interpreted the data together and were involved in drafting and revising the manuscript. All approved the final manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Kerdijk, W., Snoek, J.W., van Hell, E.A. et al. The effect of implementing undergraduate competency-based medical education on students’ knowledge acquisition, clinical performance and perceived preparedness for practice: a comparative study. BMC Med Educ 13, 76 (2013). https://doi.org/10.1186/1472-6920-13-76
