Power of the policy: how the announcement of high-stakes clinical examination altered OSCE implementation at institutional level

Abstract

Background

The Objective Structured Clinical Examination (OSCE) has been widely adopted as a high-stakes examination for assessing physicians’ clinical competency. The OSCE was first introduced in Taiwan in 1992, and in 2008 the authorities announced that passing the OSCE would become a prerequisite for the step-2 medical licensure examination in 2013. This study aimed to investigate the impact of the announced national OSCE policy on the implementation of the OSCE at the institutional level, and to explore institutional readiness and the recognized barriers to a high-stakes examination.

Methods

In 2007 and 2010, before and after the 2008 announcement of the high-stakes OSCE policy, respectively, questionnaires on the status of OSCE implementation were distributed to all hospitals with active OSCE programs in Taiwan. Information was collected on OSCE facilities, equipment, station length, number of administrations per year, and the recognized barriers to successful OSCE implementation. Missing data were completed through telephone interviews. The OSCE format, administration, and facilities before and after the announcement of the nationwide OSCE policy were compared.

Results

Data were collected from 17 hospitals in 2007 and 21 in 2010. Between 2007 and 2010, the number of stations increased and the station length decreased. The designated space and equipment for the OSCE also improved. Regarding awareness of implementation barriers, hospital representatives were most concerned about the availability and quality of standardized patients in 2007, and about space and facilities in 2010.

Conclusions

The results of this study underscored an overall increase in the number of OSCE hospitals and changes in facilities and formats. While recruitment and training of standardized patients were the major concerns before the official disclosure of the policy, space and facilities became the focus of attention after the announcement. These results highlight the influence of government policy on different aspects of OSCE implementation in Taiwanese training institutes, which showed a high level of support as reflected in the improved hardware and the change in OSCE format to serve a summative purpose.

Background

Since its first implementation at the University of Dundee in the United Kingdom in 1975 [1], the Objective Structured Clinical Examination (OSCE) has been accepted as an important tool for assessing the clinical skills of medical students and healthcare professionals. The first national licensure OSCE was adopted by the Medical Council of Canada (MCC) in 1992 [2], and the format was later adopted by many other countries, such as the United States [3] and Korea [4, 5].

The OSCE was introduced in Taiwan in the early 1990s [6] and was soon adopted as a formative assessment in several medical schools and teaching hospitals [7]. Owing to an increased focus on physicians’ clinical competencies, in 2008 the authorities in Taiwan announced that, beginning in 2013, passing the OSCE would be a prerequisite for taking the written test of the clinical science medical licensure examination (i.e. step 2). The national examination will be a 12-station, audiovisually recorded OSCE held at multiple test sites. To ensure the success of the high-stakes OSCE in 2013, two large-scale pilot OSCEs were conducted nationwide in 2011 and 2012.

Previous studies on the OSCE have mainly focused on experience-sharing [8–10], psychometric reliability [11–13], and reviews of the validity of the OSCE after its implementation [14, 15]. The implementation of a high-stakes OSCE, however, is a formidable challenge to medical educators, especially in countries where the OSCE is still in its infancy. This study therefore aimed to investigate the impact of the announcement of the high-stakes OSCE on its preparation, in terms of hardware and software, format, and awareness, at major clinical educational institutes and medical centers in Taiwan, thereby shedding light on the socio-political influences on the assessment of clinical competencies in medical education and their consequences.

Methods

There were 130 and 129 institutes qualified as teaching hospitals in 2007 and 2010, respectively (i.e. before and after the 2008 announcement of the national high-stakes OSCE). Purposive sampling was performed by distributing questionnaires on the status of OSCE implementation to all teaching hospitals with active OSCE programs during the two periods; the list of such hospitals was compiled through an internet search and confirmed by telephone. All questionnaires were completed by the personnel responsible for OSCE administration, who were expected to be thoroughly familiar with the details of OSCE implementation at their own institutes. Data missing from incomplete questionnaires were subsequently collected by telephone. Respondents were not required to identify themselves or to provide personal opinions. The questions covered OSCE facilities, equipment, station length, and the number of administrations per year.

To investigate the recognized barriers to OSCE success at the institutional level, another part of the questionnaire for each year covered the key components of OSCE implementation: standardized patients (SPs), raters, data analysis, space and equipment, case writing, and administration. Each item was rated on a 5-point Likert-type scale: 1 - remarkable difficulty hampering OSCE implementation; 2 - major obstacle requiring time-consuming efforts to overcome; 3 - minor difficulty not significantly hindering OSCE implementation; 4 - fairly smooth implementation; 5 - implementation with notable success. An item was defined as a recognized barrier to successful OSCE implementation when the institute’s representative gave it a score of 1 or 2, as illustrated in the sketch below.
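
As a hypothetical illustration of this coding rule (the item names are the questionnaire components listed above; the scores are invented for the example, not taken from the survey data), the classification can be expressed in a few lines of Python:

```python
# Hypothetical example: classify questionnaire items as recognized barriers.
# An item counts as a barrier when its Likert score is 1 or 2.
ratings = {
    "standardized patients": 2,
    "raters": 4,
    "data analysis": 3,
    "space and equipment": 1,
    "case writing": 3,
    "administration": 5,
}

recognized_barriers = [item for item, score in ratings.items() if score <= 2]
print(recognized_barriers)  # ['standardized patients', 'space and equipment']
```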

The data acquired before and after the announcement of the nationwide OSCE were treated as two distinct groups and examined for differences with the chi-square test; when the expected value in any cell of the corresponding 2x2 table was less than 5, Fisher’s exact test was used instead to obtain the p-value. A p-value <0.05 was considered statistically significant.
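
A minimal sketch of this testing procedure, written in Python with SciPy (an assumption; the authors’ analysis software is not stated), is given below. The example counts are the audiovisual-equipment figures reported in the Results (5/17 hospitals in 2007 vs 15/21 in 2010); whether a continuity correction was applied in the original analysis is also not stated, so `correction=False` is an assumption.

```python
# Sketch of the comparison described above, using SciPy (assumed; the
# authors' original analysis software is not stated in the paper).
from scipy.stats import chi2_contingency, fisher_exact

def compare_years(yes_before, n_before, yes_after, n_after, alpha=0.05):
    """Compare a binary institutional characteristic between two survey years.

    Uses the chi-square test, falling back to Fisher's exact test when any
    expected cell count of the 2x2 table is below 5.
    """
    table = [
        [yes_before, n_before - yes_before],
        [yes_after, n_after - yes_after],
    ]
    # chi2_contingency also returns the expected frequencies, which determine
    # whether the chi-square approximation is adequate. correction=False is
    # an assumption; the paper does not say whether Yates' correction was used.
    chi2, p, dof, expected = chi2_contingency(table, correction=False)
    if (expected < 5).any():
        _, p = fisher_exact(table)  # exact p-value for sparse 2x2 tables
        test = "Fisher's exact"
    else:
        test = "chi-square"
    return test, p, p < alpha

# Example: audiovisual recording systems in all test rooms (Results section).
test, p, significant = compare_years(5, 17, 15, 21)
print(f"{test}: p = {p:.3f}, significant at 0.05 = {significant}")
```

With these counts, all expected cell frequencies exceed 5, so the chi-square branch applies and yields a p-value close to the p = 0.01 reported in the Results.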

The questionnaire was designed to collect data at the institutional level on an anonymous basis, without soliciting personal opinions. The study was reviewed by the Institutional Review Board (IRB) of Clinical Research Ethics of E-Da Hospital, which granted an exemption from IRB review.

Results

Of the 17 and 23 hospitals with active OSCE programs in 2007 and 2010, respectively, the questionnaire response rates were 100% (17/17) in 2007 and 91.3% (21/23) in 2010. All medical school-affiliated teaching hospitals were included in this survey because, in both years, they hosted OSCEs for either training or assessing the clinical competency of medical students.

Implementation format

The proportion of hospitals that ran OSCEs at least 4 times a year changed little between 2007 and 2010 (64.7% and 61.9%, respectively). In contrast, six hospitals (35.3%) had more than 9 stations per OSCE session in 2007, whereas the number of hospitals with 9 stations or more rose to 13 (61.9%) in 2010. As for station length, about 40% of the hospitals used regular-length stations of 4-11 min in 2007, and this number increased to 14 hospitals (66.7%) in 2010. Despite these apparent changes in OSCE format between 2007 and 2010, reflected in the larger number of stations per session and the preference for regular-length over long stations, the differences did not reach statistical significance (Table 1).

Table 1 Comparison of the differences of formats and facilities in high-stakes OSCE in Taiwan between 2007 and 2010

Space and facilities

Compared with 2007, the facilities and equipment for the OSCE in the surveyed institutes had improved by 2010. In 2007, only 9 hospitals (52.9%) had built standard test rooms for running OSCE programs; this number increased to 16 (76.2%) in 2010, although the increase was not statistically significant (Table 1). By contrast, while only 5 hospitals (29.4%) were equipped with audiovisual recording systems in all test rooms in 2007, that number tripled to 15 (71.4%) in 2010, indicating a significant improvement (p = 0.01) in hardware installation between the two years (Table 1).

Awareness of barriers toward excellence

The awareness of implementation barriers also differed between the two years. In 2007, the hospitals’ most important concern was the recruitment and training of standardized patients (SPs) (7 hospitals, 41.2%), whereas in 2010 the main issues became space and facilities (10 hospitals, 47.6%), followed by the preparation of raters (7 hospitals, 33.3%) and case-writing ability (6 hospitals, 28.6%). The heightened concern about space and facilities was statistically significant (p = 0.02), whereas the changes in the other items were not (Table 2).

Table 2 The awareness of barriers encountered in high-stakes OSCE in Taiwan between 2007 and 2010

Discussion

A high-stakes educational assessment procedure, such as a national OSCE, and its associated official requirements may influence many aspects of routine implementation at clinical education institutes that previously used the OSCE as a formative educational tool. In Taiwan, there had been little change in the OSCE test format during the initial 15 years, from its debut in 1992 to the survey in 2007. This study showed that the official announcement of the high-stakes OSCE as an upcoming national examination played a catalytic role in facilitating OSCE standardization at the institutional level.

The essential features of a high-quality OSCE include a well-planned blueprint with multi-faceted tasks across multiple stations, together with well-trained SPs and raters [16]. It has been reported that increasing the length of the OSCE, carefully selecting and increasing the number of stations (>10), and combining the OSCE with less resource-intensive methods, such as a written test, can improve the reliability of OSCE scores [17, 18].

In 2007, 17 clinical training institutes claimed to be able to run OSCEs regularly for medical students. With the emphasis on assessment for learning, these OSCEs were characterized by a small number of long stations. Owing to limited facilities and space, examinations were performed mainly in clinical areas, with only a limited number of stations held in standard test rooms. Besides facilities, other essential components of an OSCE for summative purposes were also missing.

Immediately after the Ministry of Education in Taiwan officially announced the OSCE as part of the medical licensure examination, preparation for the high-stakes assessment began at medical education institutes. Within the three years from 2007 to 2010, the number of training hospitals capable of running regular OSCEs for medical students increased by 35.3% (from 17 to 23), with a concomitant rise in the number of stations and a shortening of station length. In addition, the number of institutes with designated test rooms for the OSCE increased from 9 (52.9%) in 2007 to 16 (76.2%) in 2010, and the number of hospitals with audiovisual equipment rose substantially from 5 (29.4%) to 15 (71.4%) (p = 0.01). These findings highlight an overall increase in the number of institutes with active OSCE programs and a significant improvement, mainly in test-room facilities, following decisions to increase financial investment in compliance with the accreditation criteria for national test centers of the high-stakes OSCE.

Although the number of stations per track, the duration of each station, and the use of standardized test rooms tended to follow the summative format, there was no significant difference in the number of institutes that ran the OSCE at a high frequency (i.e. ≥4 times a year); indeed, the number of such institutes was higher in 2010 than in 2007. These findings suggest a stage of preparation in response to the policy: retaining a high examination frequency not only familiarized trainees with the format of the high-stakes OSCE and provided experience for potential raters and SPs, but also accumulated experience at the institutional level. Taken together, all the major clinical educational institutes in Taiwan showed willingness to improve OSCE quality toward a high-stakes level and to offer OSCE practice for their trainees.

Differences were also noted between the two years in the difficulties recognized during OSCE implementation. In 2007, the recruitment and training of SPs were the major concerns, whereas space and facilities, as well as case writing, were the major weaknesses perceived in 2010. The difference may be explained by the formative nature of the OSCE in 2007, with its strong emphasis on communication skills, physical examination, and professional behaviors. SPs were assigned the duty of providing quality feedback to candidates in scenarios that were commonly complex and emotionally charged; they were asked to develop fine performance skills and to be highly articulate. Since SPs were the key determinants of a successful learning experience, their training was regarded as a difficult task. After the announcement of the OSCE policy, by contrast, competing for accreditation became a major concern in 2010. In most clinical training centers, however, space was used mainly for clinical service, without adequate rooms designated for educational assessment equipped with sophisticated audiovisual recording systems. Space and facilities therefore became the most important concern for the education institutes. Moreover, establishing a bank of case scenarios for the high-stakes OSCE required high-quality contributions from the faculty of each institute, which posed another challenge to the training centers (Table 2).

A limitation of this study is the difficulty of completely excluding the passage of time, rather than the policy announcement itself, as a potential confounder of the observations. However, since little change had been observed during the 15 years between the introduction of the OSCE and 2007, it is reasonable to attribute the rapid improvement in facilities between 2007 and 2010 to the announcement of the high-stakes OSCE policy.

Conclusions

The results of this study underscored an overall increase in the number of hospitals capable of running the OSCE and a significant improvement in facilities after the announcement of the high-stakes OSCE policy. Some aspects of the format, including the number of stations and the station length, also changed, although without statistical significance. While the recruitment and training of SPs were the major concerns before the official disclosure of the policy, space and facilities gained the most attention after the announcement. The results indicate a high level of support from the medical training institutes in Taiwan for the upcoming high-stakes OSCE, as reflected in the significant improvement in hardware, the shift of the OSCE format in the summative direction, and the high frequency of OSCE training at the medical education institutes to familiarize themselves with the high-stakes format. The difficulties recognized in this survey may serve as a reference for medical educators implementing a high-stakes OSCE.

Abbreviations

OSCE:

Objective structured clinical examination

SPs:

Standardized patients

References

  1. Harden RM, Stevenson M, Downie WW, Wilson GM: Assessment of clinical competence using objective structured examination. Br Med J. 1975, 1 (5955): 447-451. 10.1136/bmj.1.5955.447.

  2. Reznick RK, Blackmore D, Dauphinee WD, Rothman AI, Smee S: Large-scale high-stakes testing with an OSCE: report from the Medical Council of Canada. Acad Med. 1996, 71 (1 Suppl): S19-21.

  3. Whelan GP, Boulet JR, McKinley DW, Norcini JJ, van Zanten M, Hambleton RK, Burdick WP, Peitzman SJ: Scoring standardized patient examinations: lessons learned from the development and administration of the ECFMG Clinical Skills Assessment (CSA®). Med Teach. 2005, 27 (3): 200-206. 10.1080/01421590500126296.

  4. Kim KS: Introduction and administration of the clinical skill test of the medical licensing examination, Republic of Korea (2009). J Educ Eval Health Prof. 2010, 7: 4.

  5. Lee Y-S: OSCE for the Medical Licensing Examination in Korea. Kaohsiung J Med Sci. 2008, 24 (12): 646-650. 10.1016/S1607-551X(09)70030-0.

  6. Lee K-T, Liu W-T, Yen J-H, Liu C-K, Liu K-M, Lai C-S: The Experience of an Objective, Structured Clinical Examination at Kaohsiung Medical University. Kaohsiung J Med Sci. 2008, 24 (12): 624-626. 10.1016/S1607-551X(09)70026-9.

  7. Liu M, Huang YS, Liu KM: Assessing core clinical competencies required of medical graduates in Taiwan. Kaohsiung J Med Sci. 2006, 22 (10): 475-483. 10.1016/S1607-551X(09)70341-9.

  8. Sibert L, Mairesse J-P, Aulanier S, Olombel P, Becret F, Thiberville J, Peron J-M, Doucet J, Weber J: Introducing the objective structured clinical examination to a general practice residency programme: results of a French pilot study. Med Teach. 2001, 23 (4): 383-388. 10.1080/01421590120031048.

  9. Davis MH: OSCE: the Dundee experience. Med Teach. 2003, 25 (3): 255-261. 10.1080/0142159031000100292.

  10. Vargas AL, Boulet JR, Errichetti A, van Zanten M, López MJ, Reta AM: Developing performance-based medical school assessment programs in resource-limited environments. Med Teach. 2007, 29 (2-3): 192-198. 10.1080/01421590701316514.

  11. Roberts J, Norman G: Reliability and learning from the objective structured clinical examination. Med Educ. 1990, 24 (3): 219-223. 10.1111/j.1365-2923.1990.tb00004.x.

  12. Newble D: Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ. 2004, 38 (2): 199-203. 10.1111/j.1365-2923.2004.01755.x.

  13. Roberts C, Newble D, Jolly B, Reed M, Hampton K: Assuring the quality of high-stakes undergraduate assessments of clinical competence. Med Teach. 2006, 28 (6): 535-543. 10.1080/01421590600711187.

  14. Hodges B: Validity and the OSCE. Med Teach. 2003, 25 (3): 250-254. 10.1080/01421590310001002836.

  15. Vallevand A, Violato C: A predictive and construct validity study of a high-stakes objective clinical examination for assessing the clinical competence of International Medical Graduates. Teach Learn Med. 2012, 24 (2): 168-176. 10.1080/10401334.2012.664988.

  16. Norcini J, Boulet J: Methodological issues in the use of standardized patients for assessment. Teach Learn Med. 2003, 15 (4): 293-297. 10.1207/S15328015TLM1504_12.

  17. Newble DI, Swanson DB: Psychometric characteristics of the objective structured clinical examination. Med Educ. 1988, 22 (4): 325-334. 10.1111/j.1365-2923.1988.tb00761.x.

  18. Brannick MT, Erol-Korkmaz HT, Prewett M: A systematic review of the reliability of objective structured clinical examination scores. Med Educ. 2011, 45 (12): 1181-1189. 10.1111/j.1365-2923.2011.04075.x.

Author information

Correspondence to Tsuen-Chiuan Tsai.

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

CWL designed the study, participated in the data acquisition and analysis, and drafted the manuscript. TCT participated in the study design, interpreted the data, and revised the manuscript critically. CKS participated in analysis and interpretation of data, and drafted the manuscript. DFC participated in the data acquisition and revised the manuscript critically. KML participated in the study design and coordination, and revised the manuscript critically. All authors read and approved the final manuscript.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite this article

Lin, CW., Tsai, TC., Sun, CK. et al. Power of the policy: how the announcement of high-stakes clinical examination altered OSCE implementation at institutional level. BMC Med Educ 13, 8 (2013). https://doi.org/10.1186/1472-6920-13-8
