  • Research article
  • Open access

Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica

Abstract

Background

The Faculty of Medical Sciences, University of the West Indies first implemented the Objective Structured Clinical Examination (OSCE) in the final MB Examination in Medicine and Therapeutics during the 2000–2001 academic year. Simultaneously, the Child Health Department initiated faculty and student training, and instituted the OSCE as an assessment instrument during the Child Health (Paediatric) clerkship in year 5. The study set out to explore student acceptance of the OSCE as part of an evaluation of the Child Health clerkship.

Methods

A self-administered questionnaire was completed by successive groups of students immediately after the OSCE at the end of each clerkship rotation. Main outcome measures were student perceptions of examination attributes, which included the quality of instructions and organisation, the quality of performance testing, the authenticity and transparency of the process, and the usefulness of the OSCE as an assessment instrument compared with other formats.

Results

There was overwhelming acceptance of the OSCE in Child Health with respect to its comprehensiveness (90%), transparency (87%), fairness (70%) and the authenticity of the required tasks (58–78%). However, students felt that it was a strongly anxiety-producing experience, and concerns were expressed regarding the ambiguity of some questions and the inadequacy of the time allowed for the expected tasks.

Conclusion

Student feedback was invaluable in influencing faculty teaching, curriculum direction and appreciation of student opinion. Further psychometric evaluation will strengthen the development of the OSCE.

Background

The assessment of students' clinical competence is of paramount importance, and there are several means of evaluating student performance in medical examinations [1, 2]. The Objective Structured Clinical Examination (OSCE) is an approach to student assessment in which aspects of clinical competence are evaluated in a comprehensive, consistent and structured manner, with close attention to the objectivity of the process [3]. The OSCE was introduced by Harden in 1975 [4], and first described as an assessment format in Paediatrics (Child Health) by Waterson and colleagues [5]. Since its inception, the OSCE has been increasingly used to provide formative and summative assessment in various medical disciplines worldwide [6], including non-clinical disciplines [7].

The University of the West Indies was established in 1948 as a medical college of the University of London, which granted external degrees to those who successfully completed the course [8]. The Faculty of Medical Sciences, located on four campuses on the islands of Jamaica, the Bahamas, Barbados, and Trinidad and Tobago, conducts biannual final examinations at the end of year 5. The 'traditional' examination format, which included a long case, short cases and an oral examination, was preserved until recent changes in the curriculum. In response to recommendations to improve the validity and fairness of the examination through the adoption of proven methods and approaches to assessment and evaluation in medical education, the Faculty of Medical Sciences (FMS), University of the West Indies (UWI) introduced the OSCE as a formal method of assessment for the final examination in Medicine and Therapeutics, Child Health, Community Health and Psychiatry in November 2000. Students and faculty were thus exposed for the first time to a relatively new assessment instrument in which aspects of competence (communication, history-taking and technical skills) were assessed in a structured, formal manner.

The Section of Child Health, Mona, Jamaica, implemented the OSCE as an end-of-clerkship assessment for students in their 5th year during the 1999–2000 academic year. This was considered timely in order to (a) direct and motivate student learning in areas not previously assessed in the 'traditional' curriculum, (b) verify students' competence in fundamental paediatric clinical skills, and (c) provide a forum for feedback to students on their strengths and weaknesses in clinical skills. It was also thought that this would enhance faculty and student acceptance of the new assessment tool and promote faculty training for the newly introduced final OSCE examination.

In the absence of any previous information from this institution, the study was designed to evaluate students' overall perception of the end-of-clerkship OSCE, determine student acceptance of the process, and provide feedback to enhance further development of the assessment.

Methods

The OSCE comprised a circuit of thirteen stations, which involved the completion of a number of tasks such as examining a system, eliciting a focussed history, counselling or communicating a problem, performing a procedure, and problem-solving oriented around patient and laboratory data and photographic material (Figure 1). The areas assessed included cardiovascular, respiratory, abdominal, neurological, developmental, dysmorphism and nutrition. This assessment format allowed the controlled exposure of students to a wide variety of paediatric clinical skills within a relatively short time period. Each station was of 7 minutes' duration, with the exception of the 14-minute history-taking station. One minute was allowed between stations to facilitate changeover and the reading of instructions. With the inclusion of strategically placed rest stations to reduce student and patient fatigue, all students completed the circuit within a 2-hour period.

Figure 1. Plan of OSCE circuit

A standardised, criterion-referenced marking scheme was used to assess student performance at each station, with each checklist item scored as 0 (omitted, incorrect or inadequate) or 1–2 (correct or adequate).
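
To make this scoring scheme concrete, the following Python sketch shows how a station score could be derived from such a checklist; the items, their maximum marks and the simple unweighted sum are purely illustrative assumptions, as the paper does not publish its checklists or describe how item scores were aggregated.

    # Hypothetical checklist for a single station: item -> maximum mark.
    # Items and maxima are illustrative only, not taken from the paper.
    checklist_max = {
        "washes hands and introduces self": 1,
        "inspects the praecordium": 1,
        "palpates the apex beat correctly": 2,
        "auscultates all four areas": 2,
        "summarises findings accurately": 2,
    }

    # Marks awarded by the examiner: 0 = omitted/incorrect/inadequate,
    # 1-2 = correct/adequate, up to each item's maximum.
    awarded = {
        "washes hands and introduces self": 1,
        "inspects the praecordium": 1,
        "palpates the apex beat correctly": 2,
        "auscultates all four areas": 1,
        "summarises findings accurately": 0,
    }

    total = sum(awarded.values())
    maximum = sum(checklist_max.values())
    print(f"Station score: {total}/{maximum} ({100 * total / maximum:.0f}%)")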

Face and content validity of each checklist was established by review and consensus by a core group of senior paediatricians. Stations were first selected to represent the curricular goals and objectives and to reflect authentic clinical situations. Checklists were designed to include the features thought to be most important by the development committee. Through discussions, consensus was achieved on the checklist items and structure.

The study was conducted during the period July 2001 to December 2002. Five groups of students participated in the process during their respective clerkship rotations. Each group had at least two briefing sessions before the OSCE, which included an orientation to the examination process (both end-of-clerkship and final MB) and a review of commonly assessed competences. Students were also apprised of the valuable contribution they could make towards improving the assessment and were encouraged to participate in the evaluation.

A cross-sectional survey using a 32-item self-administered questionnaire was conducted at the end of each OSCE [9]. Students were asked to evaluate the content, structure and organisation of the OSCE, to rate the quality of performance testing and the objectivity of the OSCE process, and to give their opinion on the usefulness of the OSCE as an assessment instrument compared with other formats they had experienced (essays, multiple choice questions, long and short cases, general clerkship rating).

Participation was on a voluntary basis and students were assured that those who declined involvement in the survey would not be penalised. The Curricular Affairs Section handled the administration and analysis of the questionnaires. Ethical approval was received from the University Hospital of the West Indies/University of the West Indies Faculty of Medical Sciences Ethics Committee. Following completion of the questionnaire, an OSCE review session was conducted with the students at the end of the clerkship for feedback and teaching purposes. Students were given the opportunity to review their individual performances at the respective stations. Examiner evaluations were also used in the feedback process.

Data were collated, and descriptive and non-parametric tests were applied using Stata version 7 [10]. Basic statistical analysis of the Likert items was conducted by calculating frequencies, means and standard deviations. Qualitative analysis took the form of content analysis: themes were identified in the student responses, which were then grouped according to thematic content. Two of the authors conducted this content analysis independently, and the identified themes and final grouping of responses were agreed by consensus.
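
The Likert-item summaries described above were produced in Stata 7; as a rough illustration of the same calculations, the Python sketch below computes frequencies, means, standard deviations and percentage agreement, using hypothetical item names and an assumed 1–5 agreement coding that are not taken from the questionnaire.

    import pandas as pd

    # Hypothetical responses: one row per student, one column per Likert item,
    # coded 1 (strongly disagree) to 5 (strongly agree).
    responses = pd.DataFrame({
        "exam_was_fair":      [4, 5, 3, 4, 2, 5, 4],
        "covered_wide_range": [5, 5, 4, 4, 5, 3, 4],
        "time_was_adequate":  [2, 1, 3, 2, 2, 1, 3],
    })

    # Frequency of each response category, per item.
    frequencies = responses.apply(lambda col: col.value_counts().sort_index())

    # Mean and standard deviation, per item.
    summary = responses.agg(["mean", "std"]).T

    # Percentage of students agreeing (response of 4 or 5), per item.
    percent_agree = (responses >= 4).mean() * 100

    print(frequencies.fillna(0).astype(int))
    print(summary.round(2))
    print(percent_agree.round(1))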

Results

OSCE evaluation

Eighty-one students responded to the questionnaire, representing 92% (81/88) of those who completed the clerkship.

The majority of students agreed that the OSCE was comprehensive and covered a wide range of knowledge (95%) and clinical competencies (86%) in Child Health. Over three-quarters (78%) also agreed that the assessment process helped to identify weaknesses and gaps in their competencies (Table 1).

Table 1 OSCE evaluation

Most (73–82%) felt that the examination was well administered and that the stations were organised and well sequenced.

Students believed that the assessment was fair (68%). Fifty-three percent were aware of the level of information required at each station, yet only 28% felt that the examination process minimised their chances of failing.

Students found the OSCE to be intimidating (48%) and more stressful (35%) than other assessment formats to which they had previously been exposed, and most (70%) felt that they needed more time to complete the stations.

Performance testing

The majority of students felt that they had been well oriented to the examination and that the required tasks were consistent with the curriculum they had been taught. They also felt that the process was fair, but were less satisfied with the time allocated to each station (Table 2).

Table 2 Quality of performance testing

Most saw the OSCE as a useful learning experience and felt that the content reflected real-life situations in Child Health. More than half of the students were satisfied with the conduct, organisation and administration of the OSCE.

Perception of validity and reliability

Although half of the students believed that the scores were standardised, they were unsure whether their scores were an actual reflection of their paediatric clinical skills (Table 3). Student responses to the question about bias due to gender, personality or ethnicity were not interpretable.

Table 3 Student perception of validity and reliability

Comparing assessment formats

Students were asked to rate the following assessment instruments to which they had been exposed: multiple choice questions, essays/short answer questions, general clerkship ratings, and the OSCE. A Likert scale was used to assess each according to the evaluative labels (Table 4).

Table 4 Student rating of assessment formats

Thirty-two percent of students felt that the clerkship rating was the easiest, while 48% rated MCQs as a more difficult form of assessment. The OSCE was overwhelmingly considered the fairest assessment format (80%), followed by essays (68%). The OSCE (60%) and clerkship ratings (62%) were considered the most useful learning experiences. Compared with the other assessment formats, 52% considered that the OSCE should be used most in the clinical years.

Qualitative data

Students were asked follow-up questions related to positive and negative aspects of the OSCE and suggestions for improvement. The open-ended responses were grouped by thematic content.

Among the positive attributes of the OSCE, students re-affirmed that the assessment was comprehensive (44 comments) and that it was an objective and fair process (43 comments). Some indicated that the opportunity for feedback helped to motivate them and drive the learning process (21 comments).

Students felt that the time allocated to perform the expected tasks was insufficient (36 comments), and that the procedure was stressful (18 comments) and tiring (13 comments). Technical problems (28 comments) included unclear instructions, inadequate provision of time and instructions between stations, and the detention of some candidates at stations by examiners.

Suggestions for improvement included increasing the duration of stations (29 comments), ensuring clear instructions (8 comments), and setting more realistic expectations of students for the required tasks. A few students wished to have more training with the OSCE and suggested that the examination be videotaped to increase objectivity and permit review.

Discussion

Students overwhelmingly perceived that the OSCE in Child Health had good construct validity. This was demonstrated by the favourable responses concerning the transparency and fairness of the examination process, and the authenticity of the required tasks at each station. Excellent levels of acceptance of the OSCE by students have been described previously in the literature [11–14]. However, they expressed concerns and uncertainty about whether the process would minimise their chances of failing and whether the results were a true reflection of their clinical skills. This was understandable, since it was their first encounter with this type of assessment.

Several felt that the examination was stressful and intimidating, yet paradoxically some students perceived it as an enjoyable, practical experience. Studies surveying student attitudes during the OSCE have documented that the OSCE can be a strong anxiety-producing experience, and that the level of anxiety changes little as students progress through the examination [15].

It is well recognised that assessment is a catalyst for both curriculum change and student learning. The students recognised the value of the instrument for formative evaluation. In addition, as many medical schools have adopted a student-centred approach to medical education, greater student participation in quality assurance exercises must be encouraged. Students perceived the OSCE to be fairer than any other assessment format to which they were exposed. The findings were somewhat similar to the views of students at Newcastle medical school [16]. Although student views on fairness may not be consistent with published literature, the impact and influence on acceptability of the instrument should be noted.

Students offered constructive criticism of the structure and organisation of the process. At some stations they felt that the instructions were ambiguous and that the time allocated was inadequate for the expected tasks. This feedback was invaluable and facilitated a critical review and modification of station content and of the conduct of the examination over time. Faculty perceived that the concerns about time allocation per station, and the degree of stress expressed by the students, were due to inadequate preparation for the examination, particularly in competences not previously assessed in the 'traditional' examination.

The high student response rate has helped to ensure that the findings presented are a valid representation of student opinion. Students have traditionally viewed the end-of-clerkship assessment as a 'high-stakes' examination and also perceive it as predictive of their performance in the final MB examination. Student perception of the OSCE, however, may have been influenced by anxiety and a lack of confidence associated with a new assessment. The responses may also have been affected by the timing of the inquiry (immediately after the examination); hence student stress and fatigue should be taken into consideration. Although the high response rate ensured that the views were reasonably representative of the students, differences between assessors could have influenced the interpretation of the open-ended responses.

Implementing the OSCE in Child Health at the University of the West Indies, Jamaica, has been challenging; however, student participation in the evaluation and their overall acceptance of the instrument have been encouraging. Feedback from students and faculty has been useful in effecting improvements to the process, and greater emphasis has been placed on the teaching and evaluation of history-taking, communication and technical competencies. It also sends a clear message to students that the achievement of overall competence is imperative for clinical practice in the current environment. Ultimately, these elements provide the feedback loop necessary to drive the continuum of curriculum development. This has been timely, considering that the Faculty of Medical Sciences, Jamaica, is undergoing significant reform [17]. Further developments involving psychometric evaluation will strengthen the process.

Conclusions

In summary, the findings highlight the need for student participation in the development of new assessment tools in medical curricula. Student acceptance will be more favourable for assessment formats that they perceive to be transparent, authentic and valid. 'Traditional' medical curricula must be responsive to global paradigm shifts in undergraduate medical education.

References

  1. Harden RM: How to assess clinical competence – an overview. Med Teach. 1979, 1: 289-296.

  2. Fowell SL, Bligh JG: Recent developments in assessing medical students. Postgrad Med J. 1998, 74: 18-24.

  3. Harden RM: What is an OSCE?. Med Teach. 1988, 10: 19-22.

  4. Harden RM, Stevenson M, Downie WW, Wilson GM: Assessment of clinical competence using objective structured examination. Br Med J. 1975, 1: 447-451.

  5. Waterson T, Cater JI, Mitchell RG: An objective undergraduate clinical examination in child health. Arch Dis Child. 1980, 55: 917-922.

  6. Carraccio C, Englander R: The objective structured clinical examination, a step in the direction of competency-based evaluation. Arch Pediatr Adolesc Med. 2000, 154: 736-741.

  7. Harden RM, Caincross RG: The assessment of practical skills: the Objective Structured Practical Examination (OSPE). Stud High Educ. 1980, 5: 187-196.

  8. Sherlock P, Nettleford R: The University of the West Indies: a Caribbean response to the challenge of change. 1990, Hong Kong: Macmillan Caribbean

  9. De Lisle J: 2001 Phase 2, OSCE student evaluation form. 2001, Mount Hope, Trinidad: The Centre for Medical Science Education, Faculty of Medical Sciences

  10. StataCorp: Stata Statistical Software: Release 7.0. 2001, College Station, TX: StataCorp LP

  11. Newble DI: Eight years experience with a structured clinical examination. Med Educ. 1988, 22: 200-204.

  12. Duerson MC, Romrell LJ, Stevens CB: Impacting faculty teaching and student performance: nine years' experience with the objective structured clinical examination. Teach Learn Med. 2000, 12: 176-182. 10.1207/S15328015TLM1204_3.

  13. Kowlowitz V, Hoole AJ, Sloane PD: Implementation of the Objective Structured Clinical Examination in a traditional medical school. Acad Med. 1991, 66: 345-347.

  14. Woodburn J, Sutcliffe N: The reliability, validity and evaluation of the objective structured clinical examination in podiatry. Assessment Evaluation Higher Educ. 1996, 21: 131-147.

  15. Allen R, Heard J, Savidge M, Bittengle J, Cantrell M, Huffmaster T: Surveying students' attitudes during the OSCE. Adv Health Sci Educ. 1998, 3: 197-206. 10.1023/A:1009796201104.

  16. Duffield KE, Spencer JA: A survey of medical students' views about the purposes and fairness of assessment. Med Educ. 2002, 36: 879-886. 10.1046/j.1365-2923.2002.01291.x.

  17. Faculty of Medical Sciences, University of the West Indies, Mona. Report – MB, BS Undergraduate Programme, First Annual Report on Curriculum Development. 2003, FMS, UWI, Mona, Jamaica

Acknowledgements

We wish to thank Dr. Jerome De Lisle of the Centre for Medical Science Education (CMSE), EWMSC, Trinidad, for professional advice and permission to use the questionnaire in this study. We also express our gratitude to the participating students and lecturers in Child Health who contributed to the implementation of the OSCE in the department.

Author information

Corresponding author

Correspondence to Russell B Pierre.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

RP conceptualised the study; developed the proposal, coordinated the conduct of the project, completed initial data entry and analysis, and wrote the report. AW participated in the design of the study, coordinated the conduct of the project, performed the statistical analysis, and assisted in writing the report. MB was the main organizer of the clerkship OSCE, and assisted in editing the final report. MBr and CC participated in overall supervision of project and revision of report. All authors read and approved the final manuscript.

Russell B Pierre and Andrea Wierenga contributed equally to this work.

About this article

Cite this article

Pierre, R.B., Wierenga, A., Barton, M. et al. Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica. BMC Med Educ 4, 22 (2004). https://doi.org/10.1186/1472-6920-4-22
