
Assessment of clinical competencies using clinical images and videos “CIVA”

Abstract

Background

This paper describes an approach to the assessment of clinical competencies which widens the range of problems and tasks evaluated by using videos and images.

Method

Clinical Image and Video Assessment (CIVA) was used to assess the clinical reasoning and decision making of final year medical students. Forty to fifty clinical videos and images, supported by rich text vignettes and reviewed by subject matter experts, were selected based on examination blueprints. CIVA scores were correlated with OSCE, Direct Observation Clinical Encounter Examination (DOCEE) and written exam scores using two-sided Pearson correlation analysis, and their reliability was analyzed using Cronbach’s Alpha coefficient. Furthermore, students evaluated the CIVA using a 5-point Likert scale.

Results

CIVA and OSCE scores showed a high correlation (r = 0.83), in contrast with the correlation of CIVA with the written examination (r = 0.36) and the DOCEE (r = 0.35). Cronbach’s Alpha for the OSCE and CIVA was 0.71 and 0.78 respectively for the first batch, and 0.91 and 0.91 for the second batch. Eighty-two percent of students were very satisfied or satisfied with the CIVA process, contents and quality.

Conclusions

A well-constructed CIVA-type assessment, with rich authentic vignettes and good quality videos and images, can be used to assess the clinical reasoning and decision making of final year medical students. CIVA is an assessment tool which correlates well with the OSCE, complements the written exam and the DOCEE, and is easier to conduct at a possibly reduced cost.


Background

Clinical competency is an outcome which entails several skills directly related to the holistic approach to patient care. Clinical reasoning, as a construct, guides hypothesis-driven history taking, physical examination, diagnosis and management. Assessing clinical reasoning as a reflection of medical students’ ability in patient care is key to medical education. Researchers have shown that physicians use both analytical and non-analytical “pattern recognition” approaches, alternating between the two in varying degrees [1]. As a result, developing, teaching and assessing clinical reasoning has always been a challenge in medical education [2–4].

Multiple tools and instruments have been described and used in the assessment of clinical competencies; Van der Vleuten [5] used Miller’s Pyramid to place them hierarchically according to their assessment outcomes: Knows, Knows How, Shows How and Does. Moving upwards towards the “Does” level increases the instrument’s authenticity, bringing the assessment closer to reality and the workplace. The balance between instrument validity, authenticity, reliability, practicality, educational impact and cost is what all assessment systems aim to achieve [6], in order to have an appropriate impact on student learning [7].

The ideal way to judge students’ or physicians’ clinical competencies and clinical reasoning would be through direct observation of a large number of real patient encounters in the normal workplace environment. This is usually difficult to achieve because of the lack of suitable patients and patient safety regulations. All commonly described clinical assessment instruments – ‘A’-type Multiple Choice Questions (MCQs), Extended Matching Questions (EMQs), Key Feature Examinations, Objective Structured Clinical Examinations (OSCEs), the Mini-CEX [8] and the Direct Observation Clinical Encounter Examination (DOCEE) [9] – have their limitations. Most assessment systems use these instruments in combination in order to achieve the criteria set by Van der Vleuten, as mentioned earlier. The College of Medicine at the University of Sharjah follows an integrated, outcome-based curriculum delivered through problem-based learning (PBL). Clinical skills are introduced in Year One and constitute one of several crucial vertical themes within the curriculum.

In assessing students’ clinical competencies, test blueprints, written exams, OSCEs and DOCEEs are used during the clerkship phase and exit examinations to assess a medical student’s knowledge, skills and attitudes. The OSCEs, particularly in the clerkship phase, comprise around 15 interactive seven-minute stations using simulated patients, whereas in the DOCEE the medical student examines, on average, four real patients. The DOCEE is designed to observe and assess the student’s full encounter with real patients. Each DOCEE station lasts 30 minutes, during which the examiner goes through a standardized checklist to assess the student’s performance. The student’s depth of clinical knowledge is tested using 100 A-type Multiple Choice Questions (MCQs) and 100 Extended Matching Questions (EMQs). All final exit exams are conducted over a period of one week.

In order to increase the validity and reliability of the assessment of clinical competencies and to ensure wider sampling of the broad world of clinical contexts, we introduced an additional instrument known as “Clinical Image and Video Assessment” (CIVA). Its rationale is based on assessing students’ pattern recognition competencies in generating diagnostic hypotheses and decision making in patient management.

CIVA is a computer-projected, class-administered test comprising 40–50 scenario-rich electronic stations (slides) that demonstrate clinical images or videos, each followed by a number of short questions on clinical reasoning and decision making regarding the patient described in the scenario. By increasing the number of problems and practice contexts, the number of clinical tasks tested increases.

The aim of this study is to evaluate the validity, reliability and students’ perception of CIVA as an additional examination tool for the evaluation of clinical reasoning and decision making.

Context

The College of Medicine, established in 2004, adopted an outcome-based curriculum, with its main approach to teaching being problem-based learning (PBL). The pre-clerkship phase starts with a foundation year, followed by Years 1 and 2; clinical work, accompanied by community-based training, begins in Year 3 and continues through Years 4 and 5. In addition to the acquisition of knowledge, the curriculum focuses on introducing clinical skills training as early as Year One.

Starting from Year 2, CIVAs are included alongside OSCEs to assess clinical reasoning, including pattern recognition of clinical signs and decision making, as well as medical writing skills such as prescriptions and referral letters. To ensure content validity, an assessment blueprint is designed beforehand based on outcome objectives, problems and clinical tasks (Table 1) [10].

Table 1 Final MBBS exam

The Clinical Skills Team (CST) comprises six clinical tutors, a lecturer, a senior faculty member (NS) and clinicians from the University of Sharjah Hospital. The team is responsible for the assessment of clinical skills. The team director is a member of the assessment committee, which is responsible for formulating each exam’s master blueprint and monitoring its quality. The CST is responsible for developing and implementing OSCEs and CIVAs in accordance with the master test blueprint, which includes all domains assessed by the different assessment tools: written A-type questions, Extended Matching Questions, OSCE, CIVA and DOCEE. Each OSCE and CIVA is reviewed by the assessment committee for validity and mapping against the master exam blueprint. The CST has developed a large database of clinical videos and images, which are revised frequently for face and content validity. Videos and images are selected to assess outcomes of clinical reasoning and decision making (diagnosis, management, follow-up), and are supplemented with a rich but short clinical vignette which is revised for authenticity by subject matter experts (consultant clinicians). In the final exit examination at the end of the clerkship phase, the CIVA comprises 50 stations (Table 2). For each image, the questions as well as the model answers are reviewed meticulously to prevent overlap. The CIVA and OSCE are offered on the same examination day; students are divided into two groups which alternate in taking the OSCE and CIVA. A few days prior to the test, the CIVA is pilot tested to ensure good quality of images and sound and to estimate the time required. The CIVA is offered to a group of 25–30 students at a time. Students write their answers in structured answer books, which are then marked by members of the CST according to the model answers and cross-checked by second assessors. The weight of each station differs according to the number of questions related to that station, as illustrated in the sketch below. The time required to prepare a CIVA station varies with the complexity of the station; once developed and verified, the station is saved in a CIVA bank for future use. The average time for displaying and answering each station is two minutes, and the time needed for marking ranges between 30 and 60 seconds.

Table 2 Examples of CIVA stations, University of Sharjah, College of Medicine / Clinical Skills Program
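To make the station weighting concrete, the following is a minimal sketch of how a CIVA total could be aggregated, assuming each question carries one mark so that a station’s weight equals its number of questions; the `Station` structure and field names are hypothetical illustrations, not taken from the study’s actual marking scheme.

```python
# Minimal sketch of CIVA score aggregation (hypothetical field names).
# Assumption: each question is worth one mark, so a station's weight
# equals the number of questions attached to it, as described above.
from dataclasses import dataclass


@dataclass
class Station:
    """One CIVA station: an image/video slide plus its short questions."""
    station_id: int
    question_scores: list  # marks awarded per question, each in [0, 1]


def civa_percentage(stations):
    """Total marks earned as a percentage of the maximum obtainable;
    stations with more questions automatically carry more weight."""
    earned = sum(sum(s.question_scores) for s in stations)
    maximum = sum(len(s.question_scores) for s in stations)
    return 100.0 * earned / maximum


# A three-question station counts three times as much as a one-question one.
exam = [Station(1, [1.0, 0.5, 1.0]), Station(2, [0.0])]
print(f"{civa_percentage(exam):.1f}%")  # -> 62.5%
```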

Methods

Two final-year student batches (n = 52 and n = 95) sitting for the final exit examination were studied. Students’ perception of the CIVA was obtained through a questionnaire. Quantitative responses were recorded using a 5-point Likert scale (strongly disagree, disagree, undecided, agree and strongly agree). The question items related to the clinical problem, the quality of the images and videos, and the time allowed for each station. The overall response rate was 72%.

Qualitative data were collected from the students’ free-text responses and analyzed to identify common emerging themes.

Statistical analysis

Students’ final scores for the two batches in the CIVA, OSCE, DOCEE and MCQ exams were compiled into one file for analysis. The Pearson correlation coefficient (r) was used to measure the correlation of the CIVA with the OSCE, DOCEE and MCQ. The level of significance was set at 5%. Cronbach’s Alpha coefficient was calculated as a measure of exam reliability. Because question items in the CIVA and OSCE exams differed in number and scoring between the two batches, reliability was measured separately for each batch.
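As an illustration of the statistics described above, the following minimal sketch computes a two-sided Pearson correlation and Cronbach’s Alpha using NumPy and SciPy; the scores are synthetic and the array shapes merely mirror the batch sizes, so this does not reproduce the study’s data or results.

```python
# Illustrative computation of the two statistics used in this study.
# The data below are randomly generated, not the study's scores.
import numpy as np
from scipy import stats


def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_students, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)


rng = np.random.default_rng(0)
civa = rng.normal(70, 10, 147)             # illustrative CIVA totals (N = 147)
osce = 0.8 * civa + rng.normal(0, 5, 147)  # OSCE totals built to correlate

r, p = stats.pearsonr(civa, osce)          # two-sided test by default
print(f"Pearson r = {r:.2f}, p = {p:.3g}")

station_scores = rng.random((52, 50))      # e.g. batch 1: 52 students x 50 stations
print(f"Cronbach's alpha = {cronbach_alpha(station_scores):.2f}")
```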

Results

Table 3 and Figure 1 show the correlations (r) between CIVA grades and the OSCE, DOCEE and written exam grades. The strongest correlation was between CIVA and OSCE (r = 0.83, p < 0.001). CIVA grades correlated less with written exam grades (r = 0.36, p < 0.001) and DOCEE grades (r = 0.35, p < 0.001). Cronbach’s Alpha for the OSCE and CIVA was 0.71 and 0.78 for the first batch of students, and 0.91 and 0.91 respectively for the second batch, indicating good reliability.

Table 3 Correlation coefficients (r) between CIVA and other modes of assessment (N of students = 147)
Figure 1 Scatter plots for correlations showing Pearson’s correlation (r).

Qualitative feedback on the CIVA from the majority of students was positive for both educational and technical quality. The majority of students agreed/strongly agreed that the overall knowledge tested was fair (86%), was relevant and correlated well with the curriculum (89%) and reflected common case scenarios (84%). Technically, the CIVA ran smoothly (93%), sound and video effects were very good (96%) and the time allocated was fair (80%). Suggested areas for improvement included: “time allocated was too long, I could have done it in 30 minutes”, “few stations have a lot of time, others is no time, but it was good overall”, “better quality of pictures”, “too many stations I felt sleepy”, “some x-rays were not clear”, “one slide for the eye was not obvious”.

Discussion

The Clinical Image and Video Assessment (CIVA) method at the University of Sharjah can be considered a valid and reliable addition to the assessment toolbox. It provides an opportunity to test a large sample of clinical skills in a short time with little cost and few resources. The specific diagnostic skills covered by CIVA include pattern recognition of signs, interpretation and decision making, as well as medical writing skills such as referral letters and prescription writing. The topics chosen were based upon the defined curriculum outcome competencies, in line with GMC learning outcomes [11]. The assessment of these outcomes [7] uses the principles of Miller’s Pyramid [12], moving assessments higher up the hierarchical scale towards the realism of the clinical situation [5]. This model remains useful in selecting assessment instruments appropriate to each level of the pyramid; the CIVA, like the OSCE, is more appropriate to the second and third levels of the pyramid.

CIVAs and OSCEs assess similar constructs of clinical reasoning and decision making, but the value of the CIVA lies in the increased size and number of tasks across different simulated contexts and presentations, including emergencies such as seizures, severe asthma, acute cardiac conditions and trauma, which are difficult and more expensive to simulate in OSCEs. However, CIVA is not an alternative to the OSCE, as it is not suited to assessing communication skills, history taking, physical examination or procedural skills. The main advantage of CIVA is that it provides a standardized means of assessing a large number of students in a short time.

The high correlation with the OSCE (r = 0.83, p < 0.001) indicates similarity of the constructs assessed, including clinical reasoning and decision making. It could also be due in part to a priming effect, because both tests were administered on the same day. The lower correlations with written exam grades (r = 0.36, p < 0.001) and DOCEE grades (r = 0.35, p < 0.001) reflect differences in the constructs measured: the written exams are an effective means of assessing knowledge and other domains, and the DOCEE evaluates the holistic approach to patient care.

Although the CIVA is cheaper to administer, it does require significant time to get each station right before it is added to the database, and to mark the answer booklets.

Conclusion

CIVA provides a reliable and valid tool to add to the assessment toolbox in order to enhance the overall assessment of medical students’ competence at various phases of the curriculum. It assesses a large number of clinical signs based on real patients. When well designed and enhanced with a rich text vignette based on a real patient’s scenario, it can replace several OSCE stations and overcome some logistical issues encountered during OSCEs and DOCEEs. It is cheaper and requires far fewer personnel than the OSCE. However, it needs significant review time to improve content validity, and time to mark the answer books.

References

  1. Norman G, Young M, Brooks L: Non-analytical models of clinical reasoning: the role of experience. Med Educ. 2009, 41 (12): 1140-1145.

  2. Farmer EA, Page G: A practical guide to assessing clinical decision making skills using the key feature approach. Med Educ. 2005, 39: 1188-1194. 10.1111/j.1365-2929.2005.02339.x.

  3. Elstein AS: Thinking about diagnostic thinking: a 30-year perspective. Adv Health Sci Educ Theory Pract. 2009, 14: 7-18. 10.1007/s10459-009-9184-0.

  4. Croskerry P: A universal model of diagnostic reasoning. Acad Med. 2009, 84: 1022-1028. 10.1097/ACM.0b013e3181ace703.

  5. Van der Vleuten CPM: Where are we going with assessment and why? Conference on Evaluation in Medical Education. 2002, Copenhagen: The Danish Medical Association, 7-8.

  6. Van der Vleuten CP, Schuwirth LW, Scheele F, Driessen EW, Hodges B: The assessment of professional competence: building blocks for theory development. Best Pract Res Clin Obstet Gynaecol. 2010, 24: 703-719. 10.1016/j.bpobgyn.2010.04.001.

  7. Shumway JM, Harden RM: AMEE Guide No. 25: The assessment of learning outcomes for the competent and reflective physician. Med Teach. 2003, 25 (6): 569-584. 10.1080/0142159032000151907.

  8. Norcini JJ: The Mini Clinical Evaluation Exercise (mini-CEX). Clin Teach. 2005, 2 (1): 25-30.

  9. Hamdy H, Kameshwar P, Williams R, Slih FA: Reliability and validity of the direct observation clinical encounter examination (DOCEE). Med Educ. 2003, 37: 205-212. 10.1046/j.1365-2923.2003.01438.x.

  10. Hamdy H: Blueprinting for the assessment of health care professionals. Clin Teach. 2006, 3 (3): 175-179. 10.1111/j.1743-498X.2006.00101.x.

  11. General Medical Council: Tomorrow’s Doctors: Outcomes and Standards for Undergraduate Medical Education. 2009, London: General Medical Council. ISBN 978-0-901458-36-0.

  12. Miller GE: The assessment of clinical skills/competence/performance. Acad Med. 1990, 65 (9): S63-S67. 10.1097/00001888-199009000-00045.



Acknowledgement

We thank Professor Trevor Gibbs for his review, advice and edits, Amal Hussain for the statistical analysis, Dr Ayad Moslih Al-Moslih for his valuable technical work, and the clinical skills team: Drs Sweety Kumari, Sarah Shorbagi, Limya Mohammed El Tayeb, Maisoon Mairghani, Karem Harb, Ghada Krizam, Hend Sadek, Samiya Ashraf, Nuha Yousif, Osama Hamdoun and Delan Hishyar (Sharjah University) for their contribution to preparing images and videos and to organization.

Author information


Corresponding author

Correspondence to Nabil D Sulaiman.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

NS conceived of the study and participated in its design and coordination as well as interpretation of results, and helped to draft the manuscript. HH participated in the study design and interpretation of results, and helped to draft and revise the manuscript critically. Both authors read and approved the final manuscript.


Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



Cite this article

Sulaiman, N.D., Hamdy, H. Assessment of clinical competencies using clinical images and videos “CIVA”. BMC Med Educ 13, 78 (2013). https://doi.org/10.1186/1472-6920-13-78

