
Does access to a portable ophthalmoscope improve skill acquisition in direct ophthalmoscopy? A method comparison study in undergraduate medical education



Direct ophthalmoscopy (DO) is an essential skill for medical graduates, but there are multiple barriers to learning it. Medical students and junior doctors typically lack confidence in DO. Most students do not own an ophthalmoscope and learn via ward devices that vary in design and usability. The Arclight ophthalmoscope (AO) is an easy-to-use, low-cost and portable device that could help address device access. This study aimed to assess the impact of personal ownership of an AO on DO skill acquisition and competency amongst medical students in the clinical environment.


Method comparison study with 42 medical students randomised to either traditional device ophthalmoscope (TDO) control or AO intervention group during an 18-week medical placement. Three objective assessments of DO competency were performed at the beginning and end of the placement: vertical cup to disc ratio (VCDR) measurement, fundus photo multiple-choice questions (F-MCQ) and model slide examination (MSE). DO examinations performed during the placement were recorded via an electronic logbook.


Students in both groups recorded a median of six examinations each during the 18-week placement. There was no statistically significant difference between the groups in any of the objective assessment measures (VCDR p = 0.561, MCQ p = 0.872, Model p = 0.772). Both groups demonstrated a minor improvement in VCDR measurement but a negative performance change in F-MCQ and MSE assessments.


Students do not practise ophthalmoscopy often, even with constant access to their own portable device. The lack of significant difference between the groups suggests that device access alone is not the major factor affecting frequency of DO performance and consequent skill acquisition. Improving student engagement with ophthalmoscopy will require a more wide-ranging approach.



Background

Direct ophthalmoscopy (DO) is an essential skill for medical graduates as outlined by the General Medical Council (GMC) and supported by the Royal College of Ophthalmologists. [1, 2] Specific ophthalmic problems are estimated to make up approximately 1.46–6% of UK Emergency Department attendances and 1.5% of GP consultations. [3, 4] Timely and accurate DO can be life-saving in some patients, for example in recognising papilloedema. [5] DO is also required in the management of chronic multi-system diseases such as diabetes mellitus and hypertension.

Despite the importance of and frequent need to perform DO, there are multiple barriers to learning this skill at an undergraduate level. [6, 7] Ophthalmology is not a compulsory clinical attachment for all UK medical schools and consequently some students graduate without any ophthalmoscopy exposure. [8] Limited dedicated ophthalmic curricular time is a common finding globally, affecting medical schools in both high and low resource countries. [9, 10] Perhaps unsurprisingly, cross-sectional studies highlight that medical students’ self-reported confidence in DO can be low. [10] These findings persist after graduation, with UK studies of Foundation Year and ED doctors highlighting that the majority lack confidence using an ophthalmoscope correctly and in identifying pathology. [11, 12]

Another barrier to students learning DO is limited assessment. Objective assessment of DO is difficult due to the inherent challenge that examiners cannot easily determine how well students can view a subject’s fundus. [13] Assessment drives learning behaviour, and time-pressured medical students will inevitably prioritise knowledge and skills that they will be assessed on. A 2011 survey of UK medical schools highlighted that only 38% undertook formal assessment of students’ ophthalmoscopy skills. [8] Assessments and simulation models used may lack both objectivity and validity. [14]

Device access may be a major barrier to improving the frequency of DO performance and associated skill acquisition. Most UK medical students do not own a direct ophthalmoscope or have easy access to a functioning device on hospital placements. Ownership of ophthalmoscopes amongst students fell dramatically following the removal of equipment grants in 1986. [15] Subsequent students have therefore entered a learning environment where the norm is not to have their own device. The cost of a traditional direct ophthalmoscope (TDO) such as a Keeler standard model is around £220, considered prohibitively expensive for most undergraduates. [16] Availability of ophthalmoscopes in hospital attachments is recognised to be limited. This is multi-factorial: NHS procurement can lack consistency in which models are purchased, and ward staff may not provide ongoing maintenance, leading to non-functioning devices with burst bulbs or flat batteries. These issues present further challenges to skill mastery. [11]

The Arclight (AO) is a device that offers promise in overcoming these barriers. It is a highly portable (11 cm long, weighing 18 g), solar-powered, LED-illuminated ophthalmoscope (Fig. 1). In the UK it costs approximately £50, a significant reduction compared to TDOs. [17] Despite its low cost, previous studies have shown it to be as good as TDOs, with the majority of users finding it easier to use. [17,18,19]

Fig. 1

Arclight Ophthalmoscope

Consequently, the aim of this study is to assess the impact of personal ownership of a portable ophthalmoscope (AO) on DO skill acquisition and competency amongst medical students in the clinical environment compared to a control group with typical access to TDOs.



Methods

We used a mixed methods design, primarily in the form of a method comparison study supported by a qualitative survey. Ethical approval was granted by the University of Birmingham ethics board in October 2016 (Reference: ERN_16–1021).

Setting and participants

The study was performed amongst fourth year MBChB medical students at the University of Birmingham during the period November 2016 to April 2017. Participants were all undertaking their 18-week Specialty Medicine (SPM) hospital placement. SPM is a mandatory clinical attachment which involves rotation through different specialities including one to two weeks of ophthalmology. Students are randomly allocated between eight different hospitals across the West Midlands.

Recruitment and randomisation

All 178 fourth-year medical students undertaking the SPM placement at the time of study recruitment were invited to participate via email. Students were offered a free AO for taking part in the study. The only additional eligibility criterion was a refractive error between −6D and +4D. This matched the corrective capacity of the AO and is in keeping with previous studies. [17]

A total of 42 students (24% response rate) were successfully recruited and individually randomised by the primary investigator (PI) using computerised random numbers to either the control or intervention arms. Three objective DO competency assessments were planned before the students started their 18-week SPM placement and were to be repeated at the end. The students in the intervention arm were given an AO to use throughout the study period and keep afterwards. Students in the control arm received their AO at the end of the study. All participants also then received individualised feedback in the form of their raw assessment scores. These were not graded or linked with any assessments within the MBChB programme.
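A simple 1:1 computerised allocation of this kind can be sketched as follows. This is a minimal illustration only, not the investigators' actual procedure; the participant labels and seed are hypothetical:

```python
import random

def randomise(participants, seed=2016):
    """Allocate participants 1:1 to control (TDO) or intervention (AO) arms.

    A minimal sketch of computerised randomisation; the study's actual
    procedure and any stratification details are not described here.
    """
    rng = random.Random(seed)      # fixed seed only so the example is reproducible
    shuffled = participants[:]     # copy so the input list is left untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"control": shuffled[:half], "intervention": shuffled[half:]}

# 42 hypothetical recruits, labelled S01..S42
groups = randomise([f"S{i:02d}" for i in range(1, 43)])
```

Because the split is taken after a full shuffle, every participant lands in exactly one arm and the two arms are of equal size.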


Students randomised to the control group used TDOs during both the pre and post clinical attachment assessments. During their clinical attachment they only used the TDOs typically available in the hospitals of their SPM placements.


Students randomised to the intervention group all used their own personal AO during both assessments and their SPM placements. Students could replace lost or broken AOs by contacting the PI.


Three primary assessments of DO competency were performed on all participants at the beginning and end of the study: judgement of vertical cup to disc ratio (VCDR), fundus multiple choice questions (F-MCQs) and model slide regional examination (MSE). The VCDR, F-MCQ and ease of use (EOU) assessments all necessitated performing ophthalmoscopy on other study participants, while the MSE consisted of examining pre-generated fundal images on 35 mm slides in eye models. Students were all emailed information about the DO devices they would be using and the different assessments two weeks before the baseline assessment. No information was given on how to perform DO and no teaching was delivered on the day of assessments. Students were given ten minutes to familiarise themselves with their allocated device prior to the baseline assessments.

Students also self-assessed their examination competence for each ophthalmoscopy examination carried out on another study participant. This was via an ‘Ease of Use’ (EOU) scale used in a previous study, which ranged from 1 (‘Couldn’t use this ophthalmoscope’) to 8 (‘Determined a cup: disc ratio with a low level of difficulty’). [20] This scale is included in Appendix 1.

Model slide regional examination (MSE)

This assessment used fundus photo slides annotated with letters of various font sizes printed in different positions on the retina and placed within a mannequin (Eye Retinopathy Trainer®, Adam, Rouilly Co., Sittingbourne, UK). Each participant examined six model eyes each with six letters in the same pre-defined retinal locations but with reducing font sizes. Scores were calculated as a percentage total of the correct answers.

Fundus photography

After recruitment, all participating students had fundus photographs taken of both their eyes by the PI using a Topcon® retinal fundus camera. These photographs were cropped to illustrate the optic nerve in the centre of an image with a one disc diameter surrounding area of retina and used to generate the F-MCQs.

Fundus multiple choice questions (F-MCQs)

F-MCQ assessment sessions required every student to perform ophthalmoscopy on every other student. The examining student was required to identify the optic nerve of the student being examined. Specifically, each student had two F-MCQs (one for each eye) each with four images: their previously acquired optic nerve head image and three non-matching distractors from other participating students. See Fig. 2 for an example. One mark was awarded for a correct match and zero for an incorrect match.

Fig. 2

F-MCQ example

Vertical cup to disc ratio (VCDR)

Participants were requested to assess and record the VCDR of each eye examined. Three ophthalmic specialists (AB, RB and PIM) provided VCDR assessments for all optic nerve head images from the participants. The mean of these assessments was used to form the ‘gold standard’ from which participant results were compared. Students scored a mean magnitude error based on the comparison of each of their assessments to the gold standard.
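As a concrete illustration of this scoring, the sketch below computes a gold-standard VCDR as the mean of the specialists' assessments and a student's mean magnitude error against it. All values here are hypothetical; the actual study data appear in Table 2:

```python
def gold_standard(specialist_vcdrs):
    """Gold-standard VCDR for one optic nerve image: mean of the specialists' assessments."""
    return sum(specialist_vcdrs) / len(specialist_vcdrs)

def mean_magnitude_error(student_estimates, gold_standards):
    """Mean absolute difference between a student's VCDR estimates and the gold standards."""
    errors = [abs(s - g) for s, g in zip(student_estimates, gold_standards)]
    return sum(errors) / len(errors)

# Hypothetical example: two examined eyes, three specialist ratings each
golds = [gold_standard(v) for v in ([0.3, 0.4, 0.5], [0.6, 0.6, 0.6])]  # ≈ [0.4, 0.6]
error = mean_magnitude_error([0.5, 0.5], golds)  # mean of |0.5−0.4| and |0.5−0.6| ≈ 0.1
```

Averaging the three specialists' assessments dampens the known inter-observer variation in VCDR judgement before student estimates are compared against it.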

Electronic logbook

Students kept an electronic logbook (e-logbook) of all DO examinations they performed during their 18-week placement including EOU scores. Students coded this data during placement using a simple online application accessible via smart phones. Participants were contacted by email at six points during the study period and reminded to code examinations.

Statistical analysis

Quantitative data was analysed according to a per protocol principle using the software SPSS Statistics (Version 24, IBM®). Comparison of baseline characteristics, including gender, refractive error and hospital placement was undertaken using Chi-squared test and Fisher’s exact test. Median/mean differences in DO competency were compared using the Wilcoxon Signed Rank Test for non-parametric and the Paired Samples t-test for parametric data. Intra class coefficients (ICCs) were used to measure the agreement of assessments in performance ranking participants. Correlations between performance and other independent factors were analysed using Spearman’s Rank Test.
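By way of illustration, Spearman's rank correlation used for the factor analyses can be computed from first principles as below. This is a from-scratch sketch rather than the SPSS routine the study used, and the data are hypothetical, not the study's:

```python
def _ranks(values):
    """1-based average ranks, with tied values sharing the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                      # extend the run of tied values
        avg = (i + j) / 2 + 1           # mean of the tied 1-based positions
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: the Pearson correlation of the two rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: examinations logged vs change in an assessment score
rho = spearman_rho([2, 5, 6, 9, 14], [-20, -10, -12, 0, 5])  # ≈ 0.9
```

Because only ranks enter the calculation, the statistic is robust to the skewed examination counts seen in the logbook data (a few students performing many examinations).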


Results

A total of 38 students (21% of the cohort) completed the study (Fig. 3). Comparison of baseline characteristics including gender, refractive error and hospital placement demonstrated no statistically significant difference between the groups (Appendix 2).

Fig. 3

Study Flow Diagram

The e-logbook demonstrated no difference in the median number of examinations performed by the AO group compared to control (6.0 vs 6.0) (Table 1). The greater mean number of examinations performed by the AO group vs control (9.6 vs 7.0, p = 0.41) was due to a small minority (n = 3) of students in the AO group performing large numbers of examinations.

Table 1 E-logbook number of examinations

There was a minor reduction in the magnitude of VCDR judgement error in both groups: intervention −0.12 (CI −0.18 to −0.05) vs control −0.08 (CI −0.15 to −0.02) (Table 2). Both groups performed worse in the final F-MCQ and MSE assessments compared to baseline: intervention −16.7 (IQR −18.7 to 10.4, p < 0.01) vs control −7.1 (IQR −21.4 to −1.8, p < 0.01) and intervention −12.5 (IQR −25 to 0, p < 0.01) vs control −12.5 (IQR −25 to −12.5, p < 0.01) respectively. There was no statistically significant difference between these assessed competency changes (VCDR p = 0.561, MCQ p = 0.872, Model p = 0.772). The AO group demonstrated a statistically significant increase in EOU scores of 0.24 (CI 0.08 to 0.39) vs control 0.04 (CI −0.14 to 0.24). Notably, the AO group also performed better in the F-MCQ assessments at baseline (58.3% vs control 42.9%, p = 0.013) and at the final assessment (45.8% vs control 35.7%, p = 0.043). There was no difference in scores between groups across the other assessment modalities. ICCs demonstrated no significant performance rank correlation between the assessments (VCDR/MSE 0.124, VCDR/F-MCQ −0.111, MSE/F-MCQ 0.096).

Table 2 Comparison of baseline and final outcome assessments


Discussion

The key finding from our study was the low number of DO examinations performed by both groups: a median of six during the 18-week clinical attachment, which included 1 to 2 weeks of ophthalmology. The low number of examinations is particularly striking given that participants self-selected for study involvement, knew they were being observed and the intervention group were given free portable ophthalmoscopes. Students may have simply failed to record examinations, although this seems unlikely given the potentially positive effect of observer bias and easy access to the smartphone-based e-logbook.

A limitation of studies in this field of research is a lack of a validated objective measure of ophthalmoscopy skills at an undergraduate level. We chose a range of assessments to provide an overview of student performance in an attempt to overcome this. VCDR and EOU scoring [17, 20], F-MCQ [13, 21] and MSE [22, 23] have all been used in similar studies before but not directly compared or formally validated for assessing competence.

Similar competency results were observed between intervention and control groups across all three assessments. Both groups demonstrated a minor improvement in VCDR judgement but a reduction in F-MCQ and MSE performance. Students generally found VCDR assessment challenging, which is not surprising given the significant assessment variation even amongst ophthalmic specialists. [24] Given the lack of correlation with the number of examinations, the minor improvement seen in VCDR judgement was likely due to general ophthalmology placement experiences or personal study rather than DO practice (Appendix 3). Our results suggest F-MCQs may show promise going forwards, as they were the only assessment modality to positively correlate with the number of examinations performed (Appendix 3).

Anecdotally, students reported finding the second set of MSE slides harder to visualise. This was confirmed by the PI and was likely due to variation in the print quality or letter type. The reduction in performance scores in the final MSE assessments may have been in part due to this. This was not the case for F-MCQs as the same questions were used at both baseline and final assessment. MSE has inherently limited construct validity and in our study appeared to be affected by variance in difficulty. There was also a significant correlation with refractive error: students with greater refractive error performed worse in MSE assessments than their peers, which suggests refractive error is a source of performance bias for MSE.

VCDR, F-MCQ and MSE appeared to be testing different aspects of DO competency. This is supported by a lack of significant intra class coefficient (ICC) between any of the assessments. Further research is required to develop a fit for purpose objective measure for DO competency at the undergraduate level.

The AO may provide some performance advantage over traditional models. Despite the lack of impact of the AO on the number of examinations and DO skill acquisition, our study confirmed non-inferior performance of the AO versus TDO in two of the three objectively assessed modalities and higher F-MCQ scores at both the baseline (58.3% vs 42.9%) and final assessments (45.8% vs 35.7%). Furthermore, there was a statistically significant increase in self-assessed EOU score for students using the AO.

Further research should aim to explore students’ attitudes towards and experience of practising ophthalmoscopy to help identify what barriers to DO skill acquisition are present at an undergraduate level and how to address these. One factor not addressed by this study is clinical supervision and availability of experienced supervisors. Junior doctors often provide frontline clinical teaching but if they lack confidence in their own ophthalmoscopy skills this may lead to a reluctance to support and guide students. [11]

Strengths and limitations

The strengths of this study were randomising students into a control group and intervention group, use of novel technology and collection of longitudinal data on clinical attachment combined with assessment data. We acknowledge the following limitations:

  • This research was carried out at one institution only and so reflects the particular curriculum and clinical experience available there.

  • The study may be underpowered due to a relatively small analysed sample size (n = 38). Without any similar previous or pilot studies it was not possible to perform a reliable power calculation.

  • Due to the nature of the intervention, it was not possible to mask either the educators or students to which device was being used by each group.

  • Students’ ophthalmology week took place during any one of the 18 weeks of SPM attachment and we did not record when this took place for individual students. To what degree the timing of this week affected results is unknown. For example, students who had their ophthalmology week first may have been more confident performing ophthalmoscopy in the rest of the block and vice versa.

  • The assessment measures lacked validation, particularly the F-MCQs. For each F-MCQ distractor images were picked to provide contrast for example different vasculature or VCDR but this limited standardisation and questions may have varied in difficulty.

  • E-logbook data was self-reported. Students may have under-reported or entered false examinations.


Conclusions

In our study, personal ownership of a portable ophthalmoscope offered limited advantage over traditional models. Students did not practise DO frequently, even with access to their own portable device. This was reflected in a lack of any meaningful improvement in DO skill over the study period. The AO represents a suitable alternative to more expensive traditional devices, but our results suggest changing student engagement with ophthalmoscopy will require a more wide-ranging approach than improving device access alone.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.



Abbreviations

AO: Arclight ophthalmoscope
DO: Direct ophthalmoscopy
EOU: Ease of use score
F-MCQ: Fundus multiple choice question
ICC: Intra class coefficient
MSE: Model slide examination
SPM: Specialty medicine
TDO: Traditional direct ophthalmoscope
VCDR: Vertical cup to disc ratio


References

  1. Outcomes for graduates 2018: General Medical Council; 2018 [Available from: accessed 29/10/2018].

  2. 'Eye and Vision Curriculum' for Undergraduate and Foundation Doctors: Royal College of Ophthalmologists; 2014 [Available from: accessed 29/10/2018].

  3. Ophthalmic Service Guidance: Emergency eye care in hospital eye units and secondary care: Royal College of Ophthalmologists; 2017 [Available from: accessed 29/10/2018].

  4. Sheldrick JWA, Vernon S, Sheldrick C. Management of ophthalmic disease in general practice. Br J Gen Pract. 1993;43(376):459–62.

  5. Dyer C. Court overturns optometrist's conviction for gross negligence manslaughter. BMJ. 2017;358:j3749.

  6. Yusuf IH, Salmon JF, Patel CK. Direct ophthalmoscopy should be taught to undergraduate medical students-yes. Eye (London, England). 2015;29(8):987–9.

  7. Yusuf IYE, Knight K, Leaver L. Direct ophthalmoscopy: teaching in primary care. Clinical Teacher. 2016;13:235–7.

  8. Baylis O, Murray PI, Dayan M. Undergraduate ophthalmology education - a survey of UK medical schools. Medical Teacher. 2011;33(6):468–71.

  9. Megbelayin E, Asana E, Nkanga G, Duke R, Ibanga A, Etim A, et al. Evaluation of competence of medical students in performing direct ophthalmoscopy. Niger J Ophthalmol. 2014;22(2).

  10. Gupta RR, Lam WC. Medical students' self-confidence in performing direct ophthalmoscopy in clinical training. Can J Ophthalmol. 2006;41(2):169–74.

  11. Cahill V, Willetts E, Nicholl D. 036 The TOS study: can we improve the ophthalmological assessment of medical patients by foundation year doctors? J Neurol Neurosurg Psychiatry. 2012;83(3):e1.199–e1.

  12. Murray PI, Benjamin M, Oyede O. Can general A&E doctors manage common eye emergencies? Eye (London, England). 2016;30(10):1399–400.

  13. Afshar A, Oh F, Birnbaum A, Namavari A, Riddle J, Djalilian A. Assessing ophthalmoscopy skills. Ophthalmology. 2010;117(9):1863.

  14. Ricci H, Ferraz CA. Ophthalmoscopy simulation: advances in training and practice for medical students and young ophthalmologists. Adv Med Educ Pract. 2017;8:435–9.

  15. McNaught A, Pearson R. Ownership of direct ophthalmoscopes by medical students. Med Educ. 1992;26:48–50.

  16. Health Care Equipment and Supplies Co Ltd 2018 [Accessed online 11 September 2018]. Available from:

  17. Lowe J, Cleland CR, Mgaya E, Furahini G, Gilbert CE, Burton MJ, et al. The Arclight ophthalmoscope: a reliable low-cost alternative to the standard direct ophthalmoscope. J Ophthalmol. 2015;2015:743263.

  18. Blaikie A, Sandford-Smith J, Tuteja SY, Williams CD, O'Callaghan C. Arclight: a pocket ophthalmoscope for the 21st century. BMJ. 2016;355:i6637.

  19. Blundell R, Roberts D, Fioratou E, Abraham C, Msosa J, Chirambo T, et al. Comparative evaluation of a novel solar powered low-cost ophthalmoscope (Arclight) by eye healthcare workers in Malawi. BMJ Innov. 2018;4(2):98–102.

  20. Mandal N, Harborne P, Bradley S, Salmon N, Holder R, Denniston AK, et al. Comparison of two ophthalmoscopes for direct ophthalmoscopy. Clin Exp Ophthalmol. 2011;39(1):30–6.

  21. Kwok J, Liao W, Baxter S. Evaluation of an online peer fundus photograph matching program in teaching direct ophthalmoscopy to medical students. Can J Ophthalmol. 2017;52(5):441–6.

  22. Akaishi Y, Otaki J, Takahashi O, Breugelmans R, Kojima K, Seki M, et al. Validity of direct ophthalmoscopy skill evaluation with ocular fundus examination simulators. Can J Ophthalmol. 2014;49(4):377–81.

  23. Bradley P. A simple eye model to objectively assess ophthalmoscopic skills of medical students. Med Educ. 1999;33:592–5.

  24. Watkins R, Panchal L, Uddin J, Gunvant P. Vertical cup-to-disc ratio: agreement between direct ophthalmoscopic estimation, fundus biomicroscopic estimation and scanning laser ophthalmoscopic measurement. Optom Vis Sci. 2003;80(6):454–9.



Acknowledgements

We would like to thank Sir Robert Devereux for his work coding the examination e-logbook, an application which may also facilitate data collection in future research. We would also like to thank all the students involved in the study as well as staff at the Queen Elizabeth Hospital Birmingham Ophthalmology Department for allowing us to use their retinal photography facilities.


Funding

AB is seconded to the University of St Andrews from NHS Fife. The University owns a social enterprise subsidiary company, for which AB acts as an unpaid adviser. The social enterprise business sells the Arclight to users in high resource countries with all profits being used to fund distribution and education exercises in low-income countries.

This company provided Arclight devices for use in the study at production cost value. The University of Birmingham then funded acquisition of the devices and provided all other practical resources necessary for the study.

Author information




JAGW conceived of the original study design and took the role of primary investigator in the acquisition, analysis and interpretation of data. AP made a substantial contribution to the acquisition of data. AB, AKD, RB, JC and PIM all made substantial contributions to the study design and interpretation of data. JAGW and AP drafted the initial manuscript and then all authors were involved in revising it critically for important intellectual content.

Corresponding author

Correspondence to J. A. Gilmour-White.

Ethics declarations

Ethics approval and consent to participate

Written consent was obtained from all participants in the study. Ethical approval was granted by the University of Birmingham ethics board in October 2016 (Reference: ERN_16–1021).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Appendix 1

Table 3 EOU Score

Appendix 2

Table 4 Baseline Characteristics

Appendix 3

Table 5 Independent Factor Correlations

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Gilmour-White, J.A., Picton, A., Blaikie, A. et al. Does access to a portable ophthalmoscope improve skill acquisition in direct ophthalmoscopy? A method comparison study in undergraduate medical education. BMC Med Educ 19, 201 (2019).



Keywords

  • Undergraduate medical education
  • Ophthalmology
  • Direct ophthalmoscopy