Impact of video feedback system on medical students’ perception of their clinical performance assessment

Abstract

Background

Providing feedback on student performance in the clinical performance assessment (CPA) is valuable because it helps students understand their strengths and weaknesses. This study compared students’ perceptions of their CPA scores before and after personalized video feedback was provided.

Methods

Two identical online surveys of first-year medical students (N = 103) who had undergone the CPA were conducted to evaluate students’ perceptions of their CPA scores before and after video feedback. Students were given their test scores with assessment analysis reports immediately after completing the CPA. The top-scoring student at each station agreed to share their video-recorded performance with the rest of the students.

Results

After comparing their own performance video with the top-scoring video at each station, medical students were more aware of their CPA total score, clinical performance examination (CPX) total score, score for each CPX station, section scores within each CPX station, history taking section score, physical examination section score, and doctor-patient relationship section score. Moreover, students became more convinced of their own weaknesses in the history taking and patient education sections after viewing the video feedback than before.

Conclusion

The use of the video feedback system might help students recognize their CPA results and identify their strengths and weaknesses.

Background

Feedback is a critical component that enables medical students to perform effectively and in a timely manner in clinical settings [1, 2]. However, feedback often satisfies neither students nor evaluators [3,4,5,6]. Clinical performance assessment (CPA) generally employs an analytical checklist for each station, completed by evaluators and provided as feedback, enabling students to recognize their strengths and weaknesses from the resulting scores. However, students sometimes feel that the score generated from the checklist is insufficient and does not properly assess their performance; therefore, the evaluator provides additional feedback in several other forms, including hand-written comments, immediate verbal feedback, and briefing sessions. Hand-written comments in the CPA can provide original, good, and sometimes powerful information [7,8,9,10,11,12]. However, because each Objective Structured Clinical Examination (OSCE) or Clinical Performance Examination (CPX) station is completed in 5 or 10 min, respectively, the evaluator faces a time limit when providing feedback. Moreover, there is an additional delay in receiving feedback cards because they are delivered to students only after completion of the entire CPA; accordingly, a student might not recall the situation being commented on. Another form of feedback is immediate verbal feedback, which is very effective when the evaluator makes it prompt, precise, and to the point. This type of feedback is much more effective when combined with written comments [8]. Positive verbal feedback (praise) may encourage a student, whereas immediate verbal feedback itself might heighten student anxiety and impair performance in subsequent tasks. In the worst case, students might lose control of their emotions and receive lower scores at the remaining stations [13,14,15]. A third feedback method involves gathering all of the students in a classroom, briefing them on the overall CPA results, and pointing out the most common errors that students made. This is timely and effective but does not provide individual feedback.

Good feedback engages students in the feedback process rather than merely in the technical aspects of the feedback [4]. Moreover, good feedback requires that the student’s performance be carefully monitored [16, 17]. For this reason, Keele University School of Medicine developed a personalized audio feedback tool that uses a hand-held digital mp3 player to improve students’ OSCE performance [18]. Although this method is convenient and acceptable to both students and evaluators, it may be difficult to grasp the meaning of the comments because the performance situation to which an audio comment refers cannot be seen [19,20,21,22,23]. Therefore, we recently developed an individualized video feedback system, in addition to the online-written comments we already employed in the CPA, to provide students with richer feedback for self-directed learning. Providing effective feedback to medical students is consistent with the shift from teacher-centered to learner-centered education. Before implementing video feedback, we gave feedback via hand-written comments. Although this reassured teachers that they had provided feedback, they did not know whether students actually received the feedback or understood the teachers’ intentions [24]. As with teaching and learning, if the feedback given by the evaluator is not accepted by those being evaluated, it may be useless to the student. This study is one of a series of feedback studies in which students’ perceptions of the evaluator’s feedback were investigated. Students may be more willing to accept that the score generated from a checklist properly assesses their performance if personalized video feedback of their performance during the CPA is provided in addition to hand-written comments and the analytical checklist. Therefore, in the present study, we compared students’ perceptions of their score results before and after providing personalized video feedback, with the aim of developing a more effective feedback method that can be applied in CPA settings in medical education.

Methods

Study participants and design

A questionnaire-based before-and-after study was conducted with first-year medical students of Pusan National University School of Medicine in the second semester of 2012. This study was reviewed and given exempt status by the Institutional Review Board of Pusan National University Yangsan Hospital (IRB No. 05–2017-102). Because we analyzed the data retrospectively and anonymously by assigning each subject a distinct number, the institutional review board did not require informed consent from participants. A total of 131 first-year medical students underwent the CPA, which included CPX and OSCE stations. Immediately after completing the CPA, students were given their test scores with a computer-assisted assessment analysis report. The top-scoring student at each station agreed to share their video-recorded performance with the rest of the students. Thus, every student received their own video-recorded performance and, in addition, the video of the best student at each station. This video feedback system was designed to allow students to compare the recorded video of the best student at each station with their own video-recorded performance so that they could recognize their strengths and weaknesses. Two identical online surveys were conducted to evaluate students’ perceptions of their CPA scores before and after the video feedback. We developed a program so that only students who responded to the first questionnaire could view their own video, followed by the recorded video of the best student, after which they were allowed to respond to the second questionnaire. A total of 131 students answered the first questionnaire, while only 103 (78.6%) responded to the second questionnaire (Fig. 1). The questionnaire was developed based on an extensive review of the literature [23, 25,26,27] and the consensus of five faculty members in the department of medical education and 20 faculty members of the Clinical Skills Committee, who were expert educators and clinical teachers. Students did not see the first or second survey questions before receiving the corresponding feedback.

Fig. 1 Study flowchart

Clinical performance assessment

All students completed the CPA, which was composed of three CPX stations and three OSCE stations. Cases were selected to represent common acute conditions, chronic conditions, and counseling cases. The three CPX stations were acute abdominal pain, headache, and delivering bad news. The three OSCE stations covered basic clinical skills: muscular injection, burn dressing, and cranial nerve examination. In the CPX, standardized patients (SPs) presented a variety of patient problems.

Each skills station was equipped with a computer-assisted assessment system as the instrument for conducting the CPA. Evaluators assessed the performance of each student and completed an in-depth, station-specific online checklist. After assessing a student’s performance, evaluators added online-written comments on the main weak points. All station encounters were digitally recorded in rooms equipped with a microphone and a camera, and the recordings were encoded with standard H.264 compression. After the entire class had completed the assessment, students received a report indicating their scores for each section (history taking, physical examination, counseling and communication skills) and their overall score for each case. The evaluator’s online-written comments from each station were provided to students to improve their self-directed learning skills. Students who did not attain a passing score at a station were shown a “FAIL” mark for that station, and if the overall score across all stations was below the passing range, a “FAIL” mark was shown for the overall assessment. Individualized feedback, including scores (pass, fail, rank, minimum, maximum, total score, standard deviation), the top score of the best student at each station, and online-written comments, was provided to CPA examinees before the first survey. A retake of an examination was permitted for students whose scores fell more than 1 SD below the mean, and the pass/fail decision for the CPA was based on total scores falling more than 2 SD below the mean. On average, no more than three students failed this assessment.
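
The score-based decision rule described above can be expressed as a short sketch. This is an illustration only: the study used its own computer-assisted assessment system, and the function and scores below are hypothetical.

```python
import statistics

def cpa_decisions(total_scores):
    """Illustrative sketch of the cutoff rule described above:
    scores more than 2 SD below the mean fail overall, and scores
    more than 1 SD below the mean are offered a retake."""
    mean = statistics.mean(total_scores)
    sd = statistics.stdev(total_scores)
    decisions = []
    for score in total_scores:
        if score < mean - 2 * sd:
            decisions.append("FAIL")
        elif score < mean - sd:
            decisions.append("PASS (retake permitted)")
        else:
            decisions.append("PASS")
    return decisions

# Hypothetical class of total scores
print(cpa_decisions([82, 75, 90, 68, 88, 79, 55, 84]))
```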

Materials

For the development of the questionnaire, two rounds of Delphi expert consultation were conducted with five faculty members in the department of medical education and a faculty focus group (n = 10) selected from members of the Clinical Skills Committee. In the first round, experts were asked to provide their opinions in a questionnaire consisting of open-ended questions about the evaluation areas and evaluation items. Items selected from the first-round analysis were then presented to each expert by email in the second round, in which experts rated on a 5-point Likert scale whether they agreed with the inclusion or exclusion of each item according to its importance. Experts were also asked to comment on the suitability of the evaluation system and on items that should be revised or supplemented. Experts did not meet face-to-face and completed their assessments independently. Content validity was assessed using the content validity ratio (CVR) proposed by Lawshe [28]. The CVR ranges from a maximum of + 1.0 to a minimum of − 1.0; a positive CVR means that more than half of the respondents rated the item as ‘appropriate’ (4 or 5 on the 5-point Likert scale). Lawshe’s method specifies a minimum CVR according to the number of panelists, and an item is judged to have content validity when its CVR exceeds this minimum. With 15 panelists in this study, the minimum acceptable CVR was 0.49. In the second round, all of the developed items were retained because the mean validity rating for each item was 4.5 or higher. The final questionnaire consisted of 4 items regarding CPA total score reports (including CPX and OSCE), 12 items regarding CPX score reports, 2 items regarding OSCE score reports, 2 items regarding online-written comments, and 2 items regarding the video feedback system. The contents of the questions are shown in Table 1 (Additional file 1). The same questionnaire was administered before and after the video feedback; only the questions pertaining to the usefulness of the video feedback were added to the second administration. Answers were given on a 5-point Likert-type scale from ‘strongly disagree’ to ‘strongly agree’. Two open-ended questions, asking which CPX or OSCE station scores the students disagreed with, were presented at the end of the questionnaire. Completion of the questionnaire took approximately 30 min.
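
For reference, Lawshe’s CVR for an item is computed as shown below; the worked figures are our own illustration, not values reported by the authors.

\[ \mathrm{CVR} = \frac{n_e - N/2}{N/2} \]

where \(n_e\) is the number of panelists rating the item as ‘appropriate’ (4 or 5 on the Likert scale) and \(N\) is the panel size. With \(N = 15\), a CVR of at least 0.49 requires \(n_e \ge 12\), since \((12 - 7.5)/7.5 = 0.60\) whereas \((11 - 7.5)/7.5 \approx 0.47\) would fall short.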

Table 1 Effects of video feedback system on students’ perceptions regarding their clinical performance assessment (N = 103)

Statistical analysis

Descriptive statistics were used to characterize the sample. A paired t-test was used to compare students’ perceptions before and after the video feedback. Effect sizes were calculated using Cohen’s d, with values of 0.0–0.2, greater than 0.2 to 0.5, and above 0.5 considered small, medium, and large effects, respectively [29]. Students’ perceptions regarding their own scores after the total CPA score report, the online-written comments, and the video feedback were compared using ANOVA. The level of significance was set at 0.05, and statistical analyses were conducted using SPSS 13.0 for Windows (SPSS Inc., Chicago, IL, USA).
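
The analyses were run in SPSS. Purely as an illustration of the before/after comparison, the sketch below shows the equivalent computation in Python with hypothetical ratings; it uses one common paired-samples convention for Cohen’s d (mean difference divided by the SD of the differences), since the authors’ exact formula is not stated.

```python
import numpy as np
from scipy import stats

# Hypothetical 5-point Likert ratings for one questionnaire item,
# before and after video feedback, from the same students.
before = np.array([3, 4, 3, 2, 4, 3, 3, 4, 2, 3])
after = np.array([4, 4, 4, 3, 5, 4, 3, 4, 3, 4])

# Paired t-test comparing perceptions before vs. after feedback.
t_stat, p_value = stats.ttest_rel(after, before)

# Cohen's d for paired data: mean difference / SD of the differences.
diff = after - before
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```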

Results

Table 1 shows the differences in students’ agreement and perception regarding their scores before and after receiving the video feedback. For all questions, students’ perception was higher after viewing the video feedback than before. After comparing their own performance video with the top-scoring video for each station, medical students were more aware of their CPX total score (P = 0.011), each CPX station score (P = 0.033), CPX station section scores (P = 0.017), physical examination section score (P = 0.016), and doctor-patient relationship section score (P = 0.007, Table 1). Students also agreed more with their CPA total score and history taking section score after viewing the video feedback than before. In addition, students were better able to perceive their own weaknesses from the history taking section score (P = 0.096) and patient education section score (P = 0.003) after viewing the video feedback. However, for the remaining items, there was no difference in students’ agreement or perception despite the video feedback. Tables 2 and 3 show changes in the perception of students who did not agree with their own CPX or OSCE station scores before and after the video feedback. Regardless of whether students agreed with their scores before the video feedback, most students accepted their scores afterward. Conversely, a small number of students initially accepted their scores but were no longer convinced after the video feedback. Overall, students rated the usefulness of the video feedback (4.25 ± 0.78) higher than that of the computer-assisted assessment analysis report (3.80 ± 0.62) or the online-written comments (3.92 ± 0.59).

Table 2 Number of students who did not agree with their own CPX station score (N = 103)
Table 3 Number of students who did not agree with their own OSCE station score (N = 103)

Discussion

This study evaluated the effects of providing first-year medical students with personalized video feedback of their performance during the CPA, in addition to hand-written comments, on the way they perceived their score results from the analytical checklist. The method was designed to allow students to compare the recorded video of the best student at each station with the recording of their own performance so that they could recognize what they did well and which skills they lacked. The results showed that students were more likely to agree with the analytical checklist score of their CPA after making this comparison. The video feedback allowed them to recognize what they did well and which skills they lacked [30]. In addition, they were more likely to accept their CPA total score, CPX total score, each CPX station score, history taking section score, physical examination section score, and doctor-patient relationship section score after receiving video feedback. The satisfaction rating for the video feedback system was more than 4 out of 5. This change can be regarded as meaningful and indicates that the video feedback intervention appeared to affect how students perceived their performance; however, care should be taken when interpreting these results. In addition, eight students (7.77%) did not agree with their CPX station score before the video feedback, whereas only 4.85% did not agree afterward. Similarly, seven students (6.80%) disagreed with their OSCE station score before seeing the video, and this dropped to 3.91% after the video feedback. Although more students agreed with their online-written comments after receiving the video feedback, this difference was not statistically significant. Even if students complete a station assessment in less than the allotted time, it is still time-consuming for the evaluator to provide hand-written comments. As a result, some critical comments may be omitted if too many examinees are evaluated within a given time frame.

Based on these findings, video feedback was more effective than the analytical checklist score or online-written comments at helping students understand their CPA outcomes. In addition, the video feedback system used in this study appears to be an improvement in that it makes the performance situation visible, which was a limitation of the mp3 audio feedback tool introduced at Keele University School of Medicine. Previous studies have found video feedback very useful because recorded performances can be reviewed and feedback given on them [31]. Lindon-Morris and Laidlaw [32] reported that students perceived the self-awareness induced by the presence of the video camera as unfavorable to their performance, but that during clinical communication training using technology including video feedback, they could compare their videos with those of other students, monitor their own performance more accurately, and draw on other students’ communication strategies to modify their own. In a previous study in the field of nursing, video feedback produced changes in the communication, clinical competence, and motivational interviewing skills of practice nurses [33]. In addition, in previous studies, experimental groups that received video feedback had higher scores for knowledge, performance competence in core basic nursing skills, self-efficacy, learning motivation, and learning satisfaction than control groups that did not [34, 35].

It should be noted that video feedback does not always have a positive effect and that it can produce different learning effects depending on how it is provided to the learner. Specifically, video feedback should be provided in combination with other methods to generate positive learning effects [36]. In addition, attention-focusing cues should be given before the video is presented, combined with error-correction information [37]. It may also be helpful to combine other feedback methods with videos of professional models that serve as templates against which learners compare their own videos to detect errors [38]. In this study, changes in student perception of the OSCE as a result of video feedback were not statistically significant. Accordingly, careful consideration is needed regarding how to provide video feedback. Moreover, although the percentage was very small, some students who had accepted their CPX scores for the ‘acute abdominal pain’ and ‘delivering bad news’ stations before viewing the video feedback no longer accepted them afterward, contrary to our expectations. The advantage of the video feedback system developed in this study is that it enabled students to compare their performance with that of the best students, which allowed them to recognize the reasons for their CPA results, develop their strengths, and address their weaknesses. However, although the system allowed learners to see their own strengths and weaknesses in the examination they had just taken, it did not include direct feedback on error corrections or on what to do in the next examination. Moreover, the effect of the video feedback on its own could not be established because the online-written feedback and the video feedback were presented to students separately, as two distinct components. These issues will need to be addressed in the future to enable continued development of the video feedback system. Future studies should also examine the effectiveness of the video feedback system for improving clinical performance, gather stakeholder feedback on what makes a video feedback system successful, and compare different feedback systems.

It should also be noted that this study was limited in that acceptance of the test score is only one component of acceptance of the overall feedback system and, being an indirect measure, may not be sufficient on its own; this requires caution when interpreting the results. Feedback can only be useful if it is accepted by those being evaluated. However, the results of this study revealed some students who, although they accepted their scores at first, no longer accepted them after viewing the video feedback. Accordingly, additional interviews should be conducted to confirm whether students understood their scores well and considered the test results fair and appropriate; unfortunately, such interviews were outside the scope of this study.

Conclusions

In summary, the results of this study suggest that the use of a video feedback system in the CPA of medical education can help students recognize their CPA results and identify their strengths and weaknesses. Future studies should focus on developing a video feedback system that builds on the educational usefulness demonstrated here so that it can be used more actively in medical education. Additionally, a more realistic and direct personalized feedback system needs to be introduced into clinical skills education in the future.

Availability of data and materials

The datasets used and analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

CPA: Clinical performance assessment

CPX: Clinical performance examination

OSCE: Objective structured clinical examination

References

  1. Debisette AT, Sandvold I, Easterling B, Martinelli A. An integrative review of nursing workforce studies. Annu Rev Nurs Res. 2010;28:317–38.

  2. Stone H, Angevine M, Sivertson S. A model for evaluating the history taking and physical examination skills of medical students. Med Teach. 1989;11(1):75–80.

  3. Murdoch-Eaton D, Sargeant J. Maturational differences in undergraduate medical students’ perceptions about feedback. Med Educ. 2012;4:711–21.

  4. Ramani S, Krackov S. Twelve tips for giving feedback effectively in the clinical environment. Med Teach. 2012;34:787–91.

  5. Carless D. Trust and its role in facilitating dialogic feedback. In: Boud D, Molloy L, editors. Feedback in higher and professional education. London: Routledge; 2010. p. 90–103.

  6. Shrivastava S, Shrivastava P, Ramasamy J. Effective feedback: an indispensable tool for improvement in quality of medical education. JPD. 2014;4:12–20.

  7. Bailey C, Minderhout V, Loertscher J. Learning transferable skills in large lecture halls: implementing a POGIL approach in biochemistry. Biochem Mol Biol Educ. 2011;40:1–7.

  8. Beaumont C, O'Doherty M, Shannon L. Reconceptualising assessment feedback: a key to improving student learning? Stud High Educ. 2011;36:671–87.

  9. Carless D. Sustainable feedback and the development of student self-evaluative capacities. In: Merry S, Price M, Carless D, Taras M, editors. Reconceptualising feedback in higher education: developing dialogue with students, vol. 2013. London: Routledge; 2013. p. 113–22.

  10. Carless D. Excellence in university assessment: learning from award-winning practice. London: Routledge; 2015.

  11. Schoenmakers B, Ryssaert L. Using a ‘yellow card’ in the objective structured clinical exam: does it add to the identification of problem postgraduate trainees in general practice: an exploratory study to identify high risk trainees. J Gen Pract. 2013;2:1000134.

  12. Westfall J, Zittleman L, Staton E, Parnes B, Smith P, Niebauer L, Fernald D, Quintela J, Van VR, Dickinson L, Pace W. Card studies for observational research in practice. Ann Fam Med. 2011;9:63–8.

  13. Groves M, Mitchell M, Henderson A, Jeffry C, Kelly M, Nutty D. Critical factors about feedback: ‘they told me what I did wrong; but didn’t give me any feedback’. J Clin Nurs. 2015;24:1737–9.

  14. London M. The power of feedback. Giving, seeking, and using feedback for performance improvement. New York: Routledge; 2015.

  15. Price M, Handley K, Millar J. Feedback: focusing attention on engagement. HES. 2011;36:879–96.

  16. Nicol D, Thomson A, Breslin C. Rethinking feedback practices in higher education: a peer review perspective. Assess Eval High Edu. 2013;39:102–22.

  17. Wiliam D. Feedback: Part of a system. Educ Leadersh. 2012;70:30–4.

  18. Harrison CJ, Molyneux AJ, Blackwell S, Wass VJ. How we give personalised audio feedback after summative OSCEs. Med Teach. 2015;37:323–6.

  19. Doherty C, Kettle M, May L, Caukil E. Talking the talk: oracy demands in first year university assessment tasks. Assess Educ. 2011;18:27–39.

  20. Doherty EM, Nugent E. Personality factors and medical training: a review of the literature. Med Educ. 2011;45:132–40.

  21. Lloyd M, Watmough S, O’Brien S, Furlong N, Hardy K. Formalized prescribing error feedback from hospital pharmacists: doctors’ attitudes and opinions. Br J Hosp Med. 2015;2:713–8.

  22. Shin SJ, Kim KS, Lee DS. The effect of personal character on the results of clinical performance skill tests. Korean J Med Educ. 2011;23:111–7.

  23. Harrison CJ, Molyneux SB, Blackwell S, Wass VJ. How we give personalised audio feedback after summative OSCEs. Med Teach. 2015;37(4):323–6.

  24. Whitman NA, Schwenk TL. The Physician as Teacher. USA: Whitman Associates; 1997.

  25. Mavis BE, Wagner DP, Henry RC, Carravallah L, Gold J, Maurer J, Mohmand A, Osuch J, Roskos S, Saxe A, Sousa A, Prins VW. Documenting clinical performance problems among medical students: feedback for learner remediation and curriculum enhancement. Med Educ Online. 2013;18:20598.

  26. Bautista JMD, Manalastas REC. Using video recording in evaluating students’ clinical skills. Med Sci Educ. 2017;27(4):645–50.

  27. Nasir AA, Yusuf AS, Abdur-Rahman LO, Babalola OM, Adeyeye AA, Popoola AA, Adeniran JO. Medical students’ perception of objective structured clinical examination: a feedback for process improvement. J Surg Educ. 2014;71(5):701–6.

  28. Lawshe CH. A quantitative approach to content validity. Pers Psychol. 1975;28(4):563–75.

  29. McGrath RE, Meyer GJ. When effect sizes disagree: the case of r and d. Psychol Methods. 2006;11(4):386–401.

  30. Pinsky LE, Wipf JE. A picture is worth a thousand words: practical use of videotape in teaching. J Gen Intern Med. 2000;15(11):805–10.

  31. Pulman A, Scammell J, Martin M. Enabling interprofessional education: the role of technology to enhance learning. Nurse Educ Today. 2009;29:232–9.

  32. Lindon-Morris E, Laidlaw A. Anxiety and self-awareness in video feedback. Clin Teach. 2014;11:174–8.

  33. Noordman J, van der Weijden T, van Dulmen S. Effects of video-feedback on the communication, clinical competence and motivational interviewing skills of practice nurses: a pre-test posttest control group study. J Adv Nurs. 2014;70:2272–83.

  34. Chae YJ, Ha YM. Effectiveness of education program for core fundamental nursing skills using recording video with smartphone and formative feedback. J Digit Convergence. 2016;14:285–94.

  35. Lee SG, Shin YH. Effects of self-directed feedback practice using smartphone videos on basic nursing skills, confidence in performance and learning satisfaction. J Korean Acad Nurs. 2016;46:283–92.

  36. Schmidt RA, Lee TD. Motor control and learning: a behavioral emphasis. 5th ed. Champaign: Human Kinetics; 2012.

  37. Janelle CM, Barba DA, Frehlich SG, Tennant LK, Cauraugh JH. Maximizing performance feedback effectiveness through videotape replay and a self-controlled learning environment. Res Q Exerc Sport. 1997;68:269–79.

  38. Hodges NJ, Chua R, Franks IM. The role of video in facilitating perception and action of a novel coordination movement. J Mot Behav. 2003;35:247–60.


Acknowledgements

None.

Funding

This study was supported by a Biomedical Research Institute Grant (2016–15) from Pusan National University Hospital. This funding source had no role in the design of the study and collection, analysis, and interpretation of data and in writing the manuscript.

Author information

Contributions

BSK, SYJ and SYL conceptualized the study, developed the proposal, coordinated the project, completed initial data entry and analysis, and wrote the report. BSK and SYJ conducted the statistical analyses. SJY, SYB and SJI assisted in writing and editing the final report. SYL participated in overall supervision of the project and revision of the report. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Sang Yeoup Lee.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed and given exempt status by the Institutional Review Board of Pusan National University Yangsan Hospital (IRB No. 05–2017-102). Informed consent was not required as the study analyzed only pre-existing de-identified data. After institutional review board approval, we obtained the data use agreement as required with the Medical Education Unit, Pusan National University School of Medicine.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1:

The questionnaire used in the present study. Questionnaire S1. Students’ perceptions regarding their clinical performance assessment before viewing video feedback. Questionnaire S2. Students’ perceptions regarding their clinical performance assessment after viewing video feedback. (DOCX 21 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


Cite this article

Kam, B., Yune, S., Lee, S. et al. Impact of video feedback system on medical students’ perception of their clinical performance assessment. BMC Med Educ 19, 252 (2019). https://doi.org/10.1186/s12909-019-1688-6
