
Self-assessment scale for the community-based and emergency practice

Abstract

Background

In current medical education, community-based primary care for the elderly is an essential topic. This study aimed to establish a self-assessment scale for Community-based Clinical and Emergency Practice (C-CEP).

Methods

A self-assessment scale for C-CEP was developed in four steps. First, we reviewed publications on educational goals from medical societies and governing bodies in the United States, the United Kingdom, and Japan. In addition, we searched MEDLINE for educational goals regarding attitude, skills, and knowledge. Combining these sources, we established 23 items as the educational goals of the C-CEP. Second, we collected responses to these 23 items from 5th-grade medical students (n = 195). Third, we conducted an exploratory factor analysis (EFA) of their responses to determine the fundamental structure of the self-assessment scale. Finally, a confirmatory factor analysis (CFA) was performed to assess the fit of the model derived from the EFA, resulting in modification of the items.

Results

Based on the EFA and CFA results, the C-CEP Scale consisted of four factors with 15 items: “Attitude and communication in emergency care,” “Basic clinical skills,” “Knowledge of community healthcare,” and “Knowledge of evidence-based medicine.” The model fit indices were acceptable (Goodness of Fit Index = 0.928, Adjusted Goodness of Fit Index = 0.900, Comparative Fit Index = 0.979, and Root Mean Square Error of Approximation = 0.045). McDonald’s omega, as an estimate of scale reliability, exceeded 0.7 for all four factors. For test-retest reliability, the intraclass correlation coefficients were ≥ 0.58 for all factors. All four factors of the C-CEP Scale correlated positively with the Medical Professional Evaluation Scale subscales.

Conclusions

We developed a valid and reliable self-assessment scale for assessing medical students’ competence in community-based clinical and emergency practice.


Introduction

The world’s aging rate (the proportion aged 65 and older) increased from 5.1% in 1950 to 9.0% in 2020. In the United States (US), the rate is projected to reach 20% by 2040. In Japan, the rate had already reached about 20% by 2020, and in many remote Japanese areas it exceeds 40%.

Medicine’s goals in areas with many elderly residents are broad, diverse, and complex. Current outcomes of medical education in primary care are often designed around these goals. However, it has been pointed out that education designed from such complex outcomes can increase educators’ burdens. One reason is that the supervising doctor must teach medical students and residents in remote areas while caring for patients as a primary physician. Medical resources, including human resources, are often limited in such rural areas.

For this reason, an efficient tool to educate learners on primary care for the elderly in rural communities is essential [1,2,3,4,5]. Importantly, as per World Health Organization recommendations [6], primary care physicians who can care for patients comprehensively are also required. To educate medical students on attitudes, skills, and knowledge in the community [7], educators must prepare community-based medical education (CBME) programs.

Although the effectiveness of CBME programs in training medical students and residents has been shown [8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34], no tool for self-assessment of CBME competencies has been fully developed. For example, small studies using focus group interviews to assess these competencies have been published. However, these methods were not simple and might increase the educator’s burden. In rural areas, work overload may make it difficult for instructors to prepare enough opportunities to assess medical students. A well-designed self-assessment method is therefore required in CBME [35,36,37,38,39,40,41,42].

From this point of view, we aimed in this study to develop a self-assessment scale for Community-based Clinical and Emergency Practice (C-CEP) and to verify its reliability and validity.

Methods

Study design

This study developed the C-CEP self-assessment scale in four steps. First, we reviewed publications on educational goals from medical societies and governing bodies in the US, the UK, and Japan. In addition, we searched MEDLINE for educational goals regarding attitude, skills, and knowledge. Combining these sources, we established 23 items as the educational goals of the C-CEP. Second, we collected responses to these 23 items from 5th-grade medical students. Third, we conducted an exploratory factor analysis (EFA) of their responses to determine the fundamental structure of the self-assessment scale. A confirmatory factor analysis (CFA) was then performed to assess the fit of the model derived from the EFA, resulting in modification of items without changes to the factor structure. McDonald’s omega coefficient was calculated to confirm the scale’s internal reliability. Finally, we investigated the validity and reliability of the self-assessment form by comparison with another scale. The Ethics Committee of Sapporo Medical University (SMU) approved this study protocol (3-1-58).

The items on the self-assessment scale

To construct the C-CEP self-assessment scale, we selected goals published by academic societies and governing bodies of medical education, so that the scale would reflect diverse international educational goals. We reviewed publications on educational goals from bodies in the US, the UK, and Japan: the American Medical Association (AMA) and the Association of American Medical Colleges (AAMC) [43]; the British Medical Association (BMA) [44] and the General Medical Council (GMC) [45]; and the Japanese government, specifically the Ministry of Education, Culture, Sports, Science and Technology (MEXT) [46] and the Ministry of Health, Labour and Welfare (MHLW) [47] (Supplementary 1-6). Moreover, we searched MEDLINE from 2008 to 2020 using the following six keywords: medical education, community-based medical education, community-oriented medical education, emergency medicine, medical education, and primary care. Based on the MEDLINE search and the reviews of the international educational goals above, we established 23 items as the educational goals of the C-CEP.

Responses to the 23 items were recorded for each student on a 5-point self-assessment scale (5 = strongly agree, 4 = relatively agree, 3 = equivocal, 2 = relatively disagree, 1 = strongly disagree).

Survey of medical students in clinical clerkship

We conducted a cross-sectional survey of 5th-grade medical students in clinical clerkship at SMU between 2015 and 2016. All of these students participated in a two-week community-based clinical medicine educational program. The students assessed themselves using the 23-item self-assessment form before the start of the clerkship program.

Exploratory factor analysis for the fundamental constructs of the question items and confirmatory factor analysis for the goodness of fit

We used the minimum average partial (MAP) test and the Bayesian Information Criterion (BIC) to determine the number of factors. EFA, using the principal factor method with Promax rotation, was then performed to clarify the underlying factor structure. Items for which all factor loadings were < 0.3 were excluded. Items with loadings > 0.4 on multiple factors, or > 0.4 on some factors and > 0.3 on others, were eliminated sequentially. In determining the factor structure, items whose loadings on all four factors were less than 0.35 were eliminated sequentially, and the factor analysis was repeated. A factor was required to have at least three items.
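As an illustration only, the final elimination rule (dropping items whose loadings on all four factors fall below 0.35) can be sketched as a filter over an item-by-factor loading matrix; the function name and the loading values below are hypothetical, not the study's actual data.

```python
def items_to_drop(loadings, low=0.35):
    """Return indices of items whose absolute loadings on every
    factor are below `low` (the 0.35 rule described above)."""
    return [i for i, row in enumerate(loadings)
            if all(abs(x) < low for x in row)]

# Hypothetical 4-factor loading rows for three items
L = [
    [0.62, 0.10, 0.05, 0.12],  # kept: loads clearly on factor 1
    [0.20, 0.25, 0.30, 0.15],  # dropped: every loading < 0.35
    [0.05, 0.48, 0.22, 0.08],  # kept: loads clearly on factor 2
]
print(items_to_drop(L))  # [1]
```

In practice such filtering would be applied iteratively, re-running the factor analysis after each elimination, as the text describes.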

For the factors extracted by EFA, we adopted a second-order factor model because of its clarity of interpretation. We hypothesized that the factors identified by EFA would explain the upper factor, “Community-based Clinical and Emergency Practice,” in students who participated in the CBME program at SMU. A CFA using covariance structure analysis was performed to verify the construct validity of the created scale using responses from 5th-grade students. Based on the theoretical model, the Goodness of Fit Index (GFI) and Adjusted Goodness of Fit Index (AGFI) were used as goodness-of-fit indices, and the Comparative Fit Index (CFI) and Root Mean Square Error of Approximation (RMSEA) as criterion comparison indices. Without changing the factor structure, items were deleted as appropriate to improve the model based on the results. Model refinement was completed when the model had the highest goodness of fit and met the criterion comparison indices. After the model was refined, McDonald’s omega coefficient was calculated to check the internal consistency of the constructed scale.
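To make the internal-consistency step concrete, McDonald's omega for a single factor can be computed from standardized loadings under a congeneric model; this is a minimal sketch, and the loadings below are hypothetical values, not results from this study.

```python
def mcdonalds_omega(loadings):
    """McDonald's omega for one factor from standardized loadings.

    omega = (sum of loadings)^2
            / ((sum of loadings)^2 + sum of error variances),
    where each error variance is 1 - loading^2 for a
    standardized congeneric factor model.
    """
    s = sum(loadings)
    errors = sum(1 - l * l for l in loadings)
    return s * s / (s * s + errors)

# Hypothetical standardized loadings for a 3-item factor
print(round(mcdonalds_omega([0.8, 0.7, 0.6]), 3))  # 0.745
```

Values above roughly 0.7, as reported for all four factors here, are conventionally taken to indicate acceptable reliability.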

Validity and reliability of the self-assessment scale

A survey was conducted in 2016 with 4th-grade medical students at SMU to validate the reliability of the C-CEP Scale. The students, who participated in the 2-week C-CEP program, were surveyed before and after the program. First, scores for the factors extracted by CFA were obtained from the pre- and post-program questionnaires. Second, test-retest reliability was assessed with the intraclass correlation coefficient (ICC). To examine criterion-related validity (Supplement 7), we compared pre-program C-CEP scores with scores from the Medical Professional Evaluation Scale (MPES) [48]. Four MPES factors, “collaboration,” “providing safe, quality care,” “reflective practice,” and “interest in community health,” were selected for comparison with the C-CEP. We analyzed correlations between scores from the C-CEP Scale and those from these four MPES factors.
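As a sketch of the test-retest step, ICC(1,1), the one-way random-effects, single-measurement form of the intraclass correlation coefficient, can be computed from paired scores; the function and the five score pairs below are illustrative assumptions, not the study's data.

```python
def icc_1_1(scores):
    """ICC(1,1): one-way random effects, single measurement.

    scores: list of per-subject score lists (here [test, retest]).
    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB and MSW
    are the between- and within-subject mean squares.
    """
    n = len(scores)
    k = len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    means = [sum(row) / k for row in scores]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(scores, means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical pre/post self-assessment pairs for five students
pairs = [[4, 4], [3, 4], [5, 5], [2, 3], [4, 5]]
print(round(icc_1_1(pairs), 3))  # 0.721
```

By common conventions, values above about 0.6 indicate substantial and 0.4-0.6 moderate reliability, which matches how the results below are described.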

Ethics

Before the students completed the questionnaires, we explained that their grades and earned credits would not be affected by whether or not they participated. Written informed consent, stating that the survey was anonymous and voluntary and that all data would be deleted after use in this research, was obtained. We numbered the questionnaires and linked the records when examining test-retest reliability. After that, the results were anonymized.

Statistical analysis

Descriptive statistics are presented as mean ± standard deviation (SD). A p-value of < 0.05 was deemed to indicate statistical significance. A certain amount of data is necessary to obtain reliable results in CFA [49]. Generally, the minimum number of subjects needed to ensure the stability of the variance-covariance matrix is 100, with 4 to 10 subjects per variable. We set the number of subjects per item at 7; since our questionnaire consisted of 23 items, we aimed for a minimum of 161 participants. All statistical analyses were performed using SPSS Statistics (IBM SPSS Statistics for Windows, Version 22.0, IBM Corp., Armonk, NY) and R (version 4.2.1).
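The sample-size target above is simple arithmetic; as a sketch (the variable names are ours, not part of the study):

```python
items = 23
subjects_per_item = 7                 # chosen within the 4-10 guideline
minimum_n = max(100, items * subjects_per_item)  # at least 100 overall
print(minimum_n)  # 161

analyzed_n = 195                      # students included in the analysis
assert analyzed_n >= minimum_n        # the final sample met the target
```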

Results

The items on the self-assessment scale

The 23 items of the C-CEP (Table 1) fully matched the goals of the core curriculum for Japanese medical students and the objectives for Japanese residents (MEXT and MHLW). Comparing these 23 items with the goals/objectives for medical students and residents in the US, 87.0% (20/23) of medical students’ goals and 100% of residents’ objectives were matched. Compared with the UK, 100% of medical students’ goals and 21.7% (5/23) of residents’ objectives were matched.

Table 1 Self-assessment items on questionnaire and reference

Survey of medical students in clinical clerkship

A cross-sectional survey of 237 5th-grade medical students at SMU was conducted between 2015 and 2016. Of these 237 students, 42 were excluded due to missing data, and 195 were included in the analysis. The means and standard deviations for each item, together with the item-total correlation analysis, are shown in Table 2. None of the items showed ceiling or floor effects.

Table 2 The results of exploratory factor analysis for self-assessment in pre-clinical clerkship program

Exploratory factor analysis for the fundamental constructs of the question items and confirmatory factor analysis for the goodness of fit

The MAP test and BIC suggested that three and six factors, respectively, should be retained. Therefore, the four- and five-factor solutions were examined sequentially. Based on the EFA of self-assessment results from students who participated in the pre-clinical clerkship program (Table 2), items 4, 12, and 15 were eliminated. We thereby extracted four factors comprising 20 items; each factor had at least three items, and no factors were deleted. This resulted in a second-order model consisting of four factors.

We then conducted a CFA on the 20-item model generated from the EFA, using the results of our 5th-grade students. However, among the goodness-of-fit indices, CFI was low at 0.927 and RMSEA was high at 0.073, so the goodness-of-fit criteria were not met when 20 items were used as latent variables. After further analysis, items 6, 8, 11, 13, and 17 were eliminated, leaving four factors comprising 15 items. All coefficients (standardized estimates) were significant at the 5% level. The goodness-of-fit indices were GFI = 0.928, AGFI = 0.900, CFI = 0.979, and RMSEA = 0.045, indicating a satisfactory fit. All factors showed good coefficients with the upper model (Fig. 1). McDonald’s omega coefficients were 0.933 for “Attitude and communication in emergency care” (3 items), 0.832 for “Basic clinical skills” (4 items), 0.864 for “Knowledge of community healthcare” (5 items), and 0.700 for “Knowledge of evidence-based medicine” (3 items).

Fig. 1

Path diagram of the C-CEP Scale (after confirmatory factor analysis, CFA). The numbers surrounded by the dotted line represent standardized estimates. All values are p < 0.001. Abbreviations: e, error term; q, question item; GFI, Goodness of Fit Index; AGFI, Adjusted Goodness of Fit Index; RMSEA, Root Mean Square Error of Approximation.

Validation of the reliability of the self-assessment scale

Of the 4th-grade medical students (n = 113), 79 were included in this study; 17 were excluded because they did not consent to participate. In the test-retest reliability analysis, “Attitude and communication in emergency care,” “Basic clinical skills,” and “Knowledge of community healthcare” showed substantial reliability (ICC (1,1) = 0.712, 0.659, and 0.631, respectively; p < 0.001). Only “Knowledge of evidence-based medicine” showed moderate reliability (ICC (1,1) = 0.589, p < 0.001). Finally, we compared the factors of our scale with those of the MPES (Table 3). All of the developed factors correlated with each of the MPES factors (p < 0.001). High correlations (r > 0.70) were found between “Knowledge of community healthcare” in the C-CEP and “Interest in community health” in the MPES, between “Basic clinical skills” in the C-CEP and “Providing safe, quality care” in the MPES, and between “Knowledge of evidence-based medicine” in the C-CEP and “Interest in community health” in the MPES.

Table 3 Correlation between the C-CEP Scale and the MPES

Discussion

We developed the C-CEP Scale as a novel tool for the self-assessment of community-based clinical practice. Using EFA and CFA, a 15-item questionnaire was developed, and its internal reliability, test-retest reliability, and criterion-related validity were assessed. The results showed that students recognized a learning model of “Community-based Clinical and Emergency Practice” consisting of “Attitude and communication in emergency care,” “Basic clinical skills,” “Knowledge of community healthcare,” and “Knowledge of evidence-based medicine” as essential competencies of their training. Since this scale is a self-assessment with high validity and reliability, it can be expected to reduce educator burden in remote medicine.

According to the CFA results and the comparisons with the MPES, these four factors were considered reliable for self-assessment in community healthcare education. In remote areas, educational resources are limited, so self-learning tools are essential in rural medical education; e-learning and the Internet appear helpful. Notably, self-assessment experiences such as those in our study help learners view their own progress from a bird’s-eye perspective, fostering autonomous development. For students to acquire an attitude of lifelong growth, it would be desirable for them to experience self-assessment using a reliable instrument such as our self-assessment scale.

The MPES, the scale compared with the C-CEP, is a self-assessment scale of a medical doctor’s general capability and professionalism. It consists of 30 items across 7 factors for assessing students before clinical practice. The construct validity, criterion-related validity, and reliability of the MPES have been generally confirmed and widely accepted. Notably, the C-CEP Scale focuses on community-based emergency care capacity. Our self-assessment scale correlated with the MPES because the underlying competencies partially overlap. We believe that using the two scales for their distinct purposes will yield a greater educational effect.

This study has several limitations. First, the placements lasted only 2 weeks, so learners’ motivation and baseline knowledge might have affected the results. Second, the target group was limited to a cohort of students at SMU, so it will be necessary to investigate whether the C-CEP Scale can be used in different cultures or languages. Third, long-term outcomes could not be assessed due to the short observation period of this study. Finally, it is unclear whether the C-CEP can be used when a pandemic such as COVID-19 arises.

Conclusion

The C-CEP Scale comprises 15 items covering four factors and is both valid and reliable. This scale could support clerkship education and may also be used to improve its curricula.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

CBME:

Community-based medical education

C-CEP:

Community-based Clinical and Emergency Practice

EFA:

Exploratory factor analysis

CFA:

Confirmatory factor analysis

AMA:

American Medical Association

AAMC:

Association of American Medical Colleges

BMA:

British Medical Association

GMC:

General Medical Council

MEXT:

Ministry of Education, Culture, Sports, Science and Technology

MHLW:

Ministry of Health, Labour and Welfare

SMU:

Sapporo Medical University

GFI:

Goodness of Fit Index

AGFI:

Adjusted Goodness of Fit Index

CFI:

Comparative Fit Index

RMSEA:

Root Mean Square Error of Approximation

ICC:

Intraclass correlation coefficient

MPES:

Medical Professional Evaluation Scale

SD:

Standard deviation

References

  1. Koike S, Matsumoto S, Kodama T, Ide H, Yasunaga H, Imamura T. Specialty choice and physicians' career paths in Japan: an analysis of National Physician Survey data from 1996 to 2006. Health Policy (Amsterdam, Netherlands). 2010;98:236–44.


  2. Matsumoto M, Inoue K, Bowman R, Noguchi S, Toyokawa S, Kajii E. Geographical distributions of physicians in Japan and US: impact of healthcare system on physician dispersal pattern. Health Policy (Amsterdam, Netherlands). 2010;96:255–61.


  3. Anand S, Fan VY, Zhang J, Zhang L, Ke Y, Dong Z, et al. China's human resources for health: quantity, quality, and distribution. Lancet (London, England). 2008;372:1774–81.


  4. Morris S, Sutton M, Gravelle H. Inequity and inequality in the use of health care in England: an empirical investigation. Soc Sci Med. 2005;60:1251–66.


  5. Isabel C, Paula V. Geographic distribution of physicians in Portugal. Eur J Health Econ. 2010;11:383–93.


  6. WHO Guidelines Approved by the Guidelines Review Committee. Increasing Access to Health Workers in Remote and Rural Areas Through Improved Retention: Global Policy Recommendations. Geneva: World Health Organization; 2010. http://apps.who.int/iris/bitstream/10665/44369/1/9789241564014_eng.pdf. Accessed 11 July 2022


  7. Ash JK, Walters LK, Prideaux DJ, Wilson IG. The context of clinical teaching and learning in Australia. Med J Aust. 2012;196:475.


  8. Peabody C, Block A, Jain S. Multi-disciplinary service learning: a medico-legal collaboration. Med Educ. 2008;42:533–4.


  9. Walmsley L, Fortune M, Brown A. Experiential interprofessional education for medical students at a regional medical campus. Can Med Educ J. 2018;9:e59–67.


  10. Asakawa T, Kawabata H, Kisa K, Terashita T, Murakami M, Otaki J. Establishing community-based integrated care for elderly patients through interprofessional teamwork: a qualitative analysis. J Multidiscip Healthc. 2017;10:399–407.


  11. Somporn P, Walters L, Ash J. Expectations of rural community-based medical education: a case study from Thailand. Rural Remote Health. 2018;18:4709.


  12. Watmough S, Cherry MG, O'Sullivan H. A comparison of self-perceived competencies of traditional and reformed curriculum graduates 6 years after graduation. Med Teach. 2012;34:562–8.


  13. Watmough S. An evaluation of the impact of an increase in community-based medical undergraduate education in a UK medical school. Educ Prim Care. 2012;23:385–90.


  14. Florence JA, Goodrow B, Wachs J, Grover S, Olive KE. Rural health professions education at East Tennessee State University: survey of graduates from the first decade of the community partnership program. J Rural Health. 2007;23:77–83.


  15. Morgan S, Smedts A, Campbell N, Sager R, Lowe M, Strasser S. From the bush to the big smoke--development of a hybrid urban community based medical education program in the Northern Territory, Australia. Rural Remote Health. 2009;9:1175.


  16. Fletcher S, Mullett J, Beerman S. Value of a regional family practice residency training program site: perceptions of residents, nurses, and physicians. Can Fam Physician. 2014;60:e447–54.


  17. Walters L, Seal A, McGirr J, Stewart R, DeWitt D, Playford D. Effect of medical student preference on rural clinical school experience and rural career intentions. Rural Remote Health. 2016;16:3698.


  18. Jones A, McArdle PJ, O'Neill PA. Perceptions of how well graduates are prepared for the role of pre-registration house officer: a comparison of outcomes from a traditional and an integrated PBL curriculum. Med Educ. 2002;36:16–25.


  19. Anderson ES, Lennox AI, Petersen SA. Learning from lives: a model for health and social care education in the wider community context. Med Educ. 2003;37:59–68.


  20. Sinclair HK, Ritchie LD, Lee AJ. A future career in general practice? A longitudinal study of medical students and pre-registration house officers. Eur J Gen Pract. 2006;12:120–7.


  21. Teherani A, Irby DM, Loeser H. Outcomes of different clerkship models: longitudinal integrated, hybrid, and block. Acad Med. 2013;88:35–43.


  22. Tanaka K, Son D. Experiential learning for junior residents as a part of community-based medical education in Japan. Educ Prim Care. 2019;30:282–8.

  23. Daly M, Perkins D, Kumar K, Roberts C, Moore M. What factors in rural and remote extended clinical placements may contribute to preparedness for practice from the perspective of students and clinicians? Med Teach. 2013;35:900–7.


  24. Denz-Penhey H, Campbell MJ. Rural learning is more than marks: sensitised to knowledge. Med Teach. 2008;30:781–6.


  25. Christner JG, Dallaghan GB, Briscoe G, Casey P, Fincher RM, Manfred LM, et al. The community preceptor crisis: recruiting and retaining community-based faculty to teach medical students-a shared perspective from the Alliance for clinical education. Teach Learn Med. 2016;28:329–36.


  26. Mader EM, Roseamelia CA, Lewis SL, Arthur ME, Reed E, Germain LJ. Clinical training in the rural setting: using photovoice to understand student experiences. Rural Remote Health. 2016;16:3877.


  27. Wolff M, Young S, Maurana C. A senior elective: promoting health in underserved communities. Fam Med. 2001;33:732–3.


  28. Glasser M, Stearns J, McCord R. Defining a generalist education: an idea whose time is still coming. Acad Med. 1995;70:S69–74.


  29. Summerlin HH Jr, Landis SE, Olson PR. A community-oriented primary care experience for medical students and family practice residents. Fam Med. 1993;25:95–9.


  30. Heestand Skinner DE, Onoka CA, Ofoebgu EN. Community-based education in Nigerian medical schools: students' perspectives. Educ Health (Abingdon, England). 2008;21:83.


  31. Kaufman A. Rurally based education: confronting social forces underlying ill health. Acad Med. 1990;65:S18–21.


  32. Takamura A, Misaki H, Takemura Y. Community and Interns' perspectives on community-participatory medical education: from passive to active participation. Fam Med. 2017;49:507–13.


  33. Worley P. Relationships: a new way to analyse community-based medical education? (part one). Educ Health (Abingdon, England). 2002;15:117–28.


  34. Noordam AC, Barbera Lainez Y, Sadruddin S, van Heck PM, Chono AO, Acaye GL, et al. The use of counting beads to improve the classification of fast breathing in low-resource settings: a multi-country review. Health Policy Plan. 2015;30:696–704.


  35. Lee SW, Clement N, Tang N, Atiomo W. The current provision of community-based teaching in UK medical schools: an online survey and systematic review. BMJ Open. 2014;4:e005696.


  36. Bowman RC, Penrod JD. Family practice residency programs and the graduation of rural family physicians. Fam Med. 1998;30:288–92.


  37. Pathman DE, Steiner BD, Jones BD, Konrad TR. Preparing and retaining rural physicians through medical education. Acad Med. 1999;74:810–20.


  38. Somporn P, Ash J, Walters L. Stakeholder views of rural community-based medical education: a narrative review of the international literature. Med Educ. 2018;52:791–802.


  39. Thistlethwaite JE, Jordan JJ. Patient-centred consultations: a comparison of student experience and understanding in two clinical environments. Med Educ. 1999;33:678–85.


  40. O'Sullivan M, Martin J, Murray E. Students' perceptions of the relative advantages and disadvantages of community-based and hospital-based teaching: a qualitative study. Med Educ. 2000;34:648–55.


  41. Hillier M, McLeod S, Mendelsohn D, Moffat B, Smallfield A, Arab A, et al. Emergency medicine training in Canada: a survey of medical students' knowledge, attitudes, and preferences. CJEM. 2011;13:251–8, e18–27.


  42. Elam CL, Sauer MJ, Stratton TD, Skelton J, Crocker D, Musick DW. Service learning in the medical curriculum: developing and evaluating an elective experience. Teach Learn Med. 2003;15:194–203.


  43. Recommendations for Clinical Skills Curricula for Undergraduate Medical Education. https://www.stfm.org/media/1363/clinicalskills_oct09qxdpdf.pdf. Accessed 24 June 2022.

  44. The duties of a doctor registered with the General Medical Council. 2016. https://www.gmc-uk.org/ethical-guidance/ethical-guidance-for-doctors/good-medical-practice/duties-of-a-doctor. Accessed 24 June 2022.

  45. General Medical Council for graduates: practical skills and procedures. 2019. https://www.gmc-uk.org/-/media/documents/practical-skills-and-procedures-a4_pdf-78058950.pdf. Accessed 24 June 2022.

  46. Model Core Curriculum for Medical Education in Japan. 2016. https://www.mext.go.jp/component/a_menu/education/detail/__icsFiles/afieldfile/2018/06/18/1325989_30.pdf. Accessed 24 June 2022.

  47. Basic qualities and abilities required of a physician. 2020. https://www.mhlw.go.jp/content/10800000/000719078.pdf. Accessed 24 June 2022.

  48. Yamamoto T, Kawaguchi A, Otsuka Y. Developing the comprehensive medical professionalism assessment scale. MedEdPublish. 2019:1–15. https://mededpublish.org/articles/8-91. Accessed 24 June 2022.

  49. Floyd FJ, Widaman KF. Factor analysis in the development and refinement of clinical assessment instruments. Psychol Assess. 1995;7:286–99.



Acknowledgements

Not applicable.

Funding

None.

Author information

Authors and Affiliations

Authors

Contributions

TW, HS and YT conceived the idea of the study. TY developed the statistical analysis plan and conducted the statistical analyses. TY and WY contributed to the interpretation of the results. TW drafted the original manuscript. HS and YT supervised the conduct of this study. All authors reviewed the manuscript draft and revised it critically for intellectual content. All authors approved the final version of the manuscript for publication.

Corresponding author

Correspondence to Yoshihisa Tsuji.

Ethics declarations

Ethics approval and consent to participate

This research was performed in accordance with the Declaration of Helsinki. The study protocol was approved by the Ethics Committee of Sapporo Medical University (3-1-58), and informed consent was obtained from all participants.

Consent for publication

N/A

Competing interests

None.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Supplementary 1.

Recommendations for Clinical Skills Curricula for Undergraduate Medical Education [43]. Supplementary 2. The duties of a doctor registered with the General Medical Council 2016, UK [44]. Supplementary 3. General Medical Council for graduates 2018 [45]. Supplementary 4. General Medical Council for graduates: Practical skills and procedures-practical 2019 [45]. Supplementary 5. Model Core Curriculum for Medical Education in Japan 2016 [46]. Supplementary 6. Basic qualities and abilities required of a physician 2020 [47]. Supplementary 7. Medical professional evaluation scale [48]

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Wakabayashi, T., Tsuji, Y., Yamamoto, T. et al. Self-assessment scale for the community-based and emergency practice. BMC Med Educ 22, 799 (2022). https://doi.org/10.1186/s12909-022-03848-z


Keywords

  • Clinical clerkship
  • Community-based
  • Emergency practice
  • Primary-care
  • Self-assessment