Self-assessment scale for the community-based and emergency practice
BMC Medical Education volume 22, Article number: 799 (2022)
In current medical education, community-based primary care for the elderly is an essential topic. This study aimed to establish a self-assessment scale for Community-based Clinical and Emergency Practice (C-CEP).
A self-assessment scale for C-CEP was developed in four steps. First, we reviewed publications on educational goals from medical societies in the United States, the United Kingdom, and Japan. In addition, we searched MEDLINE for educational goals regarding attitudes, skills, and knowledge. Taken together, these sources yielded 23 items as the educational goals of the C-CEP. Second, we collected responses to these 23 items from 5th-grade medical students (n = 195). Third, we conducted an exploratory factor analysis (EFA) of their responses to determine the fundamental structure of the self-assessment scale. Finally, a confirmatory factor analysis (CFA) was performed to assess the fit of the model derived from the EFA, resulting in modification of the items.
In the EFA and CFA results, the C-CEP Scale consisted of four factors with 15 items: “Attitude and communication in emergency care,” “Basic clinical skills,” “Knowledge of community healthcare,” and “Knowledge of evidence-based medicine.” The model fit indices were acceptable (Goodness of Fit Index = 0.928, Adjusted Goodness of Fit Index = 0.900, Comparative Fit Index = 0.979, and Root Mean Square Error of Approximation = 0.045). McDonald’s omega, as an estimate of scale reliability, exceeded 0.7 for all four factors. As for test-retest reliability, the intraclass correlation coefficients were ≥ 0.58 for all factors. All four factors of the C-CEP Scale correlated positively with the Medical Professionalism Evaluation Scale subscales.
We developed a valid and reliable self-assessment scale for assessing student competence in community-based clinical and emergency practice.
The world’s aging rate (the proportion of people aged 65 and older) increased from 5.1% in 1950 to 9.0% in 2020. The rate in the United States (US) is estimated to rise to 20% by 2040. In Japan, the rate had already reached about 20% by 2020, and in many remote Japanese areas it exceeds 40%.
Medicine’s goals in areas with many elderly residents are broad, diverse, and complex. Current outcomes of medical education in primary care are often designed based on these goals. However, it has been pointed out that education designed around such complex outcomes can increase educators’ burdens. One reason is that the supervising doctor must teach medical students and residents in remote areas while caring for patients as a primary physician. Medical resources, including human resources, are often limited in such rural areas.
For this reason, an efficient tool to educate learners on primary care for the elderly in rural communities is essential [1,2,3,4,5]. Importantly, as per World Health Organization recommendations, primary care physicians who can care for patients comprehensively are also required. To educate medical students on attitudes, skills, and knowledge in the community, educators must prepare community-based medical education (CBME) programs.
Although the effectiveness of CBME programs in training medical students and residents has been shown [8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34], no tool for self-assessment of competencies in CBME has been fully developed. For example, small studies using focus group interviews to assess these competencies have been published. However, these methods are not simple and may increase the educator’s burden. Preparing enough opportunities for instructors to assess medical students may be difficult in rural areas due to work overload. A well-designed self-assessment method is therefore required in CBME [35,36,37,38,39,40,41,42].
From this point of view, in this study, we aimed to develop a self-assessment scale for Community-based Clinical and Emergency Practice (C-CEP) and to verify its reliability and validity.
This study developed the C-CEP self-assessment scale in four steps. First, we reviewed publications on educational goals from medical societies in the US, the United Kingdom (UK), and Japan. In addition, we searched MEDLINE for educational goals regarding attitudes, skills, and knowledge. Taken together, these sources yielded 23 items as the educational goals of the C-CEP. Second, we collected responses to these 23 items from 5th-grade medical students. Third, we conducted an exploratory factor analysis (EFA) of their responses to determine the fundamental structure of the self-assessment scale. A confirmatory factor analysis (CFA) was then performed to assess the fit of the model derived from the EFA, resulting in modification of items without changes to the factor structure. McDonald’s omega coefficient was calculated to confirm the scale’s internal reliability. Finally, we investigated the validity and reliability of the self-assessment form by comparison with another scale. The Ethics Committee of Sapporo Medical University (SMU) approved this study protocol (3-1-58).
The items on the self-assessment scale
To create the C-CEP self-assessment scale, we selected goals published by academic societies of medical education, so that the scale would reflect diversifying educational goals worldwide. We reviewed publications on educational goals from societies in the US, the UK, and Japan: the American Medical Association (AMA) and the Association of American Medical Colleges (AAMC); the British Medical Association (BMA) and the British General Medical Council (GMC); and the Japanese government, specifically the Ministry of Education, Culture, Sports, Science and Technology (MEXT) and the Ministry of Health, Labour and Welfare (MHLW) (Supplementary 1-6). Moreover, we searched MEDLINE from 2008 to 2020 using the following keywords: medical education, community-based medical education, community-oriented medical education, emergency medicine, and primary care. Based on the MEDLINE search and the reviews of the international educational goals shown above, we established 23 items as the educational goals of the C-CEP.
Responses to these 23 items were recorded for each student on a 5-point self-assessment scale (5 = strongly agree, 4 = relatively agree, 3 = equivocal, 2 = relatively disagree, 1 = strongly disagree).
Survey of medical students in clinical clerkship
We conducted a cross-sectional survey of 5th-grade medical students in clinical clerkship at SMU between 2015 and 2016. All of these students participated in a two-week community-based clinical medicine educational program. The students assessed themselves using the 23-item self-assessment form before the start of the clerkship program.
Exploratory factor analysis for the fundamental constructs of the question items and confirmatory factor analysis for the goodness of fit
We used the minimum average partial test (MAP) and the Bayesian Information Criterion (BIC) to determine the number of factors. Then, an EFA using the principal factor method with Promax rotation was performed to clarify the underlying factor structure. We excluded items whose factor loadings were all < 0.3. When an item loaded > 0.4 on multiple factors, or loaded > 0.4 on one factor and > 0.3 on another, it was eliminated sequentially. The EFA was then repeated to determine the factor structure; when an item’s loadings on all four factors were less than 0.35, the item was eliminated sequentially and the factor analysis was repeated. A factor was defined as having at least three items.
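These elimination rules can be expressed as a simple filter over a factor-loading matrix. The sketch below is illustrative only: the item names and loadings are invented, and the actual analysis was run in SPSS and R, but it makes the decision logic concrete.

```python
# Sketch of the item-elimination rules applied to a factor-loading matrix.
# Loadings are invented for illustration, not the study's actual EFA output.

def keep_item(loadings, low=0.3, high=0.4):
    """Return True if an item survives the elimination rules:
    - drop if every loading is < 0.3 (no factor explains the item),
    - drop if it loads > 0.4 on multiple factors (strong cross-loading),
    - drop if it loads > 0.4 on one factor and > 0.3 on another.
    (The later < 0.35 pass in the repeated EFA works analogously.)"""
    if all(abs(l) < low for l in loadings):
        return False
    strong = sum(1 for l in loadings if abs(l) > high)
    moderate = sum(1 for l in loadings if abs(l) > low)
    if strong >= 2:                    # loads strongly on multiple factors
        return False
    if strong == 1 and moderate >= 2:  # one strong plus another moderate loading
        return False
    return True

items = {
    "item_a": [0.62, 0.10, 0.05, 0.08],  # clean single loading -> keep
    "item_b": [0.45, 0.44, 0.02, 0.01],  # two loadings > 0.4   -> drop
    "item_c": [0.50, 0.32, 0.05, 0.04],  # > 0.4 and > 0.3      -> drop
    "item_d": [0.20, 0.15, 0.10, 0.05],  # all < 0.3            -> drop
}
kept = [name for name, l in items.items() if keep_item(l)]
print(kept)  # -> ['item_a']
```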
For the factors extracted by the EFA, we adopted a second-order factor model because of its clarity of interpretation. We hypothesized that the factors identified by the EFA would be explained by the higher-order factor “Community-based Clinical and Emergency Practice” in students who participated in the CBME program at SMU. A CFA using covariance structure analysis was performed to verify the construct validity of the created scale using responses from 5th-grade students. Based on the theoretical model, the Goodness of Fit Index (GFI) and Adjusted Goodness of Fit Index (AGFI) were used as goodness-of-fit indices. The Comparative Fit Index (CFI) and Root Mean Square Error of Approximation (RMSEA) were used as criterion comparison indices. Items were deleted as appropriate to improve the model based on the results, without changes to the factor structure. Model refinement was complete when the model had the highest goodness of fit and met the criterion comparison indices. After the model was refined, McDonald’s omega coefficient was calculated to check the internal consistency of the constructed scale.
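The fit and reliability statistics named here can be computed from summary quantities. The sketch below is a hedged illustration with invented numbers (the study used SPSS and R): RMSEA follows from the model chi-square, degrees of freedom, and sample size, and McDonald’s omega for a single factor follows from its standardized loadings.

```python
import math

def mcdonalds_omega(loadings):
    """McDonald's omega for one factor from standardized loadings:
    omega = (sum of loadings)^2 / ((sum)^2 + sum of residual variances),
    where each residual variance is 1 - loading^2."""
    s = sum(loadings)
    resid = sum(1 - l ** 2 for l in loadings)
    return s ** 2 / (s ** 2 + resid)

def rmsea(chi2, df, n):
    """Root Mean Square Error of Approximation:
    sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Illustrative inputs only, not the study's actual model output:
print(round(mcdonalds_omega([0.9, 0.88, 0.92]), 3))  # -> 0.928
print(round(rmsea(chi2=120.0, df=84, n=195), 3))     # -> 0.047
```

Values of omega above 0.7 and RMSEA below about 0.05, as reported for the final 15-item model, are conventionally read as adequate reliability and close fit.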
Validity and reliability of the self-assessment scale
A survey was conducted in 2016 with 4th-grade medical students at SMU to validate the reliability of the C-CEP Scale. The students, who participated in the 2-week C-CEP program, were surveyed before and after the program. First, scores for the factors extracted by the CFA were obtained from the pre- and post-program questionnaires. Second, test-retest reliability was assessed using the intraclass correlation coefficient (ICC). To assess validity (Supplement 7), we compared the pre-program C-CEP scores with those from the Medical Professionalism Evaluation Scale (MPES). Four MPES factors, “collaboration,” “providing safe, quality care,” “reflective practice,” and “interest in community health,” were selected for comparison with the C-CEP. We analyzed correlations between scores from the C-CEP Scale and those from these four MPES factors.
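Test-retest reliability via ICC(1,1) corresponds to a one-way random-effects ANOVA on paired measurements. The following is a minimal sketch with invented pre/post scores (not the study’s data) showing how the coefficient is assembled from between-subject and within-subject mean squares.

```python
def icc_1_1(pre, post):
    """ICC(1,1): one-way random-effects intraclass correlation for
    test-retest data with k = 2 measurements per subject.
    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    n, k = len(pre), 2
    grand = sum(pre + post) / (n * k)
    subj_means = [(a + b) / 2 for a, b in zip(pre, post)]
    # Between-subjects mean square
    msb = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    # Within-subjects mean square
    msw = sum((a - m) ** 2 + (b - m) ** 2
              for a, b, m in zip(pre, post, subj_means)) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Invented factor scores for six students, before and after the program:
pre  = [3.0, 4.0, 2.0, 5.0, 3.5, 4.5]
post = [3.5, 4.0, 2.5, 4.5, 3.0, 5.0]
print(round(icc_1_1(pre, post), 3))  # -> 0.897
```

By the usual conventions, ICC values of 0.61-0.80 indicate substantial reliability and 0.41-0.60 moderate reliability, which is how the factor-level ICCs reported in the Results are interpreted.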
Before the students completed the questionnaires, we explained that their grades and credits would not be affected by whether or not they participated. Written informed consent was obtained; the consent form stated that the survey was anonymous and voluntary and that all data would be deleted after use in this research. We numbered the questionnaires and linked the records when examining test-retest reliability. After that, the results were anonymized.
Descriptive statistics are presented as mean ± standard deviation (SD). A p-value of < 0.05 was deemed to indicate statistical significance. A certain amount of data is necessary to obtain reliable results in CFA. Generally, the minimum number of subjects to ensure the stability of the variance-covariance matrix is 100, with 4 to 10 subjects per variable. We set the number of subjects per item at 7. Since our questionnaire consisted of 23 items, we aimed for a minimum of 161 participants. All statistical analyses were performed using SPSS Statistics (IBM SPSS Statistics for Windows, Version 22.0, IBM Corp., Armonk, NY) and R (version 4.2.1).
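The sample-size target is simple arithmetic: with 4 to 10 subjects per variable and 7 chosen, 23 items imply at least 161 participants, which the analyzed sample of 195 satisfies.

```python
# Minimum CFA sample size: 7 subjects per item, 23 items.
items, per_item = 23, 7
minimum_n = items * per_item
analyzed_n = 195  # 5th-grade students included in the analysis
print(minimum_n, analyzed_n >= minimum_n)  # -> 161 True
```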
The items on the self-assessment scale
The 23 items of the C-CEP (Table 1) fully matched the goals of the medical students’ core curriculum and the objectives for Japanese residents (MEXT and MHLW). Comparing these 23 items to the goals/objectives for medical students and residents in the US, 87.0% (20/23) of the medical students’ goals and 100% of the residents’ objectives were matched. Compared to the UK, 100% of the medical students’ goals and 21.7% (5/23) of the residents’ objectives were matched.
Survey of medical students in clinical clerkship
A cross-sectional survey of 237 5th-grade medical students at SMU was conducted between 2015 and 2016. Of these 237 students, 42 were excluded due to missing data, and 195 were included in the analysis. The means and standard deviations for each item, together with the item-total correlation analysis, are shown in Table 2. None of the items showed ceiling or floor effects.
Exploratory factor analysis for the fundamental constructs of the question items and confirmatory factor analysis for the goodness of fit
The MAP and BIC suggested retaining three and six factors, respectively. Therefore, the four- and five-factor solutions were examined sequentially. In the EFA of self-assessment results from students who participated in the pre-clinical clerkship program (Table 2), items 4, 12, and 15 were eliminated. Thus, we extracted four factors consisting of 20 items; each factor had at least three items, and no factors were deleted. This resulted in a second-order factor model consisting of four factors.
We then conducted a CFA on the 20-item model generated from the EFA, using the results of our 5th-grade students. However, among the goodness-of-fit indices, the CFI was low at 0.927 and the RMSEA was high at 0.073; therefore, the goodness-of-fit criteria were not met when 20 items were used as latent variables. After further analysis, items 6, 8, 11, 13, and 17 were eliminated, and we extracted four factors comprising 15 items. All coefficients (standardized estimates) were significant at the 5% level. The goodness-of-fit indices were GFI = 0.928, AGFI = 0.900, CFI = 0.979, and RMSEA = 0.045, indicating a satisfactory fit. All factors showed good coefficients with the higher-order model (Fig. 1). McDonald’s omega coefficients were 0.933 for “Attitude and communication in emergency care” (3 items), 0.832 for “Basic clinical skills” (4 items), 0.864 for “Knowledge of community healthcare” (5 items), and 0.700 for “Knowledge of evidence-based medicine” (3 items).
Validation of the reliability of the self-assessment scale
Of the 113 4th-grade medical students, 79 were included in this study; 17 were excluded because they did not consent to participate. In the test-retest reliability analysis, “Attitude and communication in emergency care,” “Basic clinical skills,” and “Knowledge of community healthcare” showed substantial reliability (ICC (1,1) = 0.712, 0.659, and 0.631, respectively; p < 0.001). Only “Knowledge of evidence-based medicine” showed moderate reliability (ICC (1,1) = 0.589, p < 0.001). Finally, we compared the factors of our scale with those of the MPES (Table 3). All of the developed factors correlated with each of the MPES subscales (p < 0.001). High correlations (r > 0.70) were found between “Knowledge of community healthcare” or “Knowledge of evidence-based medicine” in the C-CEP and “interest in community health” in the MPES, and between “Basic clinical skills” in the C-CEP and “providing safe, quality care” in the MPES.
We developed the C-CEP Scale as a novel tool for self-assessment in community-based clinical practice. Using EFA and CFA, we developed a 15-item questionnaire and assessed its internal reliability, test-retest reliability, and criterion-related validity. The results showed that students recognized a learning model of “Community-based Clinical and Emergency Practice” consisting of “Attitude and communication in emergency care,” “Basic clinical skills,” “Knowledge of community healthcare,” and “Knowledge of evidence-based medicine” as essential competencies of their training. Since this scale is a self-assessment with high validity and reliability, it can be expected to reduce educator burden in remote medicine education.
According to the results of the CFA and the comparisons with the MPES, these four factors have adequate reliability for self-assessment in community healthcare education. In remote areas, educational resources are limited, so self-learning tools are essential in rural medical education; e-learning and the Internet seem helpful. Notably, self-assessment experiences such as those in our study help learners train themselves to take a bird’s-eye view of their own learning, leading to autonomous development. For students to acquire an attitude of lifelong growth, it is desirable for them to experience self-assessment using a reliable instrument such as our self-assessment scale.
The MPES, which is comparable to the C-CEP, is a self-assessment scale of a medical doctor’s general capability and professionalism. It consists of 30 items across 7 factors for assessing students before clinical practice. The construct validity, criterion-related validity, and reliability of the MPES have been generally confirmed and widely accepted. Notably, the C-CEP Scale focuses on community-based emergency care capacity; it correlated with the MPES because the underlying competencies partially overlap. We believe that using the two scales for different purposes will have a greater educational effect.
This study has several limitations. First, the placements lasted only 2 weeks, so the learners’ motivational readiness or fundamental knowledge might have affected the results. Second, the target group was limited to a cohort of students at SMU, so it will be necessary to investigate whether the C-CEP Scale can be used in different cultures or languages. Third, long-term outcomes could not be assessed due to the short observation period of this study. Finally, it is unclear whether the C-CEP can be used when a pandemic such as COVID-19 arises.
The C-CEP Scale comprises 15 items covering four factors and is both valid and reliable. This scale would support clerkship education and may also be used to improve clerkship curricula.
Availability of data and materials
The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.
CBME: Community-based medical education
C-CEP: Community-based Clinical and Emergency Practice
EFA: Exploratory factor analysis
CFA: Confirmatory factor analysis
AMA: American Medical Association
AAMC: Association of American Medical Colleges
BMA: British Medical Association
GMC: General Medical Council
MEXT: Ministry of Education, Culture, Sports, Science and Technology
MHLW: Ministry of Health, Labour and Welfare
SMU: Sapporo Medical University
GFI: Goodness of Fit Index
AGFI: Adjusted Goodness of Fit Index
CFI: Comparative Fit Index
RMSEA: Root Mean Square Error of Approximation
ICC: Intraclass correlation coefficient
MPES: Medical Professionalism Evaluation Scale
Koike S, Matsumoto S, Kodama T, Ide H, Yasunaga H, Imamura T. Specialty choice and physicians' career paths in Japan: an analysis of National Physician Survey data from 1996 to 2006. Health Policy (Amsterdam, Netherlands). 2010;98:236–44.
Matsumoto M, Inoue K, Bowman R, Noguchi S, Toyokawa S, Kajii E. Geographical distributions of physicians in Japan and US: impact of healthcare system on physician dispersal pattern. Health Policy (Amsterdam, Netherlands). 2010;96:255–61.
Anand S, Fan VY, Zhang J, Zhang L, Ke Y, Dong Z, et al. China's human resources for health: quantity, quality, and distribution. Lancet (London, England). 2008;372:1774–81.
Morris S, Sutton M, Gravelle H. Inequity and inequality in the use of health care in England: an empirical investigation. Soc Sci Med. 2005;60:1251–66.
Isabel C, Paula V. Geographic distribution of physicians in Portugal. Eur J Health Econ. 2010;11:383–93.
WHO Guidelines Approved by the Guidelines Review Committee. Increasing Access to Health Workers in Remote and Rural Areas Through Improved Retention: Global Policy Recommendations. Geneva: World Health Organization; 2010. http://apps.who.int/iris/bitstream/10665/44369/1/9789241564014_eng.pdf. Accessed 11 July 2022
Ash JK, Walters LK, Prideaux DJ, Wilson IG. The context of clinical teaching and learning in Australia. Med J Aust. 2012;196:475.
Peabody C, Block A, Jain S. Multi-disciplinary service learning: a medico-legal collaboration. Med Educ. 2008;42:533–4.
Walmsley L, Fortune M, Brown A. Experiential interprofessional education for medical students at a regional medical campus. Can Med Educ J. 2018;9:e59–67.
Asakawa T, Kawabata H, Kisa K, Terashita T, Murakami M, Otaki J. Establishing community-based integrated care for elderly patients through interprofessional teamwork: a qualitative analysis. J Multidiscip Healthc. 2017;10:399–407.
Somporn P, Walters L, Ash J. Expectations of rural community-based medical education: a case study from Thailand. Rural Remote Health. 2018;18:4709.
Watmough S, Cherry MG, O'Sullivan H. A comparison of self-perceived competencies of traditional and reformed curriculum graduates 6 years after graduation. Med Teach. 2012;34:562–8.
Watmough S. An evaluation of the impact of an increase in community-based medical undergraduate education in a UK medical school. Educ Prim Care. 2012;23:385–90.
Florence JA, Goodrow B, Wachs J, Grover S, Olive KE. Rural health professions education at East Tennessee State University: survey of graduates from the first decade of the community partnership program. J Rural Health. 2007;23:77–83.
Morgan S, Smedts A, Campbell N, Sager R, Lowe M, Strasser S. From the bush to the big smoke--development of a hybrid urban community based medical education program in the Northern Territory, Australia. Rural Remote Health. 2009;9:1175.
Fletcher S, Mullett J, Beerman S. Value of a regional family practice residency training program site: perceptions of residents, nurses, and physicians. Can Fam Physician. 2014;60:e447–54.
Walters L, Seal A, McGirr J, Stewart R, DeWitt D, Playford D. Effect of medical student preference on rural clinical school experience and rural career intentions. Rural Remote Health. 2016;16:3698.
Jones A, McArdle PJ, O'Neill PA. Perceptions of how well graduates are prepared for the role of pre-registration house officer: a comparison of outcomes from a traditional and an integrated PBL curriculum. Med Educ. 2002;36:16–25.
Anderson ES, Lennox AI, Petersen SA. Learning from lives: a model for health and social care education in the wider community context. Med Educ. 2003;37:59–68.
Sinclair HK, Ritchie LD, Lee AJ. A future career in general practice? A longitudinal study of medical students and pre-registration house officers. Eur J Gen Pract. 2006;12:120–7.
Teherani A, Irby DM, Loeser H. Outcomes of different clerkship models: longitudinal integrated, hybrid, and block. Acad Med. 2013;88:35–43.
Tanaka K, Son D. Experiential learning for junior residents as a part of community-based medical education in Japan. Education for primary care. Educ Prim Care. 2019;30:282-8.
Daly M, Perkins D, Kumar K, Roberts C, Moore M. What factors in rural and remote extended clinical placements may contribute to preparedness for practice from the perspective of students and clinicians? Med Teach. 2013;35:900–7.
Denz-Penhey H, Campbell MJ. Rural learning is more than marks: sensitised to knowledge. Med Teach. 2008;30:781–6.
Christner JG, Dallaghan GB, Briscoe G, Casey P, Fincher RM, Manfred LM, et al. The community preceptor crisis: recruiting and retaining community-based faculty to teach medical students-a shared perspective from the Alliance for clinical education. Teach Learn Med. 2016;28:329–36.
Mader EM, Roseamelia CA, Lewis SL, Arthur ME, Reed E, Germain LJ. Clinical training in the rural setting: using photovoice to understand student experiences. Rural Remote Health. 2016;16:3877.
Wolff M, Young S, Maurana C. A senior elective: promoting health in underserved communities. Fam Med. 2001;33:732–3.
Glasser M, Stearns J, McCord R. Defining a generalist education: an idea whose time is still coming. Acad Med. 1995;70:S69–74.
Summerlin HH Jr, Landis SE, Olson PR. A community-oriented primary care experience for medical students and family practice residents. Fam Med. 1993;25:95–9.
Heestand Skinner DE, Onoka CA, Ofoebgu EN. Community-based education in Nigerian medical schools: students' perspectives. Educ Health (Abingdon, England). 2008;21:83.
Kaufman A. Rurally based education: confronting social forces underlying ill health. Acad Med. 1990;65:S18–21.
Takamura A, Misaki H, Takemura Y. Community and Interns' perspectives on community-participatory medical education: from passive to active participation. Fam Med. 2017;49:507–13.
Worley P. Relationships: a new way to analyse community-based medical education? (part one). Educ Health (Abingdon, England). 2002;15:117–28.
Noordam AC, Barbera Lainez Y, Sadruddin S, van Heck PM, Chono AO, Acaye GL, et al. The use of counting beads to improve the classification of fast breathing in low-resource settings: a multi-country review. Health Policy Plan. 2015;30:696–704.
Lee SW, Clement N, Tang N, Atiomo W. The current provision of community-based teaching in UK medical schools: an online survey and systematic review. BMJ Open. 2014;4:e005696.
Bowman RC, Penrod JD. Family practice residency programs and the graduation of rural family physicians. Fam Med. 1998;30:288–92.
Pathman DE, Steiner BD, Jones BD, Konrad TR. Preparing and retaining rural physicians through medical education. Acad Med. 1999;74:810–20.
Somporn P, Ash J, Walters L. Stakeholder views of rural community-based medical education: a narrative review of the international literature. Med Educ. 2018;52:791–802.
Thistlethwaite JE, Jordan JJ. Patient-centred consultations: a comparison of student experience and understanding in two clinical environments. Med Educ. 1999;33:678–85.
O'Sullivan M, Martin J, Murray E. Students' perceptions of the relative advantages and disadvantages of community-based and hospital-based teaching: a qualitative study. Med Educ. 2000;34:648–55.
Hillier M, McLeod S, Mendelsohn D, Moffat B, Smallfield A, Arab A, et al. Emergency medicine training in Canada: a survey of medical students' knowledge, attitudes, and preferences. CJEM. 2011;13:251–8, e18–27.
Elam CL, Sauer MJ, Stratton TD, Skelton J, Crocker D, Musick DW. Service learning in the medical curriculum: developing and evaluating an elective experience. Teach Learn Med. 2003;15:194–203.
Recommendations for Clinical Skills Curricula for Undergraduate Medical Education. https://www.stfm.org/media/1363/clinicalskills_oct09qxdpdf.pdf. Accessed 24 June 2022.
The duties of a doctor registered with the General Medical Council 2016. https://www.gmc-uk.org/ethical-guidance/ethical-guidance-for-doctors/good-medical-practice/duties-of-a-doctor. Accessed 24 June 2022.
General Medical Council for graduates: Practical skills and procedures-practical 2019. https://www.gmc-uk.org/-/media/documents/practical-skills-and-procedures-a4_pdf-78058950.pdf. Accessed 24 June 2022.
Model Core Curriculum for Medical Education in Japan 2016. https://www.mext.go.jp/component/a_menu/education/detail/__icsFiles/afieldfile/2018/06/18/1325989_30.pdf. Accessed 24 June 2022.
Basic qualities and abilities required of a physician 2020. https://www.mhlw.go.jp/content/10800000/000719078.pdf. Accessed 24 June 2022.
Yamamoto T, Kawaguchi A, Otsuka Y. Developing the comprehensive medical professionalism assessment scale. MedEdPublish. 2019:1–15. https://mededpublish.org/articles/8-91. Accessed 24 June 2022.
Floyd FJ, Widaman KF. Factor analysis in the development and refinement of clinical assessment instruments. Psychol Assess. 1995;7:286–99.
Ethics approval and consent to participate
This research was performed in accordance with the Declaration of Helsinki; the study protocol was approved by the Ethics Committee of Sapporo Medical University (3-1-58), and informed consent was obtained from all participants in the study.
Supplementary 1. Recommendations for Clinical Skills Curricula for Undergraduate Medical Education. Supplementary 2. The duties of a doctor registered with the General Medical Council 2016, UK. Supplementary 3. General Medical Council for graduates 2018. Supplementary 4. General Medical Council for graduates: Practical skills and procedures-practical 2019. Supplementary 5. Model Core Curriculum for Medical Education in Japan 2016. Supplementary 6. Basic qualities and abilities required of a physician 2020. Supplementary 7. Medical Professionalism Evaluation Scale.
Wakabayashi, T., Tsuji, Y., Yamamoto, T. et al. Self-assessment scale for the community-based and emergency practice. BMC Med Educ 22, 799 (2022). https://doi.org/10.1186/s12909-022-03848-z
Keywords: Clinical clerkship, Emergency practice