Let’s ask the patient – composition and validation of a questionnaire for patients’ feedback to medical students

Abstract

Background

Adequate communication and maintaining a patient-centered approach throughout patient encounters are important skills for medical students to develop. Feedback is often provided by clinical teachers. Patients are seldom asked to provide feedback to students that systematically addresses knowledge and skills regarding communication and patient-centeredness during an encounter. One way for patients to provide feedback to students is through a questionnaire; there is, however, a lack of such validated feedback questionnaires. This study aimed to compose and validate a feedback questionnaire for patients’ feedback to medical students regarding students’ ability to communicate and apply patient-centeredness in clinical practice.

Method

This study comprises (a) the composition and (b) the validation of the questionnaire. The composition included (1) a literature review, (2) selection and composition of items and construction of an item pool, (3) a test of the items’ content, and (4) a test of the applicability of the questionnaire. The items originated from the Calgary-Cambridge Guide (Kurtz S, Silverman J, Benson J and Draper J, Acad Med 78:802-809, 2003), the ‘Swedish National Patient Survey’ (National Patient Survey, Primary Health Care, 2020), the patient evaluation form by Braend et al. (Tidsskr Nor Laegeforen 126:2122–5, 2006), and additional newly developed items. The items were further developed after feedback from 65 patients, 22 students, eight clinical supervisors, and six clinical teachers. The validation process included 246 patients who provided feedback to 80 students. Qualitative content analysis and psychometric methods were used; exploratory factor analysis assessed internal validity, and Cronbach’s alpha was used to test the reliability of the items.

Results

The process resulted in the 19-item ‘Patient Feedback in Clinical Practice’ (PFCP) questionnaire. Assessment of construct validity revealed two dimensions: consultational approach and transfer of information. Internal consistency was high. Thematic analysis resulted in three themes: ability to capture the personal agenda of the consultation, alignment with the consultation, and construct and characteristics. Students reported that the PFCP questionnaire provided useful feedback that could facilitate their learning in clinical practice.

Conclusions

The results of this study indicate that the questionnaire is a valid, reliable, and internally consistent instrument for patients’ feedback to medical students. The participants found the questionnaire to be useful for the provision of feedback in clinical practice. However, further studies are required regarding the PFCP questionnaire’s applicability as a feedback tool in workplace learning.

Background

Patient-centeredness is considered a key component in achieving high-quality care and increasing patients’ participation in their own healthcare [1,2,3,4]. A patient-centered approach and related working methods include a framework for dialogue [5], a transfer of knowledge, patient and physician autonomy, and consultation skills [6]. The conceptualisation of patient-centered care has developed over time, and various aspects and dimensions of the consultation have been enhanced [7,8,9,10]. Frameworks for patient-centered care have served as bases for evaluating various patient perspectives and experiences of healthcare [11]. Research has shown that patients would like to be more involved during the patient encounter [11], and that a focus on patients’ experience and knowledge during the encounter can contribute to more patient-centered care [12]. In 2015, a new version of the Swedish Patient Act was presented to further strengthen patients’ role as collaborators in their own care [1]. Despite quality improvements and educational interventions, measurements and reports have repeatedly identified areas for improvement in patient care, including dimensions such as consent, participation, and information [2, 3].

Research and reports have highlighted communication methodologies as tools for improving dialogue, patients’ satisfaction, and patients’ participation in their own care [13]. Communication and clinical skills are important core competencies for medical students to practice and develop [13,14,15]. Undergraduate medical education often involves early training in clinical skills and communication. Despite students’ education and training in communication and patient-centeredness, research has shown that students’ attitudes often shift from being patient-centered early in education towards a more traditional, doctor-centered, paternalistic approach later on [16, 17]. However, through specific and continuous education, students can develop and maintain the ability to apply a patient-centered approach throughout their education [17]. Students’ clinical supervisors are often the main providers of feedback regarding communication and patient-centeredness [18]. Patients’ participation in medical education can facilitate students’ abilities to evaluate their own communication and patient-centeredness proficiencies, while positively influencing patients’ experiences of healthcare [19, 20]. Traditionally, patients’ involvement in medical education has been passive and objectified, with patients serving to illustrate specific conditions or clinical findings [20]. Over the last decades, patients’ involvement at various levels of medical education has evolved [21, 22]. However, patients seldom participate on a regular basis in medical students’ learning by providing feedback that directly targets knowledge and skills regarding communication and patient-centeredness from the patient’s perspective during a patient encounter [19, 23]. Research has shown that medical students often experience patients’ written feedback as generally encouraging, moderate, and positive [19] and only seldom as a substantial source of information that identifies students’ knowledge levels and knowledge gaps [24, 25].

Patient questionnaires provide an opportunity for patients to give feedback in medical education. Internationally, a plethora of questionnaires have been developed for patients’ feedback to healthcare providers and to medical doctors during residency [14, 26,27,28]. However, only a few questionnaires have been developed for patients’ feedback to medical students [25, 29, 30]. Previous questionnaires often target patients’ delayed, anonymous, nonspecific, and global feedback regarding experiences of provided healthcare. Moreover, these questionnaires have often included questions that asked patients to rate a student’s overall performance in certain domains, rather than concrete questions about patients’ subjective experience of the respective part of the consultation. For example, a currently available questionnaire may include a question such as, ‘Was the student telling you what you wanted to know about your symptoms and/or illness?’ [31], rather than asking the patient, ‘Did the student provide information about your symptoms and/or illness?’ and ‘Did the student provide information about your symptoms and/or illness in a way you understood?’ in order to make the feedback more actionable for the student. Patients have also often been asked to perform a global assessment of students as future healthcare workers through questionnaire items such as, ‘Imagine for a second that you could change to a new dedicated family physician; would you consider changing to he/she (the student) as your physician?’ [29]. However, while this type of feedback can be gratifying for students to receive, it is not actionable as a learning tool [32]. Furthermore, the content and structure of existing questionnaires are not in direct alignment with learning goals regarding patient-centeredness in medical education [7, 10].

The aim of this study was to compose and validate a feedback questionnaire for patients’ feedback to medical students regarding students’ ability to communicate and apply patient-centeredness in clinical practice.

Methods

Settings

This study was conducted between March 2016 and May 2018 at primary health care (PHC) centres in Stockholm County. To address the study’s aim, this research comprised (a) composition of the questionnaire and (b) validation of the questionnaire.

Context

Within the medical programme at Karolinska Institutet (KI), Stockholm, Sweden, students perform clinical rotations at PHC centres during semesters one to 11 (excluding semesters eight and 10). These placements last between four and seven days per semester. Students’ early clinical training focuses on patients’ agendas and clinical examination [10]. Clinical reasoning during students’ early training is often addressed in dialogue between supervisors and students. The information that students receive by eliciting patients’ agendas is incorporated into the supervisor’s clinical reasoning and mutual agreement with the patient, in order to highlight, early in the students’ learning, the importance of patient-centeredness as an entity throughout the patient encounter. Starting in semester three, students initiate training in clinical reasoning in collaboration with their clinical supervisor. In semesters five and six, students initiate training, under supervision, in reaching a mutual agreement with patients. During semesters nine to 11, students perform the entire patient encounter under supervision.

Participants

The participants in this study were medical students from semesters two, four to seven, nine, and 11 who performed clinical practice at PHC centres, and patients at these centres. Patients aged 18 years and older were eligible for inclusion; patients with dementia, cognitive disabilities, or mental disorders were excluded.

Heads of PHC centres, medical students, and clinical supervisors were invited to participate by e-mail. Patients were invited to participate at the PHC centres through oral and written information and gave consent before the encounter.

Composition of the questionnaire

The composition of the questionnaire included: (1) a literature review, (2) selection and composition of items and construction of an item pool, (3) test of items’ content, and (4) test of the applicability of the questionnaire (Fig. 1) [33, 34]. The ‘Consensus-Based Standards for the Selection of Health Measurement Instruments’ (COSMIN checklist) [35] was used as a guide in the composition process.

Fig. 1 Flowchart of the (A) composition and (B) validation of the PFCP questionnaire

Literature review

A literature review was undertaken to identify existing questionnaires designed for patient feedback to medical students, residents, and specialists, with a focus on communication and patient-centeredness [36]. Before the literature review, key concepts were defined in order to target current models for communication and patient-centeredness that were in alignment with students’ learning goals.

Key concepts for the literature review were defined using:

  • the ‘Swedish National Patient Survey’ (information, knowledge, involvement, participation, respect, and attitude) [26]

  • the National Board of Health and Welfare guidelines for person-centered care [12]

  • Calgary-Cambridge guide [7]

  • the Pendleton model [9]

  • the generic model of doctor-patient communication developed at Maastricht Medical School [10]

  • the learning goals for Swedish medical education at KI regarding communication and patient-centeredness, and models of learning and training in workplace-based education [37]

A literature search was performed on PubMed, Web of Science, and Google Scholar. The following MeSH (medical subject headings) terms and key concepts were used: ‘medical education’, ‘assessment’, ‘patient feedback’, ‘patient satisfaction’, ‘communication skills’, ‘questionnaire’, ‘patient-centeredness’, ‘clinical competence’, ‘medical student’, ‘student learning’, and ‘family practice’.

Based on the literature review, inclusion and exclusion criteria for patient feedback questionnaires and aspects of patient-centeredness and communication were determined (Table 1).

Table 1 Inclusion and exclusion criteria for patient feedback questionnaires

The literature review yielded 841 articles, from which 68 patient feedback questionnaires were identified based on the inclusion and exclusion criteria. Three of the questionnaires were intended for patients’ feedback to medical students [25, 29, 30], and 65 were forms for patient feedback to specific clinics, doctors, and residents, of which 12 had been designed for educational purposes. None of the identified questionnaires included items that were all in alignment with the inclusion criteria of the current study.

Selection and composition of items and construction of an item pool

Items in the 68 questionnaires that were in alignment with the study inclusion criteria, with content relevant to Swedish medical care and education in the dimensions of information, knowledge, involvement, and participation [25], served as the basis for the selection process. All questionnaires included several items with similar content in alignment with the study inclusion criteria, e.g. ‘Were you involved as much as you wanted to be in decisions about your care and treatment?’ [26, 38]. All questionnaires also included items that explored both patients’ satisfaction with and experience of a consultation. Items that measured more than one aspect, items that provided non-concrete feedback (e.g. judgmental adjectives), and items that included verbs describing emotions (e.g. ‘Do you feel this doctor listened to you?’) [38] were excluded in the selection process. The Swedish National Patient Survey and the learning objectives of the KI medical programme served as important guideline documents in order to enable possible comparisons in future studies. The selection process resulted in 41 questionnaires for item content analysis by two of the study authors (K.B. and C.L.). A subsequent reduction process yielded seven questionnaires, whose items were all documented in a spreadsheet. This list was reviewed by an expert group comprising four clinical lecturers at the Division of Family Medicine and Primary Care, KI. These experts were responsible for teaching patient-centered communication techniques in the medical programme at KI, as well as in residency and CME courses at a national level. The experts read the list repeatedly before selecting the items that they perceived to correspond best to a given domain in the intended learning outcomes regarding communication and patient-centeredness. The result was then processed repeatedly by the team of authors, and the final selection of items was confirmed by the expert group through consensus.

In total, 27 items were selected from (a) the ‘Swedish National Patient Survey’ (n = 12) [26], (b) the Calgary-Cambridge Guide (n = 11) [7], and (c) the patient evaluation form by Braend et al. (n = 1) [29]. Three complementary items were developed in order to include aspects of patients’ perceived experience of the consultation, participation, and information throughout the patient encounter, according to the inclusion criteria – for example, ‘Did the student ask if the information you were given was interpretable?’ [34]. Items were chosen and worded with the intention of reducing the ceiling effect of the patients’ assessments. The selected items were reframed and modified – for example, reoriented towards the patient’s subjective perception of the encounter – and worded as open-ended, direct questions (e.g. ‘Did you have the opportunity to tell the student in your own words about your problem?’ was changed to ‘Did you have the opportunity to explain, in your own words, the reason for your visit, or what has happened since you last visited the doctor?’).

Test of items’ content

To determine how well the 27 items’ content captured the intended aspects of patient-centered communication (face validity) and allowed patients to provide, and students to receive, feedback about the patient encounter, discussions were held with a group of four content experts who had participated in the selection process. The content experts were clinical lecturers at the Division of Family Medicine and Primary Care, KI. At three PHC centres, semi-structured interviews were conducted with patients (n = 44) before or after a patient encounter. An initial evaluation of the interviews performed before the consultation showed that some items were often interpreted as similar. However, after the patients had experienced their consultation, they no longer considered the items similar and believed that the items would provide valuable, differentiated feedback on the subjects addressed in the items. The items were evaluated for their ability to target important areas of feedback, clarity of the feedback subject, comfort in providing feedback in the respective item content area, and linguistic interpretability. The items were also evaluated by students (n = 15) and clinical supervisors (n = 4) through semi-structured interviews during students’ clinical placement at PHC centres [34]. These interviews were transcribed and analysed using deductive content analysis [39], which informed the selection and inclusion of items in the questionnaire. Throughout this process, items were modified and reduced to minimise overlap and to direct the items’ focus towards the patient, rather than the student, during the encounter; e.g. ‘Did the student ask if there was something that you were worried about regarding your problem?’ was changed to ‘Did you have the opportunity to explain if there was something that worried you regarding your problem?’

The item composition process resulted in 19 items (items 1–8 and 14 were derived from the Calgary-Cambridge Guide [7]; items 10–12, 15–16, and 18–19 were derived from the ‘Swedish National Patient Survey’ [26]; items 1, 3, and 14 occurred in both the Calgary-Cambridge Guide and the ‘Swedish National Patient Survey’; item 17 was derived from the patient evaluation form by Braend et al. [29]; and items 9 and 13 were formulated in discussions between the research group and the experts). The items were connected to a six-point Likert scale with clarifying text for each scale step (from strongly disagree to strongly agree). ‘Not applicable’ and ‘Performed by supervisor’ were included as additional options, and space for free-text feedback was included at the end of the questionnaire.

Test of the applicability of the questionnaire

Data collection tools: written surveys and an interview guide to evaluate the questionnaire during the applicability test

Before the PFCP questionnaire applicability test, two evaluation surveys were formulated to explore students’ and clinical supervisors’ experiences of the PFCP questionnaire [40]. The evaluation surveys were also used as a guide for semi-structured interviews with students and clinical supervisors during the applicability test of the questionnaire.

Data collection during the applicability test of the questionnaire

To test the applicability of the PFCP questionnaire for the provision of feedback and its usefulness as a learning and teaching tool, patients (n = 25) completed the questionnaire with feedback to medical students (n = 7) directly after the encounter at PHC centres (n = 3). After the patients had completed the PFCP questionnaire, they were interviewed using a semi-structured interview guide to explore the questionnaire’s ability to capture their perspectives and experiences during an encounter. Students and clinical supervisors (n = 7) completed an evaluation survey or were interviewed to explore the perceived usability of the patient feedback from the PFCP questionnaire as a tool for learning and clinical supervision. K.B. (first author) collected the PFCP questionnaire forms and evaluation surveys for analysis.

Data from the PFCP questionnaire regarding the patients’ experience during the student-led encounter was analysed; results are not presented in this paper. Data from the evaluation surveys and interviews were analysed using inductive qualitative content analysis [39]. The questionnaire applicability test showed that the clinical supervisors tended to disregard the patients’ use of the entire six-point Likert scale in rating students’ perceived performance. Instead, the clinical supervisors often interpreted the patients’ ratings in the 5–6 range as indicating an overall adequate student performance, disregarding the patients’ intent to suggest areas for students’ improvement. The Likert scale was therefore changed from a six-point Likert scale to a four-point Likert scale in order to address this concern and reduce the observed ceiling effect. To further provide students with interpretable and useful feedback, space was included for free-text comments after each item (Table 2).
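
As an aside for readers handling the questionnaire data electronically, the following is a minimal, hypothetical sketch (in Python; the names and scale labels are illustrative assumptions, not part of the published instrument) of how a single item response in the final format – a four-point rating, the ‘Not applicable’ and ‘Performed by supervisor’ options, and a free-text comment – could be represented and screened before scoring.

```python
# Hypothetical representation of one PFCP item response; all names are illustrative.
from dataclasses import dataclass
from typing import Optional

# Assumed labels; the paper only specifies the endpoints 'strongly disagree' to 'strongly agree'.
SCALE_LABELS = {1: "strongly disagree", 2: "disagree", 3: "agree", 4: "strongly agree"}

@dataclass
class ItemResponse:
    item_number: int                        # 1-19
    rating: Optional[int] = None            # 1-4, or None if a special option was chosen
    not_applicable: bool = False            # patient marked 'Not applicable'
    performed_by_supervisor: bool = False   # patient marked 'Performed by supervisor'
    free_text: str = ""                     # free-text comment after the item

def usable_for_scoring(r: ItemResponse) -> bool:
    """Only rated items (neither 'Not applicable' nor 'Performed by supervisor') enter a scale score."""
    return r.rating is not None and not r.not_applicable and not r.performed_by_supervisor

# Example: item 9 rated 'agree' with a clarifying comment.
example = ItemResponse(item_number=9, rating=3, free_text="Clear explanation of the examination.")
print(usable_for_scoring(example), SCALE_LABELS[example.rating])
```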

Table 2 The 19 items included in the Patient Feedback in Clinical Practice (PFCP) questionnaire

Validation of the questionnaire

Data collection during the validation of the questionnaire

Using the PFCP questionnaire, patients provided feedback to medical students. The feedback was provided after an encounter performed by the medical student under the supervision of a clinical supervisor. Semi-structured interviews were conducted with patients to evaluate their experiences with the PFCP questionnaire. The student and the clinical supervisor received the patient feedback at the end of the same day on which the encounter had taken place and the feedback had been provided. After reviewing the patient feedback, students and clinical supervisors completed an evaluation survey or were interviewed to evaluate the PFCP questionnaire as a tool for medical students’ clinical education. The evaluation surveys and the interview guide from the applicability test were also used in the validation of the questionnaire. K.B. collected the PFCP questionnaires and evaluation surveys for analysis.

Participants and sampling

In total, 246 patients (143 women and 103 men, ages 18–91), 80 medical students (51 women and 29 men, ages 18–52), and 27 clinical supervisors (14 women and 13 men, ages 25–66) at PHC centres (n = 8) in Stockholm County participated in the validation of the questionnaire. Six additional students agreed to participate but did not provide written consent; these students and their patients (n = 10) were excluded from the study. Two additional patients were excluded: one did not fill in all of the questionnaire items, and one provided feedback to two students who had collaborated during the patient encounter.

Analysis

Statistical analysis: internal consistency, construct validity, and reliability

Exploratory factor analysis (EFA) was used to assess how well the items of the PFCP questionnaire measured what they were intended to measure (content validity) and to explore associations among the items (internal validity). Furthermore, EFA was used to examine grouping tendencies among the items, discern the underlying factors, and reduce items that were distributed across more than one factor [34]. Oblique rotation was used to clarify the items’ grouping [34]. After the Likert scale in the questionnaire was changed to a four-point scale, the ceiling effect that had been noted during the test of the applicability of the questionnaire was found to be less prominent. Related confounders (patient age and gender; student age, gender, and current semester) were controlled for by multivariate analysis of covariance (MANCOVA) (results not shown in this paper). The items’ loadings in the factor models were examined, and the internal consistency of the item constructs was assessed using Cronbach’s alpha coefficient [34]. Cronbach’s alpha values from 0.6 to 1.0 were considered acceptable [34]. SAS 9.4 (SAS Institute Inc., Cary, NC) software was used for the statistical analyses.
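
As an illustration only (the study itself used SAS 9.4), the sketch below shows how an analogous analysis could be run in Python: an EFA with two factors and an oblique (oblimin) rotation via the third-party factor_analyzer package, plus Cronbach’s alpha computed from its standard formula. The data frame of 246 × 19 item responses is randomly generated placeholder data, not study data, and the factor names are taken from the Results section.

```python
# Illustrative sketch only: the study used SAS 9.4; this Python code uses
# randomly generated placeholder data in place of the real PFCP responses.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package: factor_analyzer

rng = np.random.default_rng(0)
# 246 patients x 19 items, scored 1-4 (placeholder data, not study data)
responses = pd.DataFrame(
    rng.integers(1, 5, size=(246, 19)),
    columns=[f"item_{i}" for i in range(1, 20)],
)

# Exploratory factor analysis with two factors and an oblique (oblimin) rotation
efa = FactorAnalyzer(n_factors=2, rotation="oblimin")
efa.fit(responses)
loadings = pd.DataFrame(
    efa.loadings_,
    index=responses.columns,
    columns=["F1_consultational_approach", "F2_transfer_of_information"],
)
print(loadings.round(2))

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

print(f"Cronbach's alpha, all 19 items: {cronbach_alpha(responses):.2f}")
```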

Qualitative content analysis: participants’ experiences in the validation of the questionnaire

Data from the transcribed semi-structured interviews with patients (n = 91), students (n = 6), and clinical supervisors (n = 3), and text from the written evaluation surveys from students (n = 61) and clinical supervisors (n = 11), was analysed using qualitative content analysis [39].

The inductive qualitative content analysis proceeded in the following steps:

  • The interviews were audiotaped and transcribed, and the students’ and clinical supervisors’ written feedback from the evaluation surveys was documented.

Interviews

  • The text from the transcribed interviews was read repeatedly for a global understanding by K.B. and C.L., and notes were taken.

  • Meaning units were identified.

  • The units were condensed according to perceived key content areas.

  • The units were compared in order to ensure consistency.

  • The meaning units were sorted into categories established by K.B. and C.L.

Evaluation surveys

  • The text from the evaluation surveys was read repeatedly for global understanding by K.B. and C.L., and notes were taken.

  • The units were condensed according to perceived key content areas.

  • The units were compared in order to ensure consistency.

  • The meaning units were sorted into categories established by K.B. and C.L.

Final step

  • The underlying meanings of the categories from the interviews and evaluation surveys were interpreted and merged, resulting in three themes which were established by K.B. and C.L.

  • These three themes are presented in the ‘Results’ section below.

Quantitative data from the evaluation surveys (mean and range) were documented for each question (presented under the respective themes in the ‘Results’ section below).

Results

Statistical results: internal consistency, construct validity, and reliability

The exploratory factor analysis resulted in two dimensions: consultational approach (F1) and transfer of information (F2). F1 includes items 1–5, 7, and 18–19, and F2 includes items 6 and 8–17. The two dimensions’ values from the EFA and the revised factor analysis [41] are presented in Table 3, whose rows correspond to the variables from items 1–19 and whose columns correspond to F1 and F2, together with the variance explained by each factor before and after rotation, mean (SD), and Cronbach’s alpha if an item was deleted. The internal consistency of the scale was interpreted as high: the Cronbach’s alpha coefficients ranged between 0.89 and 0.91. No data were missing in the analysis process.

Table 3 Factor loading and descriptive analysis of the PFCP questionnaire

Qualitative results: participants’ experience of validation of the questionnaire

The thematic analysis of data from the validation of the questionnaire, including the evaluation surveys and interviews, resulted in three themes: ability to capture the personal agenda of the consultation, alignment with the consultation, and construct and characteristics. Table 4 shows a summary of patients’, students’, and clinical supervisors’ perspectives on the PFCP questionnaire as a pedagogical feedback tool.

Table 4 Summary of patients’, students’, and clinical supervisors’ perspectives on the PFCP questionnaire as a pedagogical feedback tool

Theme 1: Ability to capture the personal agenda of the consultation

Patients

The questionnaire provided the patients with a tool that facilitated their interpretation of the consultation in relation to the patient’s personal agenda, e.g. questions that had been asked, examinations that had been performed, information that had been provided, and decisions that had been mutually agreed upon. The questionnaire was also perceived to clarify important aspects of the patient’s own care.

‘I thought it was good, very straightforward, and so, very easy to separate the parts. I thought it described the visit pretty well, what we went through and so ...’

Students

The students’ (n = 61) evaluation surveys explored how patients’ feedback helped them visualise the pedagogical assignment a student has during a patient encounter (4-point Likert scale, mean 3.5).

The students stated that an awareness of their assignment in relation to the patient was further clarified. Beyond the medical assignment – to recognise and interpret symptoms and perform an adequate clinical examination – the students stated the necessity of providing patients with clarifying information throughout the entire consultation. The pedagogical assignment to facilitate the process of mutual agreement, by considering the patient’s level of knowledge and concerns, was perceived as targeted through the patients’ feedback.

‘I received a greater understanding, that the clinical examination is not only for me to find a diagnosis, but also for the patient to feel well examined’.

‘It made me realise ... next time I should explain a little more about what I examine etc.’.

Clinical supervisors

The patients’ perspectives obtained from the PFCP questionnaire were believed to underpin the clinical supervisors’ pedagogical assignment to provide feedback regarding the students’ level of patient-centeredness. The PFCP questionnaire was also considered to facilitate and legitimise dialogue with students regarding important aspects of patient-centeredness within each part of a consultation.

‘It is important to think about all the steps, such as medicines and further diagnostic interventions. It is easy to forget certain steps during the feedback, the patient’s feedback strengthened and provided a structure for the feedback ...’.

Theme 2: Alignment with the consultation

Patients

The authenticity of the questionnaire regarding the structure and content of the consultation was perceived by the patients to be high.

‘They [the items] were in alignment with the experienced encounter; it almost felt as if he was asking questions directly from the questionnaire’.

Students

Students believed the questionnaire to concretise and target learning goals and to provide structured feedback throughout the consultation.

‘... highlighted all the parts that should be included in the patient consultation’.

‘The importance of the summary to provide clarity ...’.

Students experienced that patients’ feedback highlighted the importance of including a patient-centered approach in dialogue with patients in order to increase patients’ participation during the encounter.

‘The importance of ... responding to the patient’s questions [and] taking the patient’s agenda into consideration ... became very clear in the feedback ...’.

Clinical supervisors

The questions in the PFCP questionnaire were found to be in alignment with the expected structure and content of a patient encounter. The questions were also perceived to facilitate the supervisors in the identification of the necessity to provide feedback to students regarding patient-centeredness.

‘The questionnaire is designed in alignment with the consultation’.

‘... I have not previously focused enough on feedback during the final parts of the patient encounter, such as how the patient has perceived the encounter and, for example, how the student has ensured that the patient understands … ’.

Theme 3: Construct and characteristics

Patients

The patients experienced that the questionnaire targeted important content and strengthened their ability to provide relevant feedback to the students.

‘... the items were good, they adequately visualised what the student could do and what she did’.

‘It was good, it was both the human factors about how to talk to the patient as well as the medical, so it was both’.

The questionnaire allowed the patients to state which parts of the consultation were performed by the student and the clinical supervisor, respectively. However, in some cases, patients found it challenging to isolate a student’s performance from the interference of the clinical supervisor.

‘Was it the student who did it well, or was it this doctor who did it well? It is good that I can also mark that the supervisor performed’.

Patients described that the items clarified their experience of the encounter and facilitated their provision of feedback. However, patients also regarded the opportunity to write free-text comments as important.

‘That I had the opportunity to write my own answers was really good’.

Students

The students’ (n = 61) evaluation survey indicated that patients’ feedback provided valuable information regarding students’ abilities to apply patient-centered communication (3.4 out of 4 on the Likert scale), as well as guidance for continuous training in clinical skills (3.2 out of 4 on the Likert scale).

‘Clear feedback when a patient experienced that I answered their ideas, concerns and expectations’.

Clinical supervisors

The clinical supervisors’ (n = 22) evaluation survey indicated that patients’ feedback could provide valuable information regarding students’ abilities to apply patient-centered communication (3.3 out of 4 on the Likert scale), as well as guidance on students’ continued training in clinical competencies (2.7 out of 4 on the Likert scale), while also facilitating the students’ visualisation of their pedagogical assignment in their dialogue with patients (3.3 out of 4 on the Likert scale).

Clinical supervisors stated that patients’ feedback added perspective and provided valuable, applicable information for their own feedback to students regarding how to communicate and provide information to patients.

‘The questions were so specific that I could give concrete examples of how to do things differently’.

‘You received more information about how the student responds to the patient and solves problems’.

Overall, clinical supervisors expressed that the time invested in using the PFCP questionnaire was well spent. Some clinical supervisors declined to participate because of the stress and time loss involved. However, very few clinical supervisors maintained such hesitation after participating in this study.

‘Structured, good, but sometimes time-consuming’.

‘Literally complicated at first. Took a little longer but provided a very good structure’.

Discussion

This study focused on the composition and validation of a feedback questionnaire that allows patients to assess their experiences of core communication and patient-centeredness aspects during a patient encounter, in order to provide medical students with feedback for the identification of knowledge gaps and areas for development. The composition of the items resulted in a questionnaire with 19 items (nine partly adapted items from the Calgary-Cambridge Guide [7], seven items from the ‘Swedish National Patient Survey’ [26], one item from the patient evaluation form by Braend et al. [29], and two complementary items). The results from the analysis and interpretation of the data indicated that the PFCP questionnaire is a valid, reliable, and internally consistent instrument for patients’ feedback to medical students.

The item selection process followed a reductive and adaptive process, including mixed methods, to support content and face validity. In the selection process, frameworks for communication and patient-centeredness (the Calgary-Cambridge Guide [7] and the Maastricht Medical School model [10]) were used, and the process included an initial evaluation with experts and evaluations with patients, students, and clinical supervisors over several steps [34, 35].

In considering whether the power of the psychometric evaluation was adequate, the most crucial consideration is the relationship between how well the items loaded on the factors and the study’s sample size. The recommended sample size is approximately 10 participants for each item. Taking these factors into account, the sample size of 246 was considered sufficient [42].
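
As a rough check of this rule of thumb (an illustrative calculation, not one reported by the authors), the 19 final items imply:

```latex
% minimum sample under the ~10-respondents-per-item guideline
n_{\min} \approx 10 \times 19 = 190 \le 246 = n_{\text{patients in the validation}}
```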

To provide evidence for construct validity, reliability, and internal consistency, several psychometric assessments were performed in alignment with previous studies [43]. Cronbach’s alpha supported the inclusion of each item in the questionnaire and indicated that no items were redundant, which supported both the reliability of the two-factor structure and the internal consistency [34]. Background factors did not significantly affect the factor structure [33], even though their distribution was disproportionate with respect to gender, age, and study years, which also supported the items’ inclusion in the questionnaire.

Analysis of the interviews from the validation process revealed coherent data that did not suggest additional aspects beyond those obtained from the written evaluation surveys. The process for analysing data from the evaluation surveys and interviews was identical and is described above, in order to allow dependability issues to be detected.

Statistical [33, 34] and content analysis [39] confirmed that the two factors sufficiently covered the core aspects of communication and patient-centeredness, in alignment with other studies [7, 10, 26]. The items’ content enables patients to provide concrete, interpretable, and actionable feedback regarding their experiences during an encounter, as previous studies have described [44].

During the composition process, the Likert scale was changed from a six-point to a four-point scale. The intention guiding this change was that students and clinical supervisors should be able to discriminate between and interpret the patients’ feedback in order to identify areas for improvement. Students and clinical supervisors described that the feedback identified important areas for improvement in alignment with students’ learning goals regarding communication and patient-centeredness [37]. The alteration of the scale to fewer steps indicated that the four-point Likert scale identified areas for improvement more clearly than the six-point scale.

Patients provided high-score feedback for the items in the dimension consultational approach, which aligned with the results of previous studies [30, 45]. For the second dimension, transfer of information, the observed ceiling effect was slightly lower, which is also in alignment with previous studies [29, 30, 45] and suggests that, through their training, students in general become skilful at applying patient-centered communication in certain areas, for example history taking [29].

In some previous studies, students had not expressed an interest in receiving further feedback from patients [29, 30]. However, in the current study, students regarded patients’ feedback as valuable for inclusion in their self-regulated learning process. This finding could perhaps be explained by the structure and content of the questionnaire. To further provide concrete feedback, patients were able to add written free-text clarifying comments. Both in this study and previous studies, patients considered the opportunity to clarify their feedback to be important [29, 46]. Several patients in this study also reported that the items adequately encapsulated their feedback regarding their experience of the encounter, stating that further clarification, therefore, was unnecessary.

The authors of previous studies and patient surveys have often advocated anonymous patient feedback as a favourable approach to creating a safe environment in which to provide feedback on patients’ subjective experiences of an encounter [12, 30]. In the current study, patients were not anonymous, which could have affected their willingness both to participate in the study and to provide feedback, given their possible dependence on their caregiver [19, 47]. The patients in this study indicated that the items’ content gave them opportunities to respond to specific aspects of an encounter and to assess concrete, non-emotional aspects of their experience. Students also confirmed that they had perceived the questionnaire items as concrete, interpretable, and actionable, visualising important aspects of patient-centered communication techniques to develop. Students also reported that their ability to relate patients’ feedback to their own experience of a particular encounter was important in facilitating their identification of learning gaps to address in further clinical training.

The clinical supervisors who participated in this study suggested that the PFCP questionnaire had clarified their pedagogical assignment to provide feedback regarding patient-centered communication. Furthermore, they reported that the questionnaire provided a structure for giving feedback, which, to our knowledge, has not been discussed in previous studies [29, 30, 48]; however, these aspects should be explored further.

Strengths and limitations

One strength of this study is that the composition and validation of the PFCP questionnaire used mixed methods and extensive data, which were collected and analysed using both statistical and qualitative methods in an effort to compose a valid, reliable questionnaire for patient feedback. The items originating from previously described and established questionnaires further added face validity to the PFCP questionnaire. A second strength is that patients, students, and clinical supervisors all reported that the PFCP questionnaire had targeted important clinical competencies in the areas of communication and patient-centeredness.

The PFCP questionnaire was composed in a Swedish medical education context, which could be considered a limitation. However, the content was based on common theories of communication and patient-centeredness, originating from work in alignment with the Calgary-Cambridge Guide [7] and the Maastricht Medical School model [10], communication guides commonly used in Western medical education. A second limitation could be that patients might find it difficult to discriminate whether the student or the clinical supervisor was the actual main provider of the given information and take-home message. Neither students nor clinical supervisors commented on these elements in their reflections on the questionnaire’s feedback applicability. A third limitation could be that data from the test of the items’ content were analysed using deductive content analysis. However, during this part of the process, the intention was to evaluate the selected items in relation to the theories and knowledge of communication and patient-centeredness [49]. All other data were analysed using inductive content analysis to explore the participants’ perspectives and experiences of the PFCP questionnaire as a provider of feedback [49].

Implications for medical education and future research

Our results indicate that the PFCP questionnaire could serve as a valuable tool for increasing patients’ participation in medical students’ workplace learning. However, further analyses are required to explore students’ learning as a result of the PFCP questionnaire, assessed in a summative setting. Patients’ feedback obtained through the PFCP questionnaire could perhaps also serve as a progressive indicator of the level of knowledge and clinical competences in relation to milestones throughout medical education. Exploring how clinical supervisors can use feedback from the PFCP questionnaire as an addition to their own feedback requires further research. The fact that patients experienced a clarification of the encounter’s structure and content while completing the PFCP questionnaire could be a subject for further studies exploring how to provide patients with knowledge about patient-centered working methods.

Conclusions

The results of this study indicate that the questionnaire is a valid, reliable, and internally consistent instrument for patients’ feedback to medical students. Patients, students, and clinical supervisors found that the PFCP questionnaire provided useful feedback that could facilitate students’ learning regarding communication skills and patient-centeredness in clinical practice.

Availability of data and materials

The data generated and analysed during the current study are not publicly available due to ethics approval.

Abbreviations

COSMIN checklist: Consensus-Based Standards for the Selection of Health Measurement Instruments

EFA: Exploratory factor analysis

F1: Factor 1

F2: Factor 2

KI: Karolinska Institutet

MANCOVA: Multivariate analysis of covariance

MeSH: Medical subject headings

PHC: Primary health care

PFCP: Patient Feedback in Clinical Practice

References

  1. Ministry of Health and Social Affairs. In: Goverment S, editor. The Patient Act: Government Offices of Sweden: Ministry of Health and Social Affairs; 2015. http://www.regeringen.se/contentassets/b1a9ef9b43e9468f9345fcdbe8c60fe9/patientlag. Accessed 19 Aug 2016.

  2. The Swedish Agency for Health and Care Services Analysis. The health care from the patients' perspective - comparisons between Sweden and 10 other countries. Stockholm: The Swedish Agency for Health and Care Services Analysis; 2016. https://www.vardanalys.se/rapporter/varden-ur-befolkningens-perspektiv-2016/. Accessed 14 Mar 2021.

  3. The Swedish Agency for Health and Care Services Analysis. Act without impact, Evaluation of the Patient Act. 2014–2017. Stockholm: The Swedish Agency for Health and Care Services Analysis; 2017. https://www.vardanalys.se/rapporter/lag-utan-genomslag/. Accessed 14 Mar 2021.

  4. Phillips NM, Street M, Haesler E. A systematic review of reliable and valid tools for the measurement of patient participation in healthcare. BMJ Qual Saf. 2016;25(2):110–7. https://doi.org/10.1136/bmjqs-2015-004357.

  5. Barry MJ, Edgman-Levitan S. Shared decision making--pinnacle of patient-centered care. N Engl J Med. 2012;366(9):780–1. https://doi.org/10.1056/NEJMp1109283.

  6. Brown JB, Weston WW, Stewart M. The third component: finding common ground. In: Stewart M, Brown J, Weston W, editors. Patient-Centred medicine: transforming the clinical method. 2nd ed. United Kingdom: Radcliffe Medical Press; 2003. p. 84–99.

  7. Kurtz S, Silverman J, Benson J, Draper J. Marrying content and process in clinical method teaching: enhancing the Calgary-Cambridge guides. Acad Med. 2003;78(8):802–9. https://doi.org/10.1097/00001888-200308000-00011.

  8. Balint E. The possibilities of patient-centered medicine. J R Coll Gen Pract. 1969;17(82):269–76.

  9. Pendleton D, Schofield T, Tate P, et al. The consultation: an approach to learning and teaching. 6th ed. Oxford: Oxford University Press; 1984.

  10. Van Dalen J, Bartholomeus P, Kerkhofs E, et al. Teaching and assessing communication skills in Maastricht: the first twenty years. Med Teach. 2001;23(3):245–51. https://doi.org/10.1080/01421590120042991.

  11. Docteur E, Coulter A. Patient-centeredness in Sweden’s health system - an external assessment and six steps for progress: The Swedish Agency for Health and Care Services Analysis, Stockholm; 2012. https://www.vardanalys.se/rapporter/patientcentrering-i-svensk-halso-och-sjukvard/. Accessed 14 Mar 2021.

  12. The National Board of Health and Welfare. The patient perspective in national guidelines. Stockholm; 2019. https://www.socialstyrelsen.se/regler-och-riktlinjer/nationella-riktlinjer/om-nationella-riktlinjer/perspektiv-i-riktlinjerna/. Accessed 24 Mar 2021.

  13. Zachariae R, O'Connor M, Lassesen B, et al. The self-efficacy in patient-centeredness questionnaire - a new measure of medical student and physician confidence in exhibiting patient-centered behaviors. BMC Med Educ. 2015;15(1):150. https://doi.org/10.1186/s12909-015-0427-x.

  14. Brouwers M, Rasenberg E, van Weel C, Laan R, van Weel-Baumgarten E. Assessing patient-centred communication in teaching: a systematic review of instruments. Med Educ. 2017;51(11):1103–17. https://doi.org/10.1111/medu.13375.

  15. Choudhary A, Gupta V. Teaching communications skills to medical students: introducing the fine art of medical practice. Int J Appl Basic Med Res. 2015;5(4):41. https://doi.org/10.4103/2229-516X.162273.

  16. Moral RR, Garcia de Leonardo C, Caballero Martinez F, et al. Medical students' attitudes toward communication skills learning: comparison between two groups with and without training. Adv Med Educ Pract. 2019. https://doi.org/10.2147/AMEP.S182879.

  17. Tsimtsiou Z, Kerasidou O, Efstathiou N, Papaharitou S, Hatzimouratidis K, Hatzichristou D. Medical students' attitudes toward patient-centred care: a longitudinal survey. Med Educ. 2007;41(2):146–53. https://doi.org/10.1111/j.1365-2929.2006.02668.x.

  18. Reinders ME, Blankenstein AH, van der Horst HE, Knol DL, Schoonheim PL, van Marwijk HWJ. Does patient feedback improve the consultation skills of general practice trainees? A controlled trial. Med Educ. 2010;44(2):156–64. https://doi.org/10.1111/j.1365-2923.2009.03569.x.

  19. Chua IS, Bogetz AL. Patient feedback requirements for medical students: do perceived risks outweigh the benefits? Clin Pediatr. 2018;57(2):193–9. https://doi.org/10.1177/0009922817696464.

  20. Towle A, Bainbridge L, Godolphin W, Katz A, Kline C, Lown B, et al. Active patient involvement in the education of health professionals. Med Educ. 2010;44(1):64–74. https://doi.org/10.1111/j.1365-2923.2009.03530.x.

  21. Dijk SW, Duijzer EJ, Wienold M. Role of active patient involvement in undergraduate medical education: a systematic review. BMJ Open. 2020;10(7):e037217. https://doi.org/10.1136/bmjopen-2020-037217.

  22. Towle A, Godolphin W. A meeting of experts: the emerging roles of non-professionals in the education of health professionals. Teach High Educ. 2011;16(5):495–504. https://doi.org/10.1080/13562517.2011.570442.

  23. Bing-You R, Hayes V, Varaklis K, Trowbridge R, Kemp H, McKelvy D. Feedback for learners in medical education: what is known? A Scoping Review. Acad Med. 2017. https://doi.org/10.1097/ACM.0000000000001578.

  24. Braend AM, Gran SF, Frich JC, Lindbaek M. Medical students' clinical performance in general practice - triangulating assessments from patients, teachers and students. Med Teach. 2010;32(4):333–9. https://doi.org/10.3109/01421590903516866.

  25. Hogan N, Li H, Pezaro C, Roberts N, Schmidt E, Martin J. Searching for a written patient feedback instrument for patient-medical student consultations. Adv Med Educ Pract. 2017;8:171–8. https://doi.org/10.2147/AMEP.S119611.

  26. Sweden County Councils and Regions in Collaboration. National Patient Survey, Primary Health Care. 2020. https://patientenkat.se/sv/resultat/primarvard-2017/. Accessed 14 Mar 2021.

  27. Marshall GN, Hays RD. The patient satisfaction questionnaire short-form (PSQ-18). Santa Monica: RAND Corporation; 1994. https://www.rand.org/content/dam/rand/pubs/papers/2006/P7865.pdf. Accessed 14 Mar 2021.

  28. Wolf MH, Putnam SM, James SA, Stiles WB. The medical interview satisfaction scale: development of a scale to measure patient perceptions of physician behavior. J Behav Med. 1978;1(4):391–401. https://doi.org/10.1007/BF00846695.

  29. Brænd ML, Gran SF, Lindbæk M. Patients - useful resource in evaluating medical students' clinical practice? Tidsskr Nor Laegeforen. 2006;126(16–24):2122–5.

  30. Reinders ME, Blankenstein AH, Knol DL, de Vet HCW, van Marwijk HWJ. Validity aspects of the patient feedback questionnaire on consultation skills (PFC), a promising learning instrument in medical education. Patient Educ Couns. 2009;76(2):202–6. https://doi.org/10.1016/j.pec.2009.02.003.

  31. Grol R, Wensing MP. Patients Evaluate General/Family Practice, The EUROPEP instrument. Mediagroep KUN/UMC: 2000. https://equip.woncaeurope.org/sites/equip/files/documents/publications/resources/grolwensing2000theeuropepinstrumentequipwonca.pdf. Accessed 14 Mar 2021.

  32. Hattie J, Clarke S. Visible learning: feedback. Oxon: Routledge; 2019.

  33. The American Educational Research Association, The American Psychological Association, The National Council on Measurement in Education. The Standards for Educational and Psychological Testing. Washington: American Educational Research Association; 2014.

  34. Streiner D, Norman G. Health measurement scales: a practical guide to their development and use. 4th ed. Oxford: Oxford University Press; 2008. https://doi.org/10.1093/acprof:oso/9780199231881.001.0001.

  35. Terwee CB, Mokkink LB, Knol DL, Ostelo RWJG, Bouter LM, de Vet HCW. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist. Qual Life Res. 2011;21(4):651–7. https://doi.org/10.1007/s11136-011-9960-1.

  36. Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Inf Libr J. 2009;26(2):91–108. https://doi.org/10.1111/j.1471-1842.2009.00848.x.

  37. Medical programme, Karolinska Institutet. Course Public Health and Environmental Medicine, 12 credits: Karolinska Institutet, Dept of Environmental Medicine; 2019. Course syllabus. https://education.ki.se/course-syllabus/2LK100. Accessed 28 Nov 2019.

  38. Royal College of General Practitioners. Patient Satisfaction Questionnaire (PSQ): Royal College of General Practitioners; 2021. https://www.rcgp.org.uk/training-exams/training/new-wpba/psq.aspx. Accessed 28 Jan 2021.

  39. Graneheim UH, Lundman B. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness. Nurse Educ Today. 2004;24(2):105–12. https://doi.org/10.1016/j.nedt.2003.10.001.

  40. Patel R, Davidson B. Fundamentals of research methodology: to plan, perform and report a study. 5th ed. Lund: Studentlitteratur; 2020.

  41. Velicer WF, Eaton CA, Fava JL. Construct explication through factor or component analysis: a review and evaluation of alternative procedures for determining the number of factors or components. In: Goffin RD, Helmes E, editors. Problems and solutions in human assessment. MA: Kluwer; 2000. p. 41–71. https://doi.org/10.1007/978-1-4615-4397-8_3.

  42. MacCallum RC, Browne MW, Sugawara HM. Power analysis and determination of sample size for covariance structure modeling. Psychol Methods. 1996;1(2):130–49. https://doi.org/10.1037//1082-989x.1.2.130.

  43. Strand P, Sjoborg K, Stalmeijer R, et al. Development and psychometric evaluation of the undergraduate clinical education environment measure (UCEEM). Med Teach. 2013;35(12):1014–26. https://doi.org/10.3109/0142159X.2013.835389.

  44. Burford B, Greco M, Bedi A, Kergon C, Morrow G, Livingston M, et al. Does questionnaire-based patient feedback reflect the important qualities of clinical consultations? Context, benefits and risks. Patient Educ Couns. 2011;84(2):e28–36. https://doi.org/10.1016/j.pec.2010.07.044.

  45. Campbell C, Lockyer J, Laidlaw T, MacLeod H. Assessment of a matched-pair instrument to examine doctor-patient communication skills in practising doctors. Med Educ. 2007;41(2):123–9. https://doi.org/10.1111/j.1365-2929.2006.02657.x.

  46. Al-Jabr H, Twigg MJ, Scott S, et al. Patient feedback questionnaires to enhance consultation skills of healthcare professionals: a systematic review. Patient Educ Couns. 2018;101(9):1538–48. https://doi.org/10.1016/j.pec.2018.03.016.

  47. British Medical Association. Role of the patient in medical education. BMA Medical Education Subcommittee of the Board of Science and the BMA Science and Education Department; 2008. https://www.yumpu.com/en/document/read/18917736/role-of-the-patient-in-medical-education-british-medical-bma. Accessed 24 Jan 2020.

  48. Lyons O, Willcock H, Rees J, Archer J. Patient feedback for medical students. Clin Teach. 2009;6(4):254–8. https://doi.org/10.1111/j.1743-498X.2009.00308.x.

  49. Elo S, Kyngas H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107–15. https://doi.org/10.1111/j.1365-2648.2007.04569.x.


Acknowledgements

We gratefully acknowledge the heads of the PHC centres, the clinical supervisors, and the staff at the PHC centres in Salem, Boo, Gustavsberg, Ekerö, Huddinge, Hässelby, and Flemingsberg, and the medical students and the patients for their participation and for making this study and subsequent article possible.

Funding

This work was supported by grants provided by Region Stockholm (ALF project) Grant no: 20150769. The funding agency had no role in the design of the study, collection, analysis, and interpretation of the data or in writing the manuscript. Open Access funding provided by Karolinska Institute.

Author information

Authors and Affiliations

Authors

Contributions

KB, TS, GN and CL participated in the study design. KB collected all the data; KB and CL conducted the qualitative analysis; HA conducted the statistical analysis, to which KB contributed. All authors contributed to writing the manuscript, and all authors read and approved the final manuscript.

Corresponding author

Correspondence to Karin Björklund.

Ethics declarations

Ethics approval and consent to participate

The study was approved by the Regional Ethical Review Board in Stockholm (Dno: EPN 2017–1574-31-1). All participants who were included gave written consent.

Consent for publication

All included participants gave written consent.

Competing interests

The authors report no conflicts of interest in this study.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Cite this article

Björklund, K., Stenfors, T., Nilsson, G.H. et al. Let’s ask the patient – composition and validation of a questionnaire for patients’ feedback to medical students. BMC Med Educ 21, 269 (2021). https://doi.org/10.1186/s12909-021-02683-y
