Table 2 Evaluation guide for faculty development program in educational effectiveness

From: A guide to best practice in faculty development for health professions schools: a qualitative analysis

Evaluation question

Indicators

Data Sources

Data Collection Method

Domain A: Context

A1- Has the context of the training been well defined?

• Provide a description of the training context in printed and/or online format

• Program specification/ Faculty guide/ Brochures

• Surveys/ Website

• Document review

• Website Review

• Survey Review

A2- Is the context described in the faculty development program description?

• Provide an orientation of the program context to the trainees

• Faculty guide/ Brochures/ Program specification

A3-Does the context identify the potential target audience?

• The context is specifically designed with the target audience in mind.

• There is a description of the intended target audience in the program specifications.

• Percentage of trainees who consider that the program meets their needs.

• The FDP mission, vision, and objective statements

• Preamble of the course/ Program specs/ Brochures/ Faculty guide

A4-Does the context identify the specific need or situation necessitating the training?

• There is a description of the specific need or situation in the program specifications

• Survey for a needs assessment.

A5-Does the context identify the place and time?

• Description of the place and the program’s timeframe in the program specification

• Preamble of the course/ Program specs/ Brochures/ Faculty guide

Domain B: Faculty

B1-Are the faculty selected for the program identified?

• Presence of admission criteria with a clear description of the target audience

• Program specifications

• Document review

• Surveys

B2-Are the faculty selected for the program stratified according to their knowledge?

• Presence of training program pre-requisite

• Faculty guide/ Brochures

B3-Are the faculty selected for the program stratified according to interest?

• Survey the trainees and trainers’ interests upon admission/registration.

• Compare group allocation form with the registration forms

B4- Is the selection of the trainees for the program homogeneous in terms of knowledge and interest?

• Review attendance sheets (registered vs. attended)

• Compare the attendance list and registration form

B5- Is there a degree of heterogeneity employed in the selection of the trainees?

• Presence of training program pre-requisites indicating a wide range of variables (sex, race, country, specialty)

• Program Specifications

Domain C: Needs

C1-Have the trainee needs been studied?

• Trainees’ knowledge gaps and training requirements were identified as per the literature review.

• Percentage of trainees expressing willingness to attend FDP in the ‘identified topic.’

• Percentage of trainees mentioning this topic in their Personal development plan

• Relevant literature articles

• Documentation of faculty needs assessment.

• Questionnaire

• Faculty members’ personal development plans

• Document review

• Review of media files

• Survey

• FGD / Interview with trainees

C2- Have the institutional needs been studied?

• Quality assurance report suggests that this topic needs improvement.

• A review of the literature reveals that institutions need to train their faculty in the ‘identified topic.’

• Leadership/administrators/curriculum committee / medical education/quality assurance believe that there is scope for improvement in the ‘identified domain’ and recommend FDP

• Quality / accreditation report

• Documentation of relevant literature review/ Soft or hard copy of relevant journals

• Documentation of institutional needs assessment questionnaire

• Expressed oral and written opinions of Leadership/administrators/curriculum committee / medical education/quality assurance/ Documentation of ‘learner’ needs assessment questionnaire.

C3-Have the identified trainee and institutional needs been prioritized?

• Percentage of trainees expressing dissatisfaction with the identified topic/domain.

• Percentage of trainees and administrators who believe that these tasks/contents/training in the ‘identified topic’ should be given high priority

• Documentation of ‘prioritization’ based on the Data sources of C1 and C2/ FDP schedule/ Brochure.

• Trainee and administrators’ feedback/satisfaction questionnaire

C4-Have the identified trainee and institutional needs been reflected in the content and methods of training?

• Percentage of identified trainee and institutional needs added as content with appropriate tasks and methods for training in the FDP schedule.

• The proportion of experts who agree that trainee and institutional needs have been reflected in the content and methods of training

• Teaching materials/ handout/ Recording of FGD with experts

• External reviewer report

Domain D: Objectives

D1-Are there defined objectives for the training?

• Expected outcomes/contents of the FDP are mentioned as well-defined objectives.

• The proportion of experts who agree that objectives are well defined for the training.

• FDP schedule/ Brochure/ Reading materials / Handouts

• Recording of FGD with experts/ External reviewer report.

• Document review

• FGD with experts

• Comparison of the FDP schedule with the results of the faculty needs assessment/ literature review/ institutional needs assessment

D2-Are the objectives SMART?

• The proportion of experts who agree that objectives are specific, measurable, achievable (or agreeable), realistic (or relevant), and time-bound (or timely)

• Percentage of program organizers who agree that the objectives were SMART.

• FDP schedule/ Brochure/ Reading materials / Handouts/ External reviewer report/ Recording of FGD with experts

• Analysis of Feedback questionnaire.

D3-Are the objectives aligned with any of the identified needs?

• Percentage of trainees/administrators who agree that identified objectives are aligned with either trainee or their institutional needs.

• The proportion of experts who agree that trainee and institutional needs have been reflected in the content and methods of training

• Trainee and administrator questionnaire with analysis reports

• FDP schedule/ Brochure/ Documentation of faculty needs assessment questionnaire with analysis/ Documentation of institutional needs assessment/ Recording of FGD with experts/ Inter-rater analysis of experts.

D4- Are there objectives that deal with trainee soft skills?

• Percentage of identified objectives that are dedicated to soft skills of the trainees (Under regular circumstances)

• Percentage of adapted objectives that deal with trainee soft skills (Under special circumstances)

• Analysis of survey from trainees/ resource faculty/ administration

• FDP schedule with contents/ Teaching materials/ handout/ Analysis of expert opinion

Domain E: Materials

E1-Are there materials for the training?

• Pre-reading materials, timetables, and schedules are provided.

• FDP content/ Lesson outlines/ Brochures/ Timetables

• Interview

• Document review

• Survey

E2-Are the materials authentic?

• Materials are tailored to the institution’s and trainees’ demands.

• Materials are suitable to the context of the institute, culture, and country.

• Literature review (on authentic resource material)/ Guidelines of the institute/ Needs assessment report

• FDP program content

E3-Are the materials in proper format?

• The program is well structured with proper learning objectives and timelines.

• Lesson outlines/ Study guides/ Trainee interviews/ Guideline from literature review/accreditation bodies

E4-Are the materials adequate for the training content?

• Materials are found sufficient to cover the domain of FDP e.g., Teaching and learning /Leadership/ Workplace-based assessment etc.

• FGD of facilitators

• External reviewer report/ End of program trainee survey/ End of program trainer survey

Domain F: Methods

F1-Are the instruction methods planned?

• Instruction methods are well described.

• Lesson plans/ Questionnaires to the trainees/ Peer observation/ Trainee interviews

• FDP program syllabus

• Document review

• Survey

• Observation

• Digital data review

• Interviews

F2-Are there proper guides for instruction?

• There is a document guiding trainees about the outline of the instruction.

• Lesson outlines/ Study guides

F3-Are the instruction methods suitable for the content and objectives?

• There is a variety of instruction methods that deliver the content most efficiently in the opinion of experts, trainers, and trainees

• Lesson outlines/ Study guides

F4-Are the instruction methods suitable for the trainees?

• Percentage of trainees who pass the attainment level of the program.

• More than 70% of the trainees are satisfied with the instruction methods

• Student assessment result/assignment results

• Collection of trainee expectations at the beginning of the session and matching them with the objectives detailed throughout the session (surveys, discussions, teaching-learning conversations)

• A study of constructive alignment between planned, delivered, and assessed material.

• Student end of program reports

• Student satisfaction surveys.

F5-Are there innovative instruction methods in the program?

• Innovative methods such as gamification, TBL, role play, case-based learning, etc. are present.

• FDP brochure/ promotion from the institute / Software used.

• Interview the participants.

• Comparison study of innovation and previous program methodology

• Brainstorming and group discussion

F6-Are the instruction methods feasible?

• Instruction methods are found feasible by external reviewers.

• Percentage of the reported instruction methods that are actually performed

• Report of external reviewers

• Trainee and trainer feedback

F7-Is the program longitudinal?

• The program runs longitudinally for more than 3 months with an opportunity for self-study and structured assignments.

• Syllabus/ Faculty guides

Domain G: Learning oversight

G1- Is there a functional process to enable follow up of the learning?

• Percentage of the trainees passing the formative assessment.

• Two to three formative assessments are conducted per module.

• Improvement in trainee performance

• Trainee reflections are collected at fixed intervals

• Records of the training sessions

• Reflection reports

• Mentor report and self-assessment report

• Pre and post-test results

• Document review

• Surveys

• Observation

• Statistical analysis

• Assessor evaluation checklist

• Questionnaire

• Focus group

• Website review

G2-Is this mechanism adequate to the objectives?

• Trainees’ perception of the concepts indicates that the mechanism is adequate.

• Percentage of non-attaining trainees identified annually.

• Percentage of the procedural defects detected by this mechanism.

• Trainees’ feedback

• Audit report

G3-Is this mechanism known to everyone in the program (management, faculty, learners, administration)?

• Percentage of trainees and administrators who received the announcement and program details.

• Percentage of trainees accessing the website and aware of the mechanism.

• Percentage of trainees satisfied with the mechanism.

• Use of all available communication channels (emails, brochures, social media platforms).

• Emails and brochures

• Website metrics

• Questionnaire results

• Emails, brochures, social media groups

G4-Are there functional measurement tools to evaluate the learning and skill acquisition?

• There are differentiating assessment tools to assess learning

• Ensure validity and reliability (psychometric measures) through:

• Multiple tools

• Multiple occasions

• Multiple assessors (external assessors)

G5-Have the program ILOs (intended learning outcomes) been reached?

• The trainee success rate in assessments and post-tests

• Student satisfaction feedback questionnaires and percentage of students agreeing that the ILOs have been achieved.

• Post-evaluation quiz/ Statistical analysis report

• Questionnaire results

G6-Is there a method to assess the ILOs?

• There is a program post-test or program evaluation that demonstrates learners’ achievement

• Post-test results

• Program Evaluation report

G7-Is there a methodology to deal with the non-attaining learners?

• Percentage of the non-attaining learners who have undergone a remedial procedure.

• Percentage of trainees informed of and aware of the remedial policy

• An authorized policy is announced to the trainees

• Frequency of evaluation measures to detect the non-attaining learners

• Mark list

• Learner feedback

• PDF brochure/ website

• Evaluation reports

Domain H: Community of practice

H1-Is there a platform to allow for building the community?

• There is a platform that is user-friendly, flexible, and allows for communication between trainees.

• Platform dashboard

• Trainee and trainer feedback

• Observation

• Surveys

• Document review

• Digital review

H2-Is there time allocated in the program to allow for building the community?

• Percentage of time allocated for activities established to promote community building

• Program specifications/ schedules

H3-Are there designated activities to allow for building the community?

• Presence of activity moderators/ facilitators to help trainees build the community

• Program report

H4-Do trainees have enough knowledge of other trainees?

• Activities allocated for community building are innovative.

• Availability of trainee information on platforms and/or in printed format

• Website

H5-Are there collaborative efforts between trainees?

• Percentage of trainees that built a relationship with other trainees (Projects, publications, social media friendship or social activities).

• Survey

• Publications

• Social media

• Project proposals

H6-Are there enough collaborative project outcomes with trainees as project members (publications, conferences, workshops…etc.)?

• The number of collaborative projects established between members in each group.

• The number of joint activities between trainees yearly (conferences, publications etc.)

• Impact evaluation of joint activities

• Surveys

• Annual alumni reports

• Impact evaluation report

Domain I: KPI

I1- Has the program achieved growth over the years? (Number of attendees, learner satisfaction, learner attainment, measurable impact on teaching/ learning/ assessment…etc.)

• Number of attendees

• An annual increase in the number of trainees attending the program

• An annual increase in the number of trainees applying to attend the program

• Percentage of the increase in the number of trained trainees compared to non-trained faculty members annually

• Trainee satisfaction

• Average of trainees’ satisfaction rate with the activities of the training program on a five-point scale in the program evaluation survey

• Trainee attainment

• Increase in the proportion of trainees who complete the program in minimum time.

• Increase in the proportion of trainees passing the program annually

• Improvement in the scores of trainees in the post-program assessment compared to the pre-program assessment

• Dropout rate/ total program

• Number of complaints/ year

• Recommendation of the program

• Measurable impact on teaching, learning and assessment

• Percentage of trainees who graduated from the program who were appointed in leadership positions

• Percentage of graduates promoted

• Improvement of the skills of graduates in the workplace

• An official document with the number of trainees entering the program annually

• An official document with the number of trainees graduating from the program for one batch

• FGD recordings

• Feedback from colleagues and students

• Observation

• Self-assessment questionnaires

• Trainee survey

• FGD / Interview with trainees

• Document review

• Statistical data analysis

I2- Are there established methods to measure the KPIs?

• Valid and reliable established methods for measuring the KPIs

• Timely and continuous measurement of the KPIs.

• Evaluation Reports

• Annual reports

• Data collection tools

I3- Is there a dedicated team for measuring the KPIs?

• A dedicated and professional team for measuring each of the KPIs is appointed

• Appointment decree for the team

I4- Is there enough data collected?

• Adequate data collection for measuring each of the KPIs

• Documents

• Records

• Statistical data

I5- Is the data properly analyzed?

• Proper analysis of the data using suitable statistical methods for all KPIs

• Documents

• Records

• Statistical data

I6- Is the information deduced from the data properly reported/ discussed?

• 80% of the information deduced from the data is properly reported/discussed

• Increase in the number of scientific council meetings that properly discuss the deduced information

• Meeting minutes of the scientific councils

I7-Are there corrective actions taken based on the information deduced?

• Presence of proof of corrective action taken in response to assessment results. This can be a change in the scope, structure or content of the program.

• Program report

Domain J: Feedback

J1- Has the feedback improved over the years? (Student satisfaction/ faculty satisfaction/ student attainment)

• Trainee satisfaction

• An annual increase of 10% in the satisfaction rate of trainees, faculty, and administration

• Trainee attainment

• Percentage of trainees passing the course improved by 10%

• Surveys

• FG and interviews

• Post-training quizzes

• Focus groups

• Interviews

• Statistical analysis

• Document review

• Observation

J2- Are there established methods to measure the learner and trainer feedback?

• There are valid and reliable established methods for measuring feedback (end of program surveys, focus groups, reflection meetings)

• Timely and continuous measurement of the feedback

• Report from external program reviewers

• Data sets available from the feedback

J3- Is there a dedicated team for measuring the learner and trainer feedback?

• A dedicated and professional team for measuring each type of feedback

• Appointment decree for the team

J4- Is there enough data collected?

• There exists at least one type of data set for each KPI

• Data repositories for the program

J5- Is the data properly analyzed?

• Data is analyzed using a well-established data analysis program

• Programs existing on the computers where data repositories are present

• Data repository formats

J6- Is the information deduced from the data properly reported/ discussed?

• The information deduced from the data is properly reported/discussed in the relevant scientific committees

• Minutes of meeting of relevant scientific committees

J7- Are there corrective actions taken based on the information deduced?

• At least one annual corrective action can be demonstrated

• Program report

• Program specification of the upcoming training round