
A mixed-methods study of the effectiveness and perceptions of a course design institute for health science educators

Abstract

Background

Most health care professionals begin their academic careers without formal training in teaching. As such, institutions encourage participation in opportunities that address gaps in faculty's knowledge of pedagogy and learning theory in order to promote successful student and patient outcomes. This study aimed to examine the reception and impact of a faculty development program focused on teaching participants the fundamentals of course design.

Methods

Applying a mixed-methods approach, this retrospective study used pre/post-tests, assignment grades, self-assessment questionnaires, and focus groups to elucidate the impact of a faculty development intervention focused on course design. The participants (n = 12) were health educators from a private, all-graduate level university with campus locations across the United States, including in the Southwest and Midwest. In the Course Design Institute (CDI), the participating faculty learned evidence-based instructional approaches and techniques for implementing contemporary teaching practices.

Results

The data from the pre/post-tests and focus groups suggest that participants learned about topics including instructional alignment, learning goals and objectives, instructional strategies, assessment planning, feedback approaches, communicating expectations, and adult learning theories by participating in this course. The final deliverable scores indicate that the CDI graduates were able to apply a backward design process to plan their own instruction. Data from both the survey and the focus groups suggest that participants were satisfied with the experience and particularly appreciated that the course was relevant to them as educators in the health sciences.

Conclusions

The results of this study indicate that the CDI was influential in developing the faculty’s knowledge of the course design process, promoted the application of course design and pedagogy skills amongst CDI graduates, and positively impacted self-reported attitudes about their teaching abilities. In addition, feedback from participants indicates that they recognized the value of this program in their own development and they believed it should be a required course for all educators at the institution.


Background

Issues in Healthcare Education

Healthcare practitioners who choose to take on faculty roles in higher education are valuable mentors due in part to their strong clinical skills and diverse workplace experiences. The assumption is that these new educators will be immediately prepared to transfer their expertise to the next generation of professionals. However, the skills required to be an expert differ from those required to teach others to become experts. In fact, many clinicians begin academic careers without formal training in how to teach, often finding themselves unprepared for the challenges of their new roles as educators [1,2,3,4].

A lack of teaching skills has significant consequences for students and faculty members. Students may be adversely affected in their learning experiences, especially in the development of the critical thinking and problem-solving skills that current educational standards require to be demonstrated within the educational process [5,6,7,8]. Furthermore, in order to help students efficiently and effectively develop essential clinical and other desirable skills, faculty must understand the cognitive process and utilize current instructional methods. The lack of pedagogical training for new faculty may also affect their job satisfaction and retention, leading to frustration and burnout [9,10,11]. Health science faculty may face significant challenges when learning how to teach on the job and through trial and error [9, 12].

Without support from faculty development programs, educators may struggle to adopt contemporary pedagogical methods in their classrooms [13]. More specifically, they often fail to establish comprehensive learning goals that are tightly aligned with clear and measurable learning objectives [14]. They may also struggle to understand the implications of critical situational factors (elements of the learning situation such as the number of students, the time and duration of the class, the class subject, characteristics of the learners and instructor(s), and expectations of external groups including accreditation organizations) and their influence on the learning context of their courses, which can inadvertently create barriers in their students' learning experiences [15]. Faculty without foundational knowledge in teaching and learning often misalign assessments with the learning objectives, widening the gap between what students intend to learn and their ability to apply it in authentic settings [11]. Without connecting the assessments to the learning objectives, it is unlikely that instructors will be able to determine what knowledge and skills students have gained from completing the course. Similarly, faculty may overlook the issues caused by not developing appropriate learning activities to support the achievement of the stated learning objectives [7, 10]. As a result, students are often left on their own to learn the content. Or, even more detrimental to students' learning, the instructional activities might not adequately prepare them for high-stakes assessments [16]. In sum, the basic formula for effective instruction – the intentional alignment between the learning goal and objectives, learning activities, and assessments (both formative and summative) – is often missed.

Faculty Development in Healthcare Education

To address gaps in health sciences faculty's formal training in pedagogy and learning theory and to promote both successful student and patient outcomes, departments, institutions, and intramural organizations have created faculty development programs [5, 17,18,19,20]. These exist across a broad spectrum ranging from mandatory to voluntary and from short-term (a single event or short series of events) to long-term (longer than a year), and may be either discipline-specific or interdisciplinary [11, 18, 19, 21,22,23]. While the broader goals of these programs remain relatively consistent, the specific objectives may vary. Common themes include improving teaching effectiveness and promoting (both general and specific) learner-centered instructional approaches [5, 17, 18, 22], designing courses and developing curricula [17], providing feedback to students [17], and establishing faculty learning communities (FLCs) [5, 23]. Over a decade ago, McLean et al. described a progressive shift within the health sciences towards the rigorous evaluation of faculty development programs [19]. Common outcomes have included increased faculty confidence, use of student-centered approaches, empathetic instruction, and reflective teaching [18, 22, 24]. Although it is difficult to determine the precise impact of faculty development programs on the long-term development of educators' skills, student learning and retention, and ultimately patient outcomes, the effects of faculty development have nonetheless been described through both qualitative and quantitative data and may be most profound for educators early in their transition from the clinic to the classroom [11, 19, 21, 22, 24].

Training Faculty on the Course Design Process

One faculty development opportunity that has been implemented across a variety of institutions is an intervention often referred to as a Course Design Institute (CDI). A CDI typically provides a practical learning experience that introduces faculty to the principles of backward design. The motivation is that establishing a foundation in course design principles can empower educators to design effective courses and, ultimately, to deliver significant and impactful learning experiences for students [15, 25].

While the format of the CDI is often customized to the particular university or program, it is typically designed to offer an iterative, structured experience whereby participants learn about course design processes and principles. Over multiple days or weeks, a cohort of educators follows established methods to craft measurable learning objectives, design assessments and content, and select instructional feedback approaches to help students achieve the targeted course goals [13, 14, 26]. The CDI also provides opportunities for the participating educators to brainstorm and discuss best practices together and to receive feedback from their colleagues and the leaders of the CDI (who may be faculty, instructional designers, or other teaching and learning experts).

Educational Theoretical Framework

Backward design is an effective and widely used instructional design method that is often taught in CDIs [25]. Put simply, backward design follows a three-stage process by which courses are designed starting with the intended outcomes and working backward from there. The stages, in order, are (1) identifying desired results, (2) determining acceptable evidence, and (3) planning learning experiences and instruction [25]. Though this concept is not new, Wiggins and McTighe formalized the model in their seminal work Understanding by Design [25]. They emphasized developing an understanding of "big ideas," which they defined as "a concept, theme, or issue that gives meaning and connection to discrete facts and skills" (p. 5) [25]. They held that when instruction was focused on big ideas, it would center on the learner (i.e., what do students know?) rather than the instructor (i.e., what do I teach?) [25].

For this reason, backward design has been widely adopted in many contexts, including health sciences education [6, 15, 27, 28]. For example, Emory described how a backward design approach was employed at the University of Arkansas to transform a nursing course that taught students how to apply medical concepts to professional practice [27]. Just as the backward design process has been used to develop individual courses, it has also been implemented when planning instruction at the program level. For example, Wright et al. found that the backward design approach was helpful when redesigning the pharmacy education program at Auburn University [28].
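
To make the three stages concrete, below is a minimal illustrative sketch (not part of the study or of Wiggins and McTighe's work) that represents a backward-design plan as a small data structure: each desired result carries its acceptable evidence and planned learning experiences, so objectives that are not yet fully aligned can be flagged. All names and course content here are hypothetical.

```python
# Hypothetical sketch: modeling the three stages of backward design so that
# alignment between objectives, evidence, and activities can be checked.
from dataclasses import dataclass, field

@dataclass
class ObjectivePlan:
    objective: str                                        # Stage 1: desired result
    assessments: list[str] = field(default_factory=list)  # Stage 2: acceptable evidence
    activities: list[str] = field(default_factory=list)   # Stage 3: learning experiences

def unaligned(plans: list[ObjectivePlan]) -> list[str]:
    """Return objectives that still lack assessments or learning activities."""
    return [p.objective for p in plans if not (p.assessments and p.activities)]

course = [
    ObjectivePlan(
        objective="Differentiate formative from summative assessment",
        assessments=["Case-based quiz"],
        activities=["Think-pair-share critique of sample assessments"],
    ),
    ObjectivePlan(objective="Write measurable learning objectives"),  # not yet aligned
]

print(unaligned(course))  # -> ['Write measurable learning objectives']
```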

Overview of the Present Study

The purpose of this study was to evaluate the efficacy and impact of a Course Design Institute (CDI) training program implemented at a medium-sized health science university in the United States. The CDI, which was offered three times between May 2020 and August 2021, was customized to meet faculty development needs and to provide guidance on effective course design. Specifically, this study was designed to elucidate (1) what new knowledge faculty acquired about course design, (2) what skills related to course design and pedagogy were developed, and (3) faculty’s attitudes related to their teaching practices and their experiences participating in the CDI program.

Methods

Applying a mixed-methods approach, this retrospective study used performance scores, perception data, and focus groups to assess the impact of a faculty development intervention focused on course design. Quantitative data from learning tests, assignment grades, and self-assessment questionnaires were analyzed to determine participants' course design knowledge, skills, and attitudes, and qualitative data were analyzed for themes related to the reception and outcomes of the faculty development program.

Overview of the Course Design Institute (CDI)

The CDI model described herein was adapted from existing programs [13, 14, 26, 29] and sought to provide a professional development opportunity to an interdisciplinary cohort of health educators employed by a small, all-graduate level health sciences university. The course spanned seven weeks, and participants engaged in synchronous online meetings via Zoom video conferencing (Zoom Video Communications; San Jose, California) for 1.5 hours per week. All participants applied and were accepted into this voluntary course regardless of their prior teaching experience, the length of their courses, or their instructional modality (in-person, online, hybrid, or clinical). Instructional designers and faculty development experts taught the CDI, taking a systematic and facilitated approach to introduce the participants to learner-centered, backward course design. Weekly topics included creating measurable learning goals and objectives, selecting appropriate instructional strategies, creating a plan for instructional content, aligning assessments, and giving and receiving feedback. Participants were assigned weekly homework activities such as quizzes, discussions, and writing assignments to reinforce the course design concepts. For the culminating activity, each participant completed a learning artifact called the Course Design Blueprint to demonstrate their course design skills (Additional file 1). Participants iteratively developed their blueprints, which served as comprehensive proposals communicating each educator's plan for their course and the methods by which they sought to promote student learning outcomes. As they developed their course design plans, each member of the CDI cohort received personalized feedback from their peers and the course instructors. Additional long-term goals of the CDI were for participants to be able to repeat the process, utilize the backward design framework when designing other instruction, evaluate the components of their teaching and learning practices, and make evidence-based pedagogical decisions to best support their learners.

Participant Recruitment and Research Study Process

To explore the impacts of the CDI program on participants’ knowledge, skills, and attitudes, graduates of the program across the three cohorts from Summer 2020 to Summer 2021 were invited via e-mail to enroll in the IRB-approved study (IRB #2022–060). The only requirement for inclusion was the completion of the CDI. CDI participants who started the course but did not successfully finish it were excluded because the amount of the course they experienced varied significantly. The recruitment materials explained the purpose of the study, risks and benefits, and compensation. Those who enrolled were provided links to participate in a post-course research survey (Qualtrics; Provo, UT) and an online synchronous focus group. Study participants were also informed that educational artifacts they had previously submitted during the CDI would be evaluated for this study. Twelve CDI graduates enrolled and answered the research survey, and from this participant pool, ten chose to attend a focus group session. The demographic information of the participants is presented in Table 1.

Table 1 Summary of demographic information of participants

Materials and Instruments

Pre-/Post-Tests Examining Pedagogical Knowledge Gains

To examine the impact of the CDI on the participating educators, a pre-/post-style assessment was administered to measure knowledge gained as a result of participating in the course. The tests were delivered immediately before and after the course and aimed to evaluate participants' grasp of foundational course design concepts. The knowledge-based questions took the form of multiple-choice, fill-in-the-blank, and short-answer questions. Additional questions asked about perceptions of self-efficacy, particularly as they related to participants' ability to design and facilitate a learning-centered, evidence-based course. This approach, including the incorporation of self-reported outcomes, has been used extensively in teaching and learning studies to measure changes in knowledge, attitudes, and behaviors and has also previously been used to evaluate the outcomes of faculty development programs [13, 30].

Course Design Blueprint Scores

Participants’ final scores on their culminating project (the Course Design Blueprint) were evaluated to measure course design skills developed by the end of the CDI. The course instructors assessed these documents following a predetermined grading rubric (maximum score = 20 points; Additional file 1) which reflected evidence-based principles of sound course design.

Teaching Appraisal Inventory

A modified version of the Teaching Appraisal Inventory (TAI) used by Palmer et al. [13] was administered through the post-course research survey to assess the participants’ self-efficacy toward teaching concepts. The TAI was first developed by Balam [31] as a 46-item instrument aimed at uncovering instructors’ confidence in classroom teaching practices. Each item asked participants to rate their perceived confidence in a statement starting with “I think I can …” followed by a common classroom teaching practice on a 7-point Likert-type scale (i.e., 1 = Strongly disagree, 2 = Disagree, 3 = Somewhat disagree, 4 = Neither agree nor disagree, 5 = Somewhat agree, 6 = Agree, 7 = Strongly agree). Palmer et al. modified the instrument to group the 46 practices into seven overarching categories, used as subscales for measurement: goals and objectives, assessment, classroom management, learning activities, class facilitation, effective assignments, and overall teaching [13]. The version of the TAI used in the present study was modified slightly to reflect the content of the present CDI (Additional file 2). Nevertheless, the instrument used the same categories identified by Palmer et al. [13].
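
As an illustration of how sub-scale scores on a Likert-type instrument such as the TAI might be computed, the sketch below averages hypothetical 7-point ratings within an assumed item-to-subscale mapping. The item names, mapping, and data are invented for illustration and are not the study's actual instrument or scoring script.

```python
# Hypothetical sketch: computing per-subscale means and SDs from 7-point
# Likert responses (1 = strongly disagree ... 7 = strongly agree).
import pandas as pd

# Each row = one participant; each column = one instrument item.
responses = pd.DataFrame({
    "item_01": [6, 7, 5], "item_02": [6, 6, 7],  # assumed goals/objectives items
    "item_03": [5, 6, 6], "item_04": [7, 6, 6],  # assumed assessment items
})
subscales = {
    "goals_and_objectives": ["item_01", "item_02"],
    "assessment": ["item_03", "item_04"],
}

for name, items in subscales.items():
    per_participant = responses[items].mean(axis=1)  # mean across a subscale's items
    print(f"{name}: M = {per_participant.mean():.2f}, "
          f"SD = {per_participant.std(ddof=1):.2f}")
```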

CDI Satisfaction

The post-course research survey included two Likert-type questions and two open-ended questions to assess participants' satisfaction with the CDI, as well as demographic questions. The ratings from the Likert-type questions were used to measure how much participants enjoyed the course. Additionally, responses to the open-ended questions were analyzed by two authors (QC and BW) and coded for themes of what participants enjoyed most and least about the class.

Focus Groups

Focus groups were utilized to assess participants' attitudes toward implementing course design principles into their teaching practice [11, 21, 32]. Each of the three focus groups comprised 2–5 participants, grouped by availability rather than discipline or department, and was conducted using the virtual conferencing platform Zoom [11, 33, 34]. All focus groups were led by the same individuals (JS and DT) and followed identical protocols. Before the focus group, participants were instructed to review their course design blueprints from the CDI to prepare for the discussion. Upon entry into the Zoom room, participants were briefed on the purpose of the focus group and guidelines for the discussion. They were also informed that the session would be recorded for research purposes. Authors (JS and DT) then facilitated a 50–60-minute discussion, starting with general questions that became more specific throughout the session; follow-up questions were asked when needed to prompt participants to expand on their thoughts or experiences [34, 35]. The focus group questions (Additional file 2) were designed to promote discussion of faculty's teaching experiences since the CDI, how the CDI has informed their teaching practice, and the opportunities and barriers faced when making changes to the design of their courses.

Transcripts of the recorded focus groups were auto-generated via Zoom and then inspected and edited for accuracy. The data were analyzed with a descriptive coding technique in Atlas.ti software (ATLAS.ti Scientific Software Development GmbH; Berlin, Germany) to identify emergent themes and assign responses to one or more categories [11, 36]. Three research team members (QC, JS, DT) independently conducted a first pass of the coding and then examined their results for agreement. A codebook was then developed and used to complete the coding of the qualitative data. Anonymized quotes corresponding to particular themes were also identified to provide more context for the responses [13, 32, 37].

Data Visualization and Statistical Analysis

Unless otherwise noted, the data were analyzed and visualized using GraphPad Prism (version 9.4.1; GraphPad Software, San Diego, CA). One-tailed unpaired t-tests were used to evaluate the hypothesis that participation in the CDI would contribute to an increase in knowledge related to teaching and learning topics. Descriptive statistics, including the average blueprint score and the range amongst participants, were calculated to examine the ability of CDI graduates to apply the backward design process to their own instruction. Cronbach's alpha was calculated in SPSS to determine the internal consistency of responses to the modified TAI instrument, and means and standard deviations of the sub-scale scores were calculated to analyze the self-efficacy data. Furthermore, means and standard deviations of responses to the Likert-type survey questions were calculated to quantify these numerical results. Lastly, intraclass correlations (ICC) were calculated in Microsoft Excel (Microsoft Corporation, Redmond, WA) to determine the interrater reliability of the categorization of the open-ended survey question and focus group responses [38]. The ICC was used because it accommodates two or more raters and does not require absolute agreement [39].
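
For readers who want to reproduce analyses of this kind, the following sketch shows how the core statistics named above (a one-tailed unpaired t-test, Cronbach's alpha, and a consistency-type ICC following Shrout and Fleiss [38]) can be computed in Python. This is an assumed re-implementation with invented illustrative data, not the authors' actual Prism/SPSS/Excel workflow.

```python
# Minimal sketch of the described analyses (illustrative data throughout).
import numpy as np
from scipy import stats  # scipy >= 1.6 for the `alternative` argument

# One-tailed unpaired t-test: post-test hypothesized to exceed pre-test.
pre = np.array([4, 5, 6, 5, 4, 6, 5, 4, 5, 6, 5, 4], dtype=float)
post = np.array([6, 7, 7, 6, 6, 7, 6, 6, 7, 7, 6, 6], dtype=float)
t, p = stats.ttest_ind(post, pre, alternative="greater")

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency for a participants x items matrix of ratings."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def icc_consistency(ratings: np.ndarray) -> float:
    """ICC(3,1) (two-way mixed, consistency) for a subjects x raters matrix."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()  # between subjects
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()  # between raters
    ss_total = ((ratings - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Illustrative usage: Likert item matrix and quote counts from two coders.
tai_items = np.array([[7, 7, 6, 7], [5, 5, 5, 6], [6, 6, 6, 6], [4, 5, 4, 4]], dtype=float)
coders = np.array([[8, 7], [5, 6], [9, 9], [4, 5], [7, 6]], dtype=float)
print(f"t = {t:.2f}, p = {p:.3f}, alpha = {cronbach_alpha(tai_items):.2f}, "
      f"ICC = {icc_consistency(coders):.2f}")
```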

Results

Pre- and Post-tests

The results from the course knowledge assessment, as measured by a pre- and post-test (Fig. 1), indicated statistically significant learning gains from participation in the CDI. Immediately following the course, respondents showed increased ratings of their own knowledge of course design principles (Fig. 1a, p = 0.03) as well as an increased number of correct responses to multiple-choice questions related to pedagogy (e.g., "What is the difference between formative assessment and summative assessment?"; Fig. 1b, p = 0.01).

Fig. 1

Results of pre- and post-tests which contained Likert-type questions about perceptions of participants’ course design knowledge (A) and multiple-choice questions on pedagogical principles (B). For both plots, bars = mean ± standard deviation, each dot = data from 1 participant, * p < 0.05; unpaired t-test

Course Design Skills

Amongst the CDI participants who completed the program, course design skills, as measured by the Course Design Blueprint (Fig. 2a, Additional file 1), yielded scores ranging from 13 to 19.50. The average score was 16.57 (SD = 2.30), with 42% of the scores falling between 17 and 19.50 (Fig. 2b).

Fig. 2

The Course Design Blueprint contained sections that helped scaffold the course design process and included prompting questions to which the participants responded (A). The final blueprints were graded using a rubric with a maximum score of 20 (n = 12; B)

Teaching Appraisal Inventory as an Assessment of Self-efficacy

On a scale from 1 (strongly disagree) to 7 (strongly agree), participants' perceived self-efficacy in their classroom teaching practices averaged 6.27 (SD = 0.26; Fig. 3 and Additional file 3). The classroom environment subscale had the highest average score (6.45, SD = 0.44), while the assessment subscale had the lowest (6.08, SD = 0.20). A high overall Cronbach's alpha (α = 0.96) indicated strong reliability of the instrument in the context of this study.

Fig. 3

A Teaching Appraisal Inventory (TAI) was used to measure self-efficacy across 7 sub-scales. Bars = mean ± standard deviation, each dot = data from 1 CDI participant

Course Satisfaction

One hundred percent of participants indicated that they would recommend the course to other faculty and that they found it helpful for learning new knowledge and skills (Fig. 4a). On a scale of 1 (strongly disagree) to 7 (strongly agree), the average responses to the statements "I would highly recommend this course to other faculty" and "This course was helpful in enhancing my knowledge and skills in developing an evidence-based practice of course design" were 6.67 and 6.58, respectively.

Fig. 4

Responses (n = 12) to Likert-style questions on course satisfaction; response options ranged from strongly disagree to strongly agree (A). Responses (n = 12) to open-ended questions on the most and least enjoyed elements of the CDI were coded into categories (B; bar = mean ± standard deviation, each dot = number of quotes assigned by each coder)

Data from the open-response questions were coded by two authors (ICC = 0.54) and indicated the aspects of the course that participants liked most and least (Fig. 4b, Table 2). In particular, participants enjoyed the instructional approach used to facilitate the CDI and found many of the in-class activities, resources, and forms of instructional support beneficial. In contrast, participants least enjoyed certain instructional activities such as discussion boards, encountered scheduling conflicts or external barriers to their learning process, and expressed preferences related to the course format, particularly the duration and mode of the class.

Table 2 Responses to open-ended questions, “What did you enjoy the most about the CDI?” and “What did you enjoy the least about the CDI?”

Focus Group

Qualitative feedback from the focus group participants was coded by three authors (ICC = 0.92) and was found to fall into four major categories: 1) the impact of the CDI course on the participants themselves, 2) the impact of the CDI on others, 3) the perception of the CDI, and 4) barriers encountered when applying course design skills learned in the CDI (Fig. 5, Table 3).

Fig. 5

Transcripts from the focus groups were examined for common themes and quotes were coded into thematic categories (bar = mean ± standard deviation, each dot = number of quotes assigned to a given category by each coder)

Table 3 Example quotes by thematic categories from the focus group responses

On average, across the raters, the most common responses described the impact of the CDI on the participants themselves. Amongst these quotes, the majority pertained to new course design skills or principles learned and practiced during the CDI. In particular, participants described learning about central course design principles related to defining situational factors, writing learning objectives, aligning summative and formative assessments with content delivery, the value of instructional feedback, and incorporating educational technologies. Respondents also indicated that they gained skills and confidence in articulating, justifying, and advocating for student-centered learning practices, which they used in communicating with their supervisors, curriculum committees, or learners. Participants also described how engaging in the CDI helped them see their courses from their learners' perspectives and make course design decisions based on that knowledge. These responses demonstrate ways the CDI was perceived as contributing to professional development and as fostering a sense of personal accomplishment and fulfillment.

Faculty responses also indicated ways in which the CDI elicited effects that were translated to other courses and stakeholders (Table 3). For example, many of the responses on this theme described how CDI participants translated the course design skills they learned during the program to other courses they teach or how they taught their colleagues to use the course design process they learned. Additionally, participants reflected on how the CDI provided an opportunity for them to receive and give peer feedback and share practices used in their courses and disciplines with colleagues across diverse fields. Not only were the effects of the CDI felt by faculty, but respondents also shared how the knowledge they gained in the CDI contributed to improved student experiences. These outcomes ranged from increased engagement and knowledge retention to a sense of community and promotion of self-directed learning. Another collection of responses in this category pertained to CDI graduates recommending this course to and for other educators, with participants often commenting that the CDI should be required for all educators and curriculum committee members.

These positive recommendations were often accompanied by discussion of how the CDI fit with previous faculty development opportunities participants had encountered. For example, participants indicated that the CDI was valuable to those who had undergone a range of training - from prior workshops or seminars to post-professional certificate programs and advanced degrees in education (MEd or EdD). However, respondents also described opportunities for continuous improvement of the CDI. Common suggestions centered on the timing and schedule of the course as well as the need to prepare applicants with expectations for the program's rigor; the latter was commonly cited in conjunction with descriptions of the other demands on faculty's time and attention. Balancing job responsibilities was also discussed in relation to whether or not faculty were able to translate their blueprints into action following the CDI. While many respondents indicated that they implemented their course design plan partially or in full, other responses suggested that some faculty encountered barriers (Table 3). The most commonly cited barrier was human capital; in particular, faculty felt unable to implement their course design ideas due to time constraints, staffing shortages, and limited access to technical support. Another barrier was budgetary constraints. However, several faculty also indicated that their plans were met with resistance from their department (supervisors, curriculum committees, course directors, etc.). This may have been due to general hesitance around change (Table 3) or to push-back against ideas that had been attempted previously (at least in part) without achieving the desired impact on the first attempt.

Discussion

Faculty development opportunities are a commonly requested service across disciplines and can be particularly helpful for new academics and instructors transitioning from the clinic to the classroom [3, 17, 19, 32, 40, 41]. In response, the CDI model described in the present study drew upon the prior literature related to faculty development programs, along with the theories of spaced repetition [5, 42,43,44] and situated learning [45,46,47], to teach medical educators the process of backward design through a 7-week intervention in which each participant applied their learning directly to their own course planning. Here, the program's impact on the knowledge, skills, and attitudes of the participating faculty members was explored.

Knowledge

The participants in this study generally entered the CDI with extensive disciplinary training but widely varying teaching experience and formal training in pedagogy. Several participants indicated that they had engaged in prior faculty development or educational training (including but not limited to workshops, seminars, post-professional certificates in clinical education, and graduate-level programs in education). Nevertheless, the data from the pre-/post-tests and focus groups suggest that participants learned or deepened their understanding of topics including instructional alignment, learning goals and objectives, instructional strategies, assessment planning, feedback strategies, communication of expectations, and adult learning theories by participating in this course. These data corroborate results from other studies demonstrating that faculty development programs can effectively increase educators' knowledge [48,49,50,51].

Skills

Developing a foundation in pedagogy is a critical first step in empowering educators to design and implement student-centered instruction. Participants iteratively developed their course design plans throughout the CDI by working through a provided template (the Course Design Blueprint) and incorporating feedback from peers and instructors. The final blueprint scores indicate that the CDI graduates could apply a backward design process to their own courses. The range in scores, however, also suggests that a subset of participants experienced barriers when developing their blueprints. A common challenge was time constraints - both internal and external to the CDI. For example, one faculty member shared that the 7-week timeframe for the course felt too short to master the material, while several others described difficulties finding time to complete the course assignments due to other demands on their time (clinical hours, teaching load, department meetings, etc.). This sentiment is consistent with prior findings that have described time as a major barrier preventing faculty from engaging in professional development opportunities [32, 40]. Another challenge was that each participant had a particular context for their course design plan. For example, some focused on (re)designing seminar-style modules while others were planning 20-week courses. There was also a mix of participants who were creating courses from scratch and those who were redesigning an existing class. Additionally, certain participants were required to work within constraints set by their departments, such as required learning objectives to meet accreditation standards; others, however, were permitted the academic freedom to explore their own pedagogical choices.

Even when challenges were encountered, participants shared how they have continued to utilize the skills they developed in the CDI. Several educators indicated that they have since used the blueprint to design additional courses they teach. Others discussed how the blueprint provided them with a straightforward process for course design that has been particularly helpful when developing instruction with a co-instructor or other stakeholders. Faculty participants also discussed how they gained perspectives through this course that have allowed them to improve the usability of their instruction and develop confidence in justifying and advocating for student-centered learning practices. These findings indicate how the faculty have begun translating their skills beyond this training and into their instructional practice. This transfer of developed skills is critical for empowering faculty to adopt best practices. Previous studies have demonstrated that participation in faculty development opportunities can encourage educators to reflect on their teaching practices and philosophy, develop and implement more student-centered courses, and increase the alignment between learning objectives and classroom activities or assessments [13, 22, 29, 50]. In several cases, however, the educators discussed how barriers such as human resources, departmental resistance, and budgetary constraints prevented them from immediately employing all the skills they developed in the CDI. These findings indicate that faculty development opportunities are an important component of empowering individuals to shift the educational culture at institutions; however, other factors such as buy-in from key stakeholders, protected time for course design, and access to funding for new instructional techniques or technologies are also critical. Nevertheless, when faculty were able to implement student-centered practices, they described noticing improvements to the student learning experience and outcomes. While the impact on students was not measured directly in the present study, the faculty's descriptions of their everyday observations included increased engagement and retention of knowledge, improved capacity for self-directed learning, and development of test-taking skills.

Attitudes

Possessing knowledge and skills, however, does not necessarily correlate with motivation to apply them. Instead, self-efficacy is often used to describe the belief in one's ability to conduct behaviors in order to achieve desired performance [52, 53]. This is also related to motivation, persistence, performance, and professional identity [53,54,55,56]. While self-efficacy is a complicated and multifaceted concept, previous literature suggests that educators with less teaching experience may also have lower self-efficacy [57, 58], which has implications for faculty development programs. Several factors including vicarious experiences, mastery experiences, feedback (also described as verbal persuasion), and emotional arousal have been identified as critical for designing interventions that can increase self-efficacy amongst participants [53, 55]. Instructors of the CDI provided vicarious experiences by modeling learner-centered instructional practices throughout the course and discussing their own teaching and learning experiences. Participants also had opportunities to engage in mastery experiences as they iteratively developed their course design blueprints and submitted a polished final version. Feedback (participant-participant and instructor-participant) and encouragement were provided frequently throughout the course via verbal dialogue, in-class activities, and written feedback on homework assignments and discussion boards. Faculty development sessions can be vulnerable spaces for participants, as it can be uncomfortable to ask questions in front of one's peers or to acknowledge what is unknown related to one's current occupation. Therefore, the instructors endeavored to make the course engaging and dynamic while promoting a "brave" learning environment where participants were encouraged to draw from their previous experiences, discuss successes and challenges with peers, and try new methods. Together, these elements of the course may have contributed to the observed self-efficacy measures and the sense of personal growth amongst the CDI participants, outcomes that have also been noted in faculty development programs offered through other institutions [13, 29].

Another critical outcome measure is satisfaction with the program, particularly given that participation in this CDI was voluntary. The data from both the survey and the focus groups suggested that participants were satisfied with the experience and appreciated that the course was relevant to them as educators in the health sciences and provided engaging instructional activities, resources, and support in ways that appropriately scaffolded the learning. Participants noted that they would not only recommend the program to a colleague but also believe it should be a required course for all educators at the institution. Several participants remarked on how they appreciated opportunities to collaborate and learn amongst colleagues from other disciplines. Another common theme of the discussion was how this experience compared to other faculty development opportunities. The feedback suggested that the CDI was perceived as a critical learning opportunity for all participants regardless of their prior training. This finding was somewhat surprising, given that the course was designed to be introductory and approximately half of the participants had previously completed graduate-level coursework (including masters or doctorates of education) or post-professional certifications in education. Comments also suggested that faculty participants often felt uncomfortable providing feedback to their peers through the discussion board or peer reviews and found these activities to be some of the least effective in the course. This response was partly due to faculty feeling that they were still learning the topics themselves and therefore not yet feeling confident in giving feedback to others. These data suggest that faculty desire continuing education in pedagogy, much like many healthcare practitioners are required to complete annual training in their respective disciplines.

Another common throughline in the feedback on the CDI related to timing and balancing efforts in the class with other tasks and job responsibilities. Interestingly, while several comments described difficulty budgeting time to attend class virtually, others suggested they would like the CDI to be longer or would prefer to meet in person (which can increase engagement but also increases the time required to travel to and from class). Additionally, several respondents discussed feeling overwhelmed with completing the course planning while also being grateful for the accountability this class provided. This feedback might allude to a more profound challenge facing academics [59, 60], particularly those in the healthcare fields [61, 62], who may balance clinical duties along with other core job responsibilities including teaching, service, and research. While faculty development opportunities alone may not be a sufficient antidote to the problem [61], it can be helpful to bring instructors together to learn from one another and to provide continuous training and support on tasks they might otherwise be tackling alone.

Limitations and Future Directions

The present study contained several limitations and also identified future directions for continuing research. One of the major limitations is the small sample size used in this study and the demographics of this group, which did not represent the diversity within the health science disciplines. Therefore, additional research is needed to explore the impacts of this CDI model in a larger, more diverse cohort and to identify whether the model is transferable to other healthcare settings. Additionally, the study was conducted at a single timepoint and from the perspective of only the CDI participants. While this offers an opportunity for the educators to reflect on their experiences in the course and how they have employed their knowledge, skills, and attitudes, it is also a limitation because there was distance between the learning intervention and several of the measured outcomes. Future research could incorporate additional instruments into the pre-/post-tests (such as the TAI) to more specifically quantify the changes promoted by the CDI, and a longitudinal component might serve to elucidate the relative impact of each learning activity on the application of new course design knowledge, skills, and attitudes (including confidence, self-efficacy, and teaching philosophies). Additionally, there is an opportunity to study the impact this training had on the career outcomes and trajectories of participants. Incorporating viewpoints from various stakeholders, such as students and department chairs, could also be helpful in robustly measuring how the CDI positively impacts student skill development and learning experiences. Lastly, the study included only graduates of the program, and additional insight might be gleaned from faculty members who began the program but did not complete the CDI or who chose not to enroll. Research in this area might be of particular interest, given that modest participation rates in faculty development programs are notable across higher education (and vary by demographic variables) though the reasons for this are largely unknown [32, 63]. Additional studies may also help to identify potential barriers to participation and explore the factors that contribute to motivation to engage (or not engage) in faculty development, despite evidence that supports the need for continual training that would help improve educators' teaching knowledge, skills, and performances.

Citation Diversity Statement

The scholarship of individuals with one or more minoritized identities is often under-cited relative to the number of manuscripts published in a given discipline [64, 65]. We believe that it is important to recognize that citation bias exists and has harmful impacts. As such, we sought to include references in this paper that reflect the diversity of scholars in this field (including, but not limited to, gender diversity, ethnicity, training, and background).

Conclusion

The results of this study indicate that the CDI was influential in developing the faculty's knowledge of the course design process, promoted the application of course design and pedagogy skills amongst CDI graduates, and positively impacted their self-reported attitudes about their teaching abilities. Feedback from participants demonstrates that they recognized the value of this program in their development and would recommend it to colleagues. The findings suggest that providing faculty with structured, dedicated time for professional development empowered participants to learn and apply student-centered, evidence-based practices in their instruction in ways that can benefit students, other faculty, and the university as a whole. Together, these results provide evidence of the efficacy of this CDI model, which may be transferable to other institutions, particularly those centered around the health sciences.

Availability of data and materials

The data used and analyzed as part of this study are available upon reasonable request to the corresponding author.

Abbreviations

CDI: Course Design Institute

TAI: Teaching Appraisal Inventory

References

  1. Clark JM, Houston TK, Kolodner K, Branch WT, Levine RB, Kern DE. Teaching the teachers. J Gen Intern Med. 2004;19(3):205–14. https://doi.org/10.1111/j.1525-1497.2004.30334.x.

    Article  Google Scholar 

  2. Houston TK, Ferenchick GS, Clark JM, Bowen JL, Branch WT, Alguire P, et al. Faculty development needs. J Gen Intern Med. 2004;19:375–9. https://doi.org/10.1111/j.1525-1497.2004.30619.x.

    Article  Google Scholar 

  3. Huwendiek S, Mennin S, Dern P, Ben-David MF, Van Der Vleuten C, Tönshoff B, et al. Expertise, needs and challenges of medical educators: results of an international web survey. Med Teach. 2010;32(11):912–8. https://doi.org/10.3109/0142159X.2010.497822.

    Article  Google Scholar 

  4. Nothman S, Kaffman M, Nave R, Flugelman MY. Survey of faculty development in four Israeli medical schools: clinical faculty development is inadequate and clinical teaching is undervalued in Israeli faculties of medicine. Isr J Health Policy Res. 2021;10(1):1–12. https://doi.org/10.1186/s13584-021-00438-0.

    Article  Google Scholar 

  5. Drummond-Young M, Brown B, Noesgaard C, Lunyk-Child O, Maich NM, Mines C, et al. A comprehensive faculty development model for nursing Education. J Prof Nurs. 2010;26(3):152–61. https://doi.org/10.1016/j.profnurs.2009.04.004.

    Article  Google Scholar 

  6. Leite Â, Soares D, Sousa HFP, e, Vidal DG, Dinis MAP, Dias D. For a healthy (and) Higher Education: evidences from learning outcomes in health sciences. Educ Sci. 2020;10(6):168. https://doi.org/10.3390/educsci10060168.

    Article  Google Scholar 

  7. Muammar OM, Alkathiri MS. What really matters to faculty members attending professional development programs in higher education. Int J Acad Dev. 2021:1–13. https://doi.org/10.1080/1360144X.2021.1897987.

  8. The Higher Learning Commission https://www.hlcommission.org/ ().

  9. Chiariello B, Shaulov S, Chambers K, Ratner A. The transition from expert clinician to novice academician: the first three years. Am J Occup Ther. 2020;74(S1):7411505122p1. https://doi.org/10.5014/ajot.2020.74S1-PO3026.

    Article  Google Scholar 

  10. Miller-Young J, Poth CN. ‘Complexifying’ our approach to evaluating educational development outcomes: bridging theoretical innovations with frontline practice. Int J Acad Dev. 2021:1–14. https://doi.org/10.1080/1360144X.2021.1887876.

  11. Smethers RD, Smallidge DL, Giblin-Scanlon LJ, Perry KR. Experiences and challenges of clinical dental hygienists transitioning into teaching roles. J Dent Hyg. 2018;92(9):40–6.

    Google Scholar 

  12. Stenson S. From the clinic to the classroom: reflections of an experienced clinician transitioning to a novice academic. Aust Nurs Midwifery J. 2020;26(9):39.

    Google Scholar 

  13. Palmer MS, Streifer AC, Williams-Duncan S. Systematic Assessment of a High–Impact Course Design Institute. To Improv Acad. 2016;35(2):1–25. https://doi.org/10.3998/tia.17063888.0035.203.

    Article  Google Scholar 

  14. Meixner C, Altman M, Good MR, Ben WE, Altman M, Good M, et al. Longitudinal impact of faculty participation in a course design institute (CDI) faculty motivation and perception of expectancy, value, and cost. To Improv Acad. 2021;40(1):49–74. https://doi.org/10.3998/tia.959.

    Article  Google Scholar 

  15. Fink LD. Creating significant learning experiences: an integrated approach to designing college courses. San Francisco, CA: Jossey-Bass; 2013.

    Google Scholar 

  16. Daumiller M, Rinas R, Olden D, Dresel M. Academics’ motivations in professional training courses: effects on learning engagement and learning gains. Int J Acad Dev. 2021;26:7–23. https://doi.org/10.1080/1360144X.2020.1768396.

    Article  Google Scholar 

  17. Behar-Horenstein LS, Garvan CW, Catalanotto FA, Su Y, Feng X. Assessing faculty development needs among Florida’s allied dental faculty. J Dent Hyg. 2016;90:52–9.

    Google Scholar 

  18. Leslie K, Baker L, Egan-Lee E, Esdaile M, Reeves S. Advancing faculty development in medical Education. Acad Med. 2013;88:1038–45. https://doi.org/10.1097/ACM.0b013e318294fd29.

    Article  Google Scholar 

  19. McLean M, Cilliers F, Van Wyk JM. Faculty development: yesterday, today and tomorrow. Med Teach. 2008;30:555–84. https://doi.org/10.1080/01421590802109834.

    Article  Google Scholar 

  20. Searle NS, Thibault GE, Greenberg SB. Faculty development for medical educators: current barriers and future directions. Acad Med. 2011;86:405–6. https://doi.org/10.1097/ACM.0b013e31820dc1b3.

    Article  Google Scholar 

  21. Burgess A, Matar E, Neuen B, Fox GJ. A longitudinal faculty development program: supporting a culture of teaching. BMC Med Educ. 2019;19:1–9. https://doi.org/10.1186/S12909-019-1832-3/TABLES/3.

    Article  Google Scholar 

  22. Chia CF, Nadarajah VD, Lim V, Kutzsche S. Transfer of knowledge, skills and confidence from a faculty development programme for health professions educators into practice. Med Teach. 2021;43:S46–52. https://doi.org/10.1080/0142159X.2020.1776239.

    Article  Google Scholar 

  23. Fidler DC, Khakoo R, Miller LA. Teaching scholars programs: faculty development for educators in the health professions. Acad Psychiatry. 2007;31:472–8. https://doi.org/10.1176/appi.ap.31.6.472.

    Article  Google Scholar 

  24. Payne EK, Walker SE, Mazerolle S. Exploring athletic training educators’ development as teachers article in athletic training. Educ J. 2017. https://doi.org/10.4085/1202134.

  25. Wiggins G, McTighe J. Understanding by Design. 2nd ed. Alexandria, VA: Association for Supervision and Curriculum Development; 2005.

    Google Scholar 

  26. Wheeler LB, Bach D. Understanding the impact of educational development interventions on classroom instruction and student success. Int J Acad Dev. 2021;26:24–40. https://doi.org/10.1080/1360144X.2020.1777555.

    Article  Google Scholar 

  27. Emory J. Understanding backward design to strengthen curricular models. Nurse Educ. 2014;39:122–5. https://doi.org/10.1097/NNE.0000000000000034.

    Article  Google Scholar 

  28. Wright BM, Hornsby L, Marlowe KF, Fowlin J, Surry DW. Innovating pharmacy curriculum through backward design. TechTrends. 2018;62:224–9. https://doi.org/10.1007/s11528-018-0283-8.

    Article  Google Scholar 

  29. Favre DE, Bach D, Wheeler LB. Measuring institutional transformation: a multifaceted assessment of a new faculty development program. J Res Innov Teach Learn. 2021;14:378–98. https://doi.org/10.1108/JRIT-04-2020-0023.

    Article  Google Scholar 

  30. Colosi L, Dunifon R. What’s the difference? “Post then pre” & “Pre then Post”: Cornell Cooperative Extension; 2006.

    Google Scholar 

  31. Balam EM. Professors’ teaching effectiveness in relation to self-efficacy beliefs and perceptions of student rating myths: Auburn University; 2006.

    Google Scholar 

  32. Steinert Y, Macdonald ME, Boillat M, Elizov M, Meterissian S, Razack S, et al. Faculty development: if you build it, they will come. Med Educ. 2010;44:900–7. https://doi.org/10.1111/j.1365-2923.2010.03746.x.

    Article  Google Scholar 

  33. Kite J, Phongsavan P. Insights for conducting real-time focus groups online using a web conferencing service. F1000Res. 2017;6:122. https://doi.org/10.12688/f1000research.10427.1.

    Article  Google Scholar 

  34. Côté-Arsenault D, Morrison-Beedy D. Practical advice for planning and conducting focus groups. Nurs Res. 1999;48:280–3. https://doi.org/10.1097/00006199-199909000-00009.

    Article  Google Scholar 

  35. Gill P, Stewart K, Treasure E, Chadwick B. Methods of data collection in qualitative research: interviews and focus groups. Br Dent J. 2008;204:291–5. https://doi.org/10.1038/bdj.2008.192.

    Article  Google Scholar 

  36. Blair E. A reflexive exploration of two qualitative data coding techniques. J Methods Meas Soc Sci. 2015;6:14–29.

    Google Scholar 

  37. Speer JE, Lyon M, Johnson J. Gains and losses in virtual mentorship: a descriptive case study of undergraduate mentees and graduate mentors in STEM research during the COVID-19 pandemic. CBE—life. Sci Educ. 2021;20:ar14. https://doi.org/10.1187/cbe.20-06-0128.

    Article  Google Scholar 

  38. Shrout PE, Fleiss JL. Intraclass correlations: uses in assessing rater reliability. Psychol Bull. 1979;86:420–8. https://doi.org/10.1037/0033-2909.86.2.420.

    Article  Google Scholar 

  39. Landers R. Computing Intraclass correlations (ICC) as estimates of interrater reliability in SPSS. The Winnow. 2015:e143518.81744. https://doi.org/10.15200/winn.143518.81744.

  40. Brownell SE, Tanner KD. Barriers to faculty pedagogical change: lack of training, time, incentives, and … tensions with professional identity? CBE—life. Sci Educ. 2012;11:339–46. https://doi.org/10.1187/cbe.12-09-0163.

    Article  Google Scholar 

  41. Silander C, Stigmar M. What university teachers need to know - perceptions of course content in higher education pedagogical courses. Int J Acad Dev. 2021:1–14. https://doi.org/10.1080/1360144X.2021.1984923.

  42. Kang SHK. Spaced repetition promotes efficient and effective learning. Policy Insights from Behav Brain Sci 2016;3:12–19. https://doi.org/https://doi.org/10.1177/2372732215624708.

  43. Pernar LIM, Corso K, Lipsitz SR, Breen E. Using spaced education to teach interns about teaching skills. Am J Surg. 2013;206:120–7. https://doi.org/10.1016/j.amjsurg.2012.05.034.

    Article  Google Scholar 

  44. Phillips JL, Heneka N, Bhattarai P, Fraser C, Shaw T. Effectiveness of the spaced education pedagogy for clinicians’ continuing professional development: a systematic review. Med Educ. 2019;53:886–902. https://doi.org/10.1111/medu.13895.

    Article  Google Scholar 

  45. Abigail LKM. Do communities of practice enhance faculty development? Heal Prof Educ. 2016;2:61–74. https://doi.org/10.1016/j.hpe.2016.08.004.

    Article  Google Scholar 

  46. Anderson JR, Reder LM, Simon HA. Situated Learning and Education Educ Res. 1996;25:5–11. https://doi.org/10.3102/0013189X025004005.

    Article  Google Scholar 

  47. Eddy PL, Hao Y, Markiewicz C, Iverson E. Faculty change agents as adult learners: the power of situated learning. Community Coll J Res Pract. 2019;43:539–55. https://doi.org/10.1080/10668926.2018.1507848.

    Article  Google Scholar 

  48. Cole KA, Barker LR, Kolodner K, Williamson P, Wright SM, Kern DE. Faculty development in teaching skills: an intensive longitudinal model. Acad Med. 2004;79:469–80. https://doi.org/10.1097/00001888-200405000-00019.

    Article  Google Scholar 

  49. Dennick R. Long-term retention of teaching skills after attending the teaching improvement project: a longitudinal, self-evaluation study. Med Teach. 2003;25:314–8. https://doi.org/10.1080/0142159031000100436.

    Article  Google Scholar 

  50. Felder RM, Brent R. The National Effective Teaching Institute: assessment of impact and implications for faculty development. J Eng Educ. 2010;99:121–34. https://doi.org/10.1002/j.2168-9830.2010.tb01049.x.

    Article  Google Scholar 

  51. Gozu A, Windish DM, Knight AM, Thomas PA, Kolodner K, Bass EB, et al. Long-term follow-up of a 10-month programme in curriculum development for medical educators: a cohort study. Med Educ. 2008;42:684–92. https://doi.org/10.1111/j.1365-2923.2008.03090.x.

    Article  Google Scholar 

  52. Artino AR. Academic self-efficacy: from educational theory to instructional practice. Perspect Med Educ. 2012;1:76–85. https://doi.org/10.1007/s40037-012-0012-5.

  53. Bandura A. Self-efficacy mechanism in human agency. Am Psychol. 1982;37:122–47. https://doi.org/10.1037/0003-066X.37.2.122.

  54. Canrinus ET, Helms-Lorenz M, Beijaard D, Buitink J, Hofman A. Self-efficacy, job satisfaction, motivation and commitment: exploring the relationships between indicators of teachers’ professional identity. Eur J Psychol Educ. 2012;27:115–32. https://doi.org/10.1007/s10212-011-0069-2.

  55. Schunk DH. Self-efficacy, motivation, and performance. J Appl Sport Psychol. 1995;7:112–37. https://doi.org/10.1080/10413209508406961.

  56. Soemantri D, Findyartini A, Greviana N, Mustika R, Felaza E, Wahid M, et al. Deconstructing the professional identity formation of basic science teachers in medical education. Adv Health Sci Educ. 2022. https://doi.org/10.1007/s10459-022-10150-6.

  57. Klassen RM, Chiu MM. Effects on teachers’ self-efficacy and job satisfaction: teacher gender, years of experience, and job stress. J Educ Psychol. 2010;102:741–56. https://doi.org/10.1037/a0019237.

  58. Wolters CA, Daugherty SG. Goal structures and teachers’ sense of efficacy: their relation and association to teaching experience and academic level. J Educ Psychol. 2007;99:181–93. https://doi.org/10.1037/0022-0663.99.1.181.

  59. Anees RT, Heidler P, Cavaliere LPL, Nordin NA. Brain drain in higher education: the impact of job stress and workload on turnover intention and the mediating role of job satisfaction at universities. Eur J Bus Manag Res. 2021;6:1–8. https://doi.org/10.24018/ejbmr.2021.6.3.849.

  60. Fernández-Suárez I, García-González MA, Torrano F, García-González G. Study of the prevalence of burnout in university professors in the period 2005–2020. Educ Res Int. 2021;2021:1–10. https://doi.org/10.1155/2021/7810659.

  61. El-Ibiary SY, Yam L, Lee KC. Assessment of burnout and associated risk factors among pharmacy practice faculty in the United States. Am J Pharm Educ. 2017;81:a75. https://doi.org/10.5688/ajpe81475.

  62. Hosseini M, Soltanian M, Torabizadeh C, Shirazi ZH. Prevalence of burnout and related factors in nursing faculty members: a systematic review. J Educ Eval Health Prof. 2022;19:16. https://doi.org/10.3352/jeehp.2022.19.16.

  63. de Vries S, Jansen EPWA, van de Grift WJCM. Profiling teachers’ continuing professional development and the relation with their beliefs about learning and teaching. Teach Teach Educ. 2013;33:78–89. https://doi.org/10.1016/j.tate.2013.02.006.

  64. Zurn P, Bassett DS, Rust NC. The citation diversity statement: a practice of transparency, a way of life. Trends Cogn Sci. 2020;24:669–72. https://doi.org/10.1016/j.tics.2020.06.009.

  65. Dworkin JD, Linn KA, Teich EG, Zurn P, Shinohara RT, Bassett DS. The extent and drivers of gender imbalance in neuroscience reference lists. Nat Neurosci. 2020;23:918–26. https://doi.org/10.1038/s41593-020-0658-y.


Acknowledgments

We would like to recognize and thank the participants of the Course Design Institute for sharing their time and perspectives throughout this study.

Author Biographies

Julie Speer, PhD: Dr. Julie Speer is an instructional designer in the Teaching & Learning Center at A.T. Still University. She is passionate about supporting faculty in developing and implementing effective, equitable, and inclusive learning environments. Her research interests include mentorship, faculty development, reflective teaching, and inclusive pedagogy. Dr. Speer earned her Ph.D. and M.S. in Biomedical Engineering from Washington University in St. Louis and a B.S. and Certificate in Medical Humanities from Drexel University.

Quincy Conley, PhD: Dr. Conley has been a dedicated instructional designer and educational researcher for over 20 years. His primary expertise is selecting the combination of instructional design techniques best suited to creating materials that advance targeted learning goals. His current research interests are in course design fundamentals, designing augmented reality learning experiences, and biometrics research methodology. He earned his Ph.D. from Arizona State University and his M.S. in Instructional Design & Technology and B.S. in Aerospace Science from the University of North Dakota.

Derek Thurber, MS: Derek Thurber is a learning designer and researcher with expertise in higher education curricula, program evaluation, and faculty development. He has worked across multiple institutions, providing individualized educational and organizational development services that help faculty improve the quality of their curricula, teaching, and support of student learning. Derek earned his M.S. in Higher Education Administration and Policy from Northwestern University.

Brittany Williams, MS: Brittany Williams is Assistant Director of the Teaching & Learning Center at A.T. Still University. She has over ten years of experience working with faculty and students in three different higher education settings. She is currently a member of the POD Network and Past Chair of the Healthcare Educational Development (HED) Special Interest Group. Her current research interests are in course design and scenario-based eLearning. She earned her M.S. in Organizational Learning and Leadership with a specialization in Higher Education from Barry University.

Mitzi Wasden, DDS, MS: Dr. Mitzi Wasden is an assistant professor in the department of Pediatric Dentistry at the Arizona School of Dentistry and Oral Health within A.T. Still University. She is a board-certified pediatric dentist and serves as an examiner for the Board. After enjoying 20 years as a private practice pediatric dentist, she sold her practice to follow her passion for teaching. She is committed to learning current pedagogical methods and evidence-based teaching strategies to strengthen her skills as an instructor. Dr. Wasden earned her D.D.S. and M.S. from The Ohio State University and a certificate in Pediatric Dentistry from Nationwide Children's Hospital in Columbus, Ohio.

Brenda Jackson, BS: Brenda Jackson is Senior Administrative Assistant for Academic Affairs, including the Teaching & Learning Center, at A.T. Still University and has six years of experience working in higher education. She earned her B.S. from Mars Hill College in Mars Hill, North Carolina.

Funding

Funding for this study was provided by the A.T. Still University Small Grant Program and was used for participant incentives for the post-course research survey and focus group.

Author information

Contributions

JS, QC, BW, DT, and MW designed and performed research, analyzed data, and wrote the manuscript. BJ contributed to data collection and manuscript development. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Quincy Conley.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the A.T. Still University Arizona Institutional Review Board (IRB #2022–060), and all methods were conducted in accordance with the relevant guidelines and regulations. Participation was voluntary, and the purposes of the research and participants' rights were explained before any data were collected. Participants were informed that they could decline to take part and that, if they did participate, all data would be treated confidentially. Informed consent was obtained from all participants included in the study.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Additional file 2.

Additional file 3.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Speer, J., Conley, Q., Thurber, D. et al. A mixed-methods study of the effectiveness and perceptions of a course design institute for health science educators. BMC Med Educ 22, 873 (2022). https://doi.org/10.1186/s12909-022-03910-w


Keywords