Comparing student outcomes in traditional vs intensive, online graduate programs in health professional education

Abstract

Background

Health professions’ education programs are undergoing enormous changes, including increasing use of online and intensive, or time-reduced, courses. Although evidence is mounting for online and intensive course formats as separate designs, literature investigating combined online and intensive formats in health professional education is lacking. The purpose of the study was to compare student outcomes (final grades and course evaluation ratings) for equivalent graduate health sciences courses offered in semester long (15-week) versus intensive (7-week) online formats.

Methods

This retrospective, observational study compared satisfaction and performance scores of students enrolled in three graduate health sciences programs at a large, urban US university. Descriptive statistics, chi-square analyses, and independent t-tests were used to describe the student samples and to determine differences in student satisfaction and performance.

Results

The results demonstrated no significant differences between semester long and intensive course formats for the four applicable items on the final student course evaluations (p values ranged from 0.127 to 1.00). Similarly, student performance scores for the final assignment and final course grades showed no significant differences between formats (p = 0.35 and p = 0.690, respectively).

Conclusion

Findings from this study suggest that 7-week and 15-week online courses can be equally effective with regard to student satisfaction and performance outcomes. While further study is recommended, academic programs should consider intensive online course formats as an alternative to semester long online course formats.

Background

Health professions’ education programs are undergoing enormous changes that are, in part, reactions to changes in students’ preferences and demographics as well as to increasing technological advances. Recent evidence suggests that current adult students expect flexibility in the delivery mode and structure of undergraduate and graduate education [1,2,3]. These expectations include the use of online delivery models and intensive course structures. These expectations also impact health and medical professional education, as online and intensive courses (ICs) are increasingly being implemented in various health professions’ curricula [4,5,6].

Recently, student registration in online courses has increased significantly. In 2015, 29.7% of all students in US higher education were taking at least one distance education course, representing a 3.9% increase from the previous year [7]. Research on the advantages of online education helps explain this increase. Adult learners enroll in online programs for increased accessibility, flexibility in delivery mode, and self-direction in the process of learning [8]. Within health and medical professional education, authors confirm increased accessibility, flexibility, and self-direction as benefits of online learning, but also note additional benefits such as increased interactivity among participants and cost savings [9,10,11]. With regard to learning in health professions and medical education, Cook et al. [12] conducted a systematic review of studies on the effects of online instruction and learning compared to studies with no online component. The authors found a large positive effect of online courses compared to no instruction and similar effectiveness compared to more traditional delivery methods. Reis et al. [13] investigated 40 medical students who learned urology content either in a face-to-face lecture format or through student-centered group discussions in a 4-week online course using the Moodle platform (modular object-oriented dynamic learning environment). The results demonstrated that 86% of the students thought the online course was superior to the face-to-face delivery method. Specifically, the online content was better at “encouraging and motivating learning,” “arousing interest in the topic,” and “fostering teacher and student interactions” (p. 152). Interestingly, the authors found a smaller range of grades on the final “increment in learning” assessment for students in the online format (7.0–9.7) than in the face-to-face delivery format (4.0–9.6).

In adopting online delivery models, institutions of higher education have also been condensing the delivery time of courses, both online and face-to-face, and ICs have been increasingly adopted [14,15,16]. ICs are defined as courses that deliver the amount of content typically presented in a traditional 15- or 16-week semester in an intensive (time-reduced) period [15, 17,18,19]. Fanjoy [20] reported that online IC offerings at 67 public, four-year institutions increased from 22 to 36% between the 2007 and 2008 summer sessions. Like online learning, ICs appeal to the growing number of non-traditional students who have difficulty meeting the demands of courses of more traditional length or delivery method [8, 14]. These non-traditional students tend to be “slightly older and working students, with slightly higher GPAs than students in traditional courses” ([16], p. 1109). The literature suggests that non-traditional students prefer ICs for their convenience [14], efficient use of student time [21], and shorter time to completion [22]. In addition, faculty and students think ICs promote a “continuous learning experience” that enables a more intense connection to the content because students focus on fewer classes at one time [14].

Still, higher education research suggests drawbacks to the IC structure and offers inconclusive evidence regarding overall effectiveness and student satisfaction. ICs require a more concentrated effort in a shorter amount of time, thus reducing the time students have to review and learn course material and complete assignments [14, 23]. In addition, some researchers suggest this shortened time-frame may be related to increases in student-reported stress associated with ICs as compared to traditional-length courses [14, 15, 23, 24]. Further, studies comparing ICs to semester long courses remain inconclusive. Kucsera and Zimmaro [18] report no significant difference in instructor ratings between online ICs and semester long courses. In reviewing the literature, Hall et al. [16] suggest that a majority of investigations comparing ICs to courses of traditional length demonstrate that ICs are more often associated with student success; however, a significant proportion of other studies show no difference. Results on student satisfaction are also mixed. Wlodkowski et al. [25] report that students’ overall attitudes toward ICs were positive in comparison to semester long courses. Whillier et al. [26] note equivalent findings regarding student satisfaction, whereas Mishra et al. [23] find students mostly dissatisfied with ICs. It is important to note an absence of comparative research on ICs delivered online versus semester long courses delivered online.

When considering adoption of ICs in health professions education, Sonnadara and colleagues [27] found that a face-to-face IC at the beginning of a first-year orthopedic residency was “highly effective” in teaching targeted surgical skills. On the other hand, Whillier and Lystad [26] concluded that a cohort of students who took a semester long course attained significantly higher final grades than a cohort taught the same content in an IC. Regarding test scores, some evidence reports comparable results between ICs and semester long courses [14] and some reports slightly higher results for ICs [28, 29]. Yet, Petrowsky [30] found that students in ICs performed worse on comprehensive examinations. However, to date, there is an absence of research in health professions education comparing online ICs to semester long courses delivered in either an online or face-to-face model.

As online health professional programs consider transitioning from a semester long course model to an IC format, further research is necessary to clarify the effects of the transition on student performance and satisfaction. Hence, the purpose of this paper is to compare student outcomes (performance and satisfaction) in equivalent, graduate-level health science courses offered in a 7-week intensive (IC) online format and a 15-week semester long online format.

Methods

Study aims

The aims of the study were:

  1. To compare student course evaluation scores for equivalent courses in semester long versus intensive formats in online, graduate level health sciences courses.

  2. To compare student performance scores for equivalent courses in semester long versus intensive formats in online, graduate level health sciences courses.

Study design

This was a retrospective, observational study. A convenience sample was selected from three health sciences graduate programs at the George Washington University School of Medicine and Health Sciences, a large urban US university. Each program offers an 18-credit graduate certificate and a 36-credit Master of Science in Health Sciences (MSHS) degree. Table 1 describes each program briefly.

Table 1 Health Sciences Graduate Program Descriptions

Each program underwent separate but related processes to condense the 15-week semester long curricula to a 7-week intensive format. The curricular changes were coordinated among programs through a steering committee; however, individual programmatic changes were allowed to meet each program’s student outcomes and curricular needs. Students were transitioned into the new structure as they entered the program, so they did not select the curriculum format. The semester long and IC versions of the programs both used asynchronous teaching methods. The IC courses were designed around student cohorts, with students taking one course at a time in a rigid sequence of courses and with greater use of online technology; the 15-week programs, by contrast, allowed students to take two courses a semester with greater flexibility in course sequencing. The details of the associated changes to pedagogy and curriculum are more fully described in a companion publication [31]. The Institutional Review Board approved this study as exempt.

Course selection

To compare the effects of the curricular changes, courses within each program were selected for comparison based on two inclusion criteria. First, the course instructor was the same for both versions of the course (i.e., 7-week and 15-week). Second, the 7-week and 15-week versions of each course had the same or similar learning objectives, course content, and final assessment. These criteria were applied to control for potential differences in faculty instruction and course assignments. The final assessment in each course was a written, evidence-based paper. A total of seven pairs of courses were selected for comparison: one pair from the clinical research administration (CRA) program, two from health care quality (HCQ), and three from regulatory affairs (RAFF). The seventh pair was a health sciences core (HSCI) course offered in all three programs. Health sciences core courses are foundational graduate courses shared among programs (e.g., biostatistics, epidemiology, leadership).

Subjects

The sample comprised records of graduate degree and graduate certificate students in three programs of study: clinical research administration (CRA), health care quality (HCQ), and regulatory affairs (RAFF). As is typical of these programs, a majority of students were adults who maintained full- or part-time employment in health care or related fields during matriculation.

Assessments

The courses were compared on measures of student satisfaction and student performance. Student satisfaction was assessed using scores from course evaluations completed through a voluntary, online university system at the end of the class. The satisfaction assessment was based on four items from the university-developed standardized course evaluation form: (1) “overall rating of the course,” (2) “how much they learned in the course,” (3) “intellectual challenge,” and (4) “overall instructor rating.” Each item was rated on a 5-point Likert scale. Student performance in the course was assessed in two ways: (1) the final assignment, which in most cases was a summative, evidence-based paper; and (2) the course final grade. Letter grades were reported as the equivalent mean numerical percentage (e.g., an A- was reported as 91.5%).
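
To make the letter-grade conversion concrete, the sketch below shows one way such a mapping could be implemented. Only the A- value (91.5%) is stated above; the other grades, their percentage values, and the helper name to_percent are hypothetical illustrations, not the programs’ actual scale.

```python
# Hypothetical letter-grade to mean-percentage mapping. Only the A- value
# (91.5%) is stated in the text; all other values are assumed midpoints of
# illustrative grade bands, not the programs' actual scale.
GRADE_TO_PERCENT = {
    "A":  96.0,  # assumed midpoint of a 94-100 band
    "A-": 91.5,  # stated in the text
    "B+": 88.0,  # assumed midpoint of an 87-89 band
    "B":  84.5,  # assumed midpoint of an 83-86 band
    "B-": 81.0,  # assumed midpoint of an 80-82 band
}

def to_percent(letter_grade: str) -> float:
    """Return the equivalent mean numerical percentage for a letter grade."""
    return GRADE_TO_PERCENT[letter_grade]

print(to_percent("A-"))  # 91.5
```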

Data analysis

Student satisfaction and student performance data from each course were de-identified and coded by program administrative staff and provided to the researchers. Administrative staff also provided additional de-identified student data on age, program of study, credit hours completed, and cumulative GPA. All data analyses, including sample summaries and comparative statistics, were performed using SAS (version 9.3, SAS Institute Inc., Cary, NC, USA) and SPSS (version 24, IBM Corp., Armonk, NY, USA).

Descriptive statistics were generated to describe the study population, student satisfaction scores, and course grades. Chi-square analyses were conducted to compare student satisfaction scores between 7-week intensive and 15-week semester long courses. For comparison purposes, ratings were dichotomized into “High,” or very favorable (4 or 5), and “Low,” or less favorable (1–3). Independent t-tests were run to determine differences in final assignment and final course grades between the two course formats. All inferential statistics were conducted first as an overall comparison between 7-week and 15-week formats across all courses and second as a comparison by course.
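
As an illustration of this pipeline, the minimal sketch below reproduces the dichotomization, chi-square, and t-test steps in Python. The published analyses were run in SAS 9.3 and SPSS 24, not Python, and the file name (student_records.csv) and column names (format, overall_rating, final_grade) are hypothetical stand-ins for the de-identified records described above.

```python
# Minimal sketch of the comparisons described above. The study's analyses
# were run in SAS 9.3 and SPSS 24; the file and column names here are
# hypothetical stand-ins for the de-identified student records.
import pandas as pd
from scipy import stats

df = pd.read_csv("student_records.csv")  # hypothetical de-identified export
# Assumed columns:
#   format         -> "7-week" or "15-week"
#   overall_rating -> course evaluation item on a 5-point Likert scale
#   final_grade    -> final course grade as a numeric percentage

# Dichotomize Likert ratings: "High" (4 or 5) versus "Low" (1-3).
df["rating_group"] = df["overall_rating"].map(lambda r: "High" if r >= 4 else "Low")

# Chi-square test of independence between course format and rating group.
contingency = pd.crosstab(df["format"], df["rating_group"])
chi2, p_chi, dof, _expected = stats.chi2_contingency(contingency)
print(f"chi-square = {chi2:.3f}, df = {dof}, p = {p_chi:.3f}")

# Independent t-test on final course grades between the two formats.
grades_ic = df.loc[df["format"] == "7-week", "final_grade"]
grades_sem = df.loc[df["format"] == "15-week", "final_grade"]
t_stat, p_t = stats.ttest_ind(grades_ic, grades_sem)
print(f"t = {t_stat:.3f}, p = {p_t:.3f}")
```

Note that for a 2 × 2 table such as the dichotomized ratings, scipy applies Yates’ continuity correction by default; whether the original analyses used the corrected statistic is not stated in the text.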

Results

The study assessed 245 health sciences student records, of which 35.1% were from students enrolled in the clinical research administration program (CRA), 42.9% from the health care quality program (HCQ), and 22.0% from the regulatory affairs program (RAFF). The study population’s mean total credit hours completed was 28.3 (range: 3–60), indicating that most students were nearing the end of their programs of study. The mean cumulative GPA of the study population was 3.57 (range: 0–4.00).

Approximately 44% of the students in this study were between 31 and 40 years of age. Figure 1 shows the distribution of age categories for the total study population by program. Most students enrolled in the CRA program were between 26 and 40 years of age (62.6%). Within the HCQ program, 27.8% of the students were between the ages of 31 and 35, with a second peak of 16.7% between the ages of 51 and 55. Almost 40% of RAFF students were between 31 and 35 years of age.

Fig. 1 Bar graph of the percent (%) of students by age category (years) for each program

Student satisfaction

Ninety-nine students in ICs and seventy-seven students in the 15-week format completed the end-of-course evaluations. It is important to note that the course evaluations are voluntary, so the total number of student responses varied by course, survey question, and student interest. The course evaluation response rate ranged between 20 and 100% of eligible respondents across courses (Appendix). In the overall comparison of student satisfaction, results from the chi-square analysis indicated no significant differences in ratings between 7-week and 15-week formats for all four items in the course evaluations (Table 2). Comparisons by course yielded similar findings except for Health Care Quality Course #2 (Appendix), in which students in the intensive 7-week format reported significantly lower (i.e., 1–3) course evaluation ratings for both intellectual challenge (p = 0.033) and instructor rating (p = 0.026).

Table 2 Chi-Square results for student course evaluation ratings

Student performance

A total of 245 student grades (final assignment grade and final course grade) were analyzed from the three programs in both the 15-week and 7-week versions of the courses, including 136 records from students enrolled in the ICs and 109 from the semester long courses. Results of the overall comparisons of student performance measures indicated no significant differences between intensive and semester long course formats (Table 3). In addition, no significant differences were found between the IC and semester formats in final grade or final assignment for any individual course (Tables 4 and 5). Differences in mean final assignment and course grades were small.

Table 3 Independent t-Test results for student performance
Table 4 Final assignment grade comparison 7-week versus 15-week (by course)
Table 5 Final course grade comparison 7-week versus 15-week (by course)

Discussion

While previous studies have compared the effectiveness of semester long courses and ICs delivered in face-to-face contexts with inconclusive results, this study is unique in that it considered the effectiveness of 15-week and 7-week formats for online courses. While the results of prior research on student satisfaction with face-to-face ICs and face-to-face semester long courses were mixed [1,2,3,4], this study indicates no significant difference in student satisfaction between ICs and semester long courses delivered online. This research confirms the Kucsera and Zimmaro [18] finding of no significant difference in instructor ratings and the Whillier and Lystad [26] finding of no significant difference in overall student satisfaction. As higher education and health professions’ education seek learning models that meet the needs of a wider range of students, including non-traditional, working adult learners, these findings support the adoption of IC models for online courses as a viable choice.

The results of the study confirm previous findings in which no significant difference in student success was found between ICs and semester long courses [6] and disconfirm Whillier and Lystad’s [26] finding that semester long course formats yield higher student success. As noted in our companion paper [31], our team adopted a very structured process for curriculum re-design when transitioning from a semester long to an IC delivery model. This process, and our corresponding focus on the alignment between course objectives and course assignments, may help explain these findings. The controlled sequencing of courses in the IC format, which allowed for scaffolding of knowledge across courses, may also contribute. For other programs seeking to re-design curricula to optimize space and time in learning delivery, these results suggest that online ICs can be a comparable choice to 15-week extended models of delivery, particularly if the re-design emphasizes re-aligning content to course objectives, rather than merely condensing existing content, and sequencing courses to scaffold knowledge essential to achieving program competencies.

While the findings indicate no significant difference in student performance and satisfaction, it is important to consider the limitations of this study and what they suggest for future research. Regarding sampling, the selection of a convenience sample from three health sciences graduate programs may introduce selection bias. Regarding student satisfaction, we identified four items from the end-of-course evaluations to characterize “satisfaction” (i.e., overall course rating, how much learned, intellectual challenge, and instructor rating). However, definitions for these items are not provided on the evaluations; therefore, it cannot be assumed that all students interpreted the meaning of these items in the same way. In addition, course evaluations, particularly for courses with low response rates, may not present an accurate assessment of the quality of a course, especially as results may be skewed by a few strongly dissatisfied respondents.

Other potential limitations relate to the comparison of final assignment grades and final course grades. Final qualitative paper grades were used in this study rather than didactic tests. These types of assessments (papers) were thought to align more readily with determining achievement of course objectives; however, they are less reliable than didactic tests because they may introduce bias in grading. In addition, a ceiling effect is possible, as final grades and assignment grades clustered in the “A” to “A-” range. To counteract some of the grading bias, rubrics were used for grading final assignments; bias was further mitigated by comparing courses taught by the same instructors. With regard to final course grades, the grades assigned by instructors within each course varied between letter and numerical grades. Although we applied a mean numerical value to represent letter grades, our estimates of the differences in final assignment and course grades may not detect small variations in grades.

Conclusions

Findings from this study suggest that 7-week and 15-week online courses can be equally effective with regard to student satisfaction and performance outcomes. However, additional research is required as health professions education and higher education wrestle with selecting delivery models that align both with the needs of the learner and with the needs of faculty and institutions. In particular, additional research is needed on the faculty experience of teaching 7-week versus 15-week courses, particularly in online contexts. Variables to consider include the number of courses faculty have previously taught in each delivery model and how this number might influence their experience of teaching, as well as faculty workload in facilitating each delivery model. Future comparative educational effectiveness studies of online delivery models must also consider faculty experience in online facilitation and how it might influence course evaluations, facilitation style, and grades.

Finally, additional longitudinal research is required to determine the long-term effects of different delivery models on overall performance, satisfaction (both student and faculty), and retention of knowledge over time. Future research might also consider different methodological approaches, such as mixed methods, by which to assess the comparative quality of courses delivered in different models. With regard to online ICs, longitudinal research across different programs of study would allow greater understanding of the variables that influence facilitation and learning across different course content.

References

  1. Christensen CM, Horn MB, Johnson CW. Disrupting class: how disruptive innovation will change the way the world learns. New York: McGraw-Hill; 2011.

  2. Irvine V, Code J, Richards L. Realigning higher education for the 21st-century learner through multi-access learning. Journal of Online Learning and Teaching. 2013;9(2).

  3. Leer R, Ivanov S. Rethinking the future of learning: the possibilities and limitations of technology in education in the 21st century. International Journal of Organizational Innovation (Online). 2013;5(4):14.

  4. Chapman C, White CB, Engleberg C, Fantone JC, Cinti SK. Developing a fully online course for senior medical students. Med Educ Online. 2011;16. https://doi.org/10.3402/meo.v16i0.5733. Accessed 12 Oct 2018.

  5. Wilbur K. Evaluating the online platform of a blended-learning pharmacist continuing education degree program. Med Educ Online. 2016;21(1):31832. https://doi.org/10.3402/meo.v21.31832.

  6. Cook DA, Garside S, Levinson AJ, Dupras DM, Montori VM. What do we mean by web-based learning? A systematic review of the variability of interventions. Med Educ. 2010;44(8):765–74. https://doi.org/10.1111/j.1365-2923.2010.03723.x.

  7. Allen E, Seaman J. Digital learning compass: distance education enrollment report 2017. Babson Survey Research Group, e-Literate, and WCET; 2017. https://onlinelearningsurvey.com/reports/digtiallearningcompassenrollment2017.pdf. Accessed 2 Oct 2018.

  8. McDonald PL. Adult learners and blended learning: a phenomenographic study of variation in adult learners’ experiences of blended learning in higher education [dissertation]. Washington, DC: The George Washington University; 2012.

  9. Cook DA, Dupras DM. A practical guide to developing effective web-based learning. J Gen Intern Med. 2004;19(6):698–707. https://doi.org/10.1111/J.1525-1497.2004.30029.X. Accessed 2 Oct 2018.

  10. Myers JD, Didwania A, Shah C, Jacobson D, Norwood D, Ehtesham M, et al. E-learning—the new frontier: a report from the Apdim E-learning task force. Am J Med. 2012;125(12):1234–7.

  11. Sinclair PM. The effectiveness of internet-based e-learning on clinician behaviour and patient outcomes: a systematic review. Int J Nurs Stud. 2016;57:70–81. https://doi.org/10.1016/j.ijnurstu.2016.01.011.

  12. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA. 2008;300(10):1181–96. https://doi.org/10.1001/Jama.300.10.1181.

  13. Reis LO, Ikari O, Taha-Neto KA, Gugliotta A, Denardi F. Delivery of a urology online course using Moodle versus didactic lectures methods. Int J Med Inform. 2015;84(2):149–54. https://doi.org/10.1016/J.Ijmedinf.2014.11.001.

  14. Daniel EL. A Review of time-shortened courses across disciplines. Coll Stud J. 2000;34(2):298–308.

  15. Davies WM. Intensive teaching formats: a review. Issues in Educational Research. 2006;16(1):1–20.

  16. Hall MV, Wilson LA, Sanger MJ. Student success in intensive versus traditional introductory college chemistry courses. J Chem Educ. 2012;89(9):1109–13.

  17. Wlodkowski RJ. Accelerated learning in colleges and universities. New Directions for Adult and Continuing Education. 2003;97(97):5–15.

  18. Kucsera JV, Zimmaro DM. Comparing the effectiveness of intensive and traditional courses. College Teaching. 2010;58(2):62–8. https://doi.org/10.1080/87567550903583769. Accessed 11 Oct 2018.

  19. Scott PA, Conrad CF. A critique of intensive courses and an agenda for research. In: Smart JC, editor. Higher education: handbook of theory and research. New York: Agathon Press; 1992. p. 411–59.

  20. Fanjoy A. Summer sessions associations’ joint statistical report. University of Delaware; 2008.

  21. Ovalle MDM, Combita ALF. A comparison between students’ behavior and performance during regular and intensive control systems courses with and without laboratory time. IEEE Global Engineering Education Conference (EDUCON). 2016. p. 498. https://doi.org/10.1109/educon.2016.7474599.

  22. Wlodkowski RJ, Mauldin JE, Gahn SW. Learning in the fast lane: adult learners’ persistence and success in accelerated college programs. New Agenda Series, vol. 4. Lumina Foundation for Education; 2001.

  23. Mishra S, Nargundkar R. An analysis of intensive mode pedagogy in management education in India. Int J Educ Manage. 2015;29(4):408–19. https://doi.org/10.1108/Ijem-04-2014-0050.

  24. Anastasi JS. Full-semester and abbreviated summer courses: an evaluation of student performance. Teach Psychol. 2007;34(1):19–22. https://doi.org/10.1080/00986280709336643.

  25. Wlodkowski RJ, Westover TN. Accelerated courses as a learning format for adults. Canadian Journal for the Study of Adult Education. 1999;13(1):1–20.

  26. Whillier S, Lystad RP. Intensive mode delivery of a neuroanatomy unit: lower final grades but higher student satisfaction. Anat Sci Educ. 2013;6(5):286–93. https://doi.org/10.1002/Ase.1358.

  27. Sonnadara RR, Van Vliet A, Safir O, Alman B, Ferguson P, Kraemer W, et al. Simulation-based surgical education: orthopedic boot camp: examining the effectiveness of an intensive surgical skills course. Surgery. 2011;149:745–9. https://doi.org/10.1016/J.Surg.2010.11.011.

  28. Van Scyoc LJ, Gleason J. Traditional or intensive course lengths? A comparison of outcomes in economics learning. Journal of Economic Education. 1993;24:15–22. https://doi.org/10.1080/00220485.1993.10844775.

  29. Waechter RF. A comparison of achievement and retention by college junior students in an earth science course after learning under massed and spaced conditions [dissertation]. Ann Arbor: Pennsylvania State University; 1966.

  30. Petrowsky MC. The two-week summer macroeconomics course: success or failure? Glendale, AZ: Glendale Community College; 1996.

  31. McDonald PL, Harwood KJ, Butler JT, Schlumpf KS, Eschmann CW, Drago D. Design for success: identifying a process for transitioning to an intensive online course delivery model in health professions education. Med Educ Online. 2018;23:1415617. https://doi.org/10.1080/10872981.2017.1415617.

Acknowledgments

The authors wish to acknowledge Alexandra Rosenberg and Carson Eschmann for their contributions in data collection, analysis and manuscript writing.

Funding

The study was unfunded.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request. Although the data have been de-identified, there may be restrictions on sharing certain data protected by the United States Family Educational Rights and Privacy Act of 1974 (FERPA).

Author information

Contributions

KH, PM, KS, DD, and JB made substantial contributions to conception and design, or acquisition of data, or analysis and interpretation of data. KH, PM, and KS wrote the first draft of the manuscript; DD and JB reviewed the manuscript and made significant contributions before submission. All authors reviewed and approved the final manuscript. KH, PM, KS, DD, and JB agreed to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Corresponding author

Correspondence to Kenneth J. Harwood.

Ethics declarations

Ethics approval and consent to participate

The George Washington University Institutional Review Board reviewed the study and determined that it was exempt from IRB review under regulatory categories 1 and 2 (IRB number 041529). In this retrospective review of student outcomes, identifiers were eliminated prior to inclusion in the study.

Consent for publication

No materials related to individual people are included in the manuscript.

Competing interests

The authors declare that they have no competing interests or conflicts of interest.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

Student Course Evaluation Ratings by Course

Table 6 Chi-square results for evaluation item on “Overall course rating” (by course)
Table 7 Chi-square results for evaluation item on “How much learned” (by course)
Table 8 Chi-square results for evaluation item on “Intellectual challenge” (by course)
Table 9 Chi-square results for evaluation item on “Overall instructor rating” (by course)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Harwood, K.J., McDonald, P.L., Butler, J.T. et al. Comparing student outcomes in traditional vs intensive, online graduate programs in health professional education. BMC Med Educ 18, 240 (2018). https://doi.org/10.1186/s12909-018-1343-7
