An important feature of a quality education program is the regular review of its curricula in order to improve them [27]. Before doing so, however, teachers and researchers need to know what impact the education program they designed and delivered has had on students' learning behavior or achievement. This study therefore aimed to estimate the impact of teaching a topic at a medical education institution in Vietnam on students' learning behavior. In particular, we sought to quantify the effect of the designed and delivered management and planning education (MPE) on students' academic planning behavior (APB). We did so using a fairly large sample of students and methods that greatly reduce selection bias. Because we could not randomly assign students to receive or not receive the MPE, we used propensity score matching to contrast the behavior of students who did and did not receive this education but who had been matched on a variety of observed background characteristics. We also used structural equation modeling to identify a model of the multivariate effects.

This study provided a combination of tests - logistic regression, structural equation modeling and propensity score matching - of the effect of the MPE on ideation and on APB. In Vietnam, the MPE is delivered to fifth-year medical students, usually during the fall semester (August to October); data collection was conducted afterwards. With data from a representative sample survey of the students, propensity score matching was able to provide a valid counterfactual condition - the matched control group - and therefore an unbiased estimate of the percentage point increase in academic planning behavior that would not have occurred without the MPE. Specifically, we found that students with higher recall of MPE contents displayed a higher proportion of academic planning behavior than those with low or no recall (71.51% vs. 52.65%, p < .05), a net increase of 18.6 percentage points. The MPE reached most fifth- and sixth-year students because many sessions on this topic were compulsory for enrolled students. An average of 2.52 MPE contents out of 8 were recalled. Bivariate analysis showed that content recall was significantly related to ideation and APB, and that ideation was also strongly related to APB. More importantly, the multivariate analysis revealed that the level of ideation and the level of MPE recall were both related to APB.
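The matching logic described above can be sketched in a few lines. This is a minimal, self-contained illustration of propensity score matching, not the authors' actual pipeline: the covariates, effect sizes, and data below are simulated and hypothetical.

```python
# Hedged sketch of propensity score matching with simulated data.
# Covariates, sample size, and effects are hypothetical, not the study's.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 400

# Hypothetical observed background characteristics (age, sex, prior GPA).
X = np.column_stack([
    rng.normal(22, 1, n),       # age
    rng.integers(0, 2, n),      # sex
    rng.normal(7.0, 0.8, n),    # prior GPA
])
# "Treatment" = recalled MPE content; exposure depends on covariates,
# so a raw comparison of outcomes would be confounded.
treated = (rng.random(n) < 1 / (1 + np.exp(-(X[:, 2] - 7.0)))).astype(int)
# Binary outcome: academic planning behavior.
apb = (rng.random(n) < 0.5 + 0.18 * treated).astype(int)

# Step 1: estimate propensity scores from the observed covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: 1:1 nearest-neighbour matching (with replacement) on the score.
t_idx = np.where(treated == 1)[0]
c_idx = np.where(treated == 0)[0]
matched = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]

# Step 3: net difference in APB between treated and matched controls -
# the counterfactual comparison described in the text.
net_diff = apb[t_idx].mean() - apb[matched].mean()
print(f"net increase in APB: {net_diff * 100:.1f} percentage points")
```

One design note: matching with replacement, as here, keeps every treated student in the comparison; real analyses would also check covariate balance after matching.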

The after-only, cross-sectional regression analysis without ideation showed a significant direct effect of the communication or education campaign on the adoption of a behavior [8, 10] after controlling for socioeconomic variables. This is as much as many studies of mass media impact are able to do [6–8, 10, 11]. In previous work, when the composite measure of ideation was added to the after-only regression equation, the direct effect of the communication and/or education was no longer statistically significant because of the strong effect of ideation [6–8, 10]. In our study, however, the after-only regression analysis of ideation showed that content recall had the strongest effect on ideation, and the regression analysis of APB showed that recall had the second strongest effect on APB after ideation. The results therefore confirmed not only the indirect effect of the education but also its direct effect on behavior, emphasized the mediating role of ideation, and provided support for the theoretical model that guided the design of the education intervention and the evaluation of its results.
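The with/without-ideation comparison above can be illustrated with a toy mediation setup. This sketch simulates the hypothesized chain (recall → ideation → APB) and fits two logistic regressions; the variable names, effect sizes, and data are assumptions for illustration, not the study's model or data.

```python
# Hedged sketch of the mediation logic (recall -> ideation -> APB)
# with simulated data; coefficients and paths are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1000

recall = rng.integers(0, 9, n).astype(float)       # 0-8 contents recalled
ideation = 0.8 * recall + rng.normal(0, 1, n)      # mediator driven by recall
# APB depends strongly on ideation and only weakly (directly) on recall.
logit = -2.0 + 0.1 * recall + 0.5 * ideation
apb = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Model without the mediator: recall absorbs the full (total) effect.
b_total = LogisticRegression().fit(recall[:, None], apb).coef_[0, 0]
# Model with the mediator: recall keeps only its direct effect.
Xm = np.column_stack([recall, ideation])
b_direct = LogisticRegression().fit(Xm, apb).coef_[0, 0]

print(f"recall coefficient without ideation: {b_total:.2f}")
print(f"recall coefficient with ideation:    {b_direct:.2f}")  # smaller
```

The drop in the recall coefficient once ideation enters the model is the pattern reported in the cited literature; the study's finding is that recall retained a significant direct effect alongside ideation.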

Structural equation modeling could support a causal inference. The three equations (represented in model 1, Table 2) controlled for potentially confounding (socioeconomic) variables that might affect behavior. After controlling for these variables, MPE content recall had a significant effect on both ideation and APB, and ideation had a significant effect on APB. The potential effect of unobserved variables (not in the equations) and the reciprocal effect of behavior on ideation and recall were ruled out by statistical tests for endogeneity. The only criterion missing for a causal inference was a counterfactual condition, which could otherwise have been provided only by a controlled experimental design. In our study, the counterfactual condition was created by propensity score matching, which produced a matched control group against which the net difference could be compared. However, accepting such a causal inference for MPE and ideation on APB does not necessarily mean that other causes were not also operating: students may have been exposed to other sources of MPE, such as the internet or the library. At least in this study, however, we can argue that the effect on APB resulted from the MPE per se, as designed and delivered by the Department of Health Organization and Management, because we measured recall of the key contents taught only at the university. Comparing the treatment group with the matched control group, an 18.6 percentage point increase in APB after the education was delivered may sound small. However, because the sample of 421 represents a population of 3,145 students, the actual net increase in the number of students practicing APB is estimated to be 670.

Despite our efforts, this study has several limitations. First, because many MPE sessions - lectures and practicum - were compulsory, the MPE efficiently reached nearly all students; in this situation, the risk of missing an effective treatment (Type II error) appears greater than the risk of falsely rejecting the null hypothesis (Type I error) [28]. Further, given the self-reported design, recall bias is inherent: some students, perhaps because of their self-esteem, may have over- or under-estimated their behavioral responses. However, because this study surveyed a fairly large, representative sample with assurances of anonymity and confidentiality, such bias should be partly reduced. Also, since students self-reported their academic planning behavior, further studies should combine self-reports of APB with observation or collection of students' actual plans (study notes, schedules, timetables, etc.). Moreover, as a cross-sectional study, this design precludes establishing the temporal order of causality; a longitudinal study is therefore needed to address this concern.

This study has both theoretical and practical implications. Consistent with the literature, our study confirmed the indirect effect of education or communication (message recall) on behavior - that is, the intervening role of ideation between education and behavior in the theoretical framework. At the same time, we also found a direct effect of the education on behavior. This suggests that designs of education evaluations should measure ideation in addition to education exposure or recall in order to obtain a holistic theoretical model to support research. A practical question remains: should lecturers or institution leaders take action based on this conclusion? Should they review curricula and teaching using structural equation modeling and propensity score matching? The results of this study suggest that these techniques - reliable but still neglected methods of research and evaluation in many educational contexts - should be rolled out to other topics of education evaluation.