
The influence of objectives, learning experiences and examination blueprint on medical students' examination preparation



Background

The influence of intended and informal curricula on examination preparation has not been extensively studied. This study aims, first, to describe how students used components of the intended and informal curricula to guide examination preparation and, second, to examine the relationship between examination preparation and performance.


Methods

Students received a pre-examination questionnaire to identify the components of the intended curriculum (objectives and examination blueprint) and the informal curriculum (content emphasised during lectures and small groups) that they used during examination preparation. Multiple logistic regression was used to study the relationship between these variables and student performance (above versus at or below average).


Results

Eighty-one students participated. There was no difference in the proportions using the examination blueprint, content emphasised during lectures, and content emphasised during small groups (87–93%), but fewer students used objectives (35%, p < 0.001). Use of objectives was associated with reduced odds of above-average examination performance (adjusted odds ratio 0.27 [0.07, 0.97], p = 0.04).


Conclusions

When preparing for the renal course examination, students were influenced at least as much by the informal as by the intended curriculum. Of the two intended curriculum components, the examination blueprint appeared to be more widely used than the course objectives. This reduced use of objectives in examination preparation did not appear to have a detrimental effect on student performance.



Background

Medical curricula are generally organized into three major pillars: objectives, learning experiences and evaluation [1]. While it is assumed that all three exert an influence on the process of learning, the relative contribution of each has not been studied [2, 3]. All structured educational activities should, ideally, have a single curriculum that guides student learning – the 'intended curriculum' [4, 5]. Hafferty defines this as "the stated, intended and formally offered and endorsed curriculum" [4]. This curriculum is best reflected by course objectives and the examination blueprint, linked to congruent learning experiences and evaluation as depicted by the 'cybernetic model' (Figure 1). Any incongruence within these pillars of education may lead to the development of an alternative curriculum. For example, teaching that veers from the intended curriculum as identified by the course objectives may create a second curriculum – the 'informal curriculum' [4]. Hafferty defines this curriculum as "an unscripted, predominantly ad hoc, and highly interpersonal form of teaching and learning that takes place among and between faculty and students" [4]. The personal nature of content taught by faculty in lectures and small groups best reflects this definition of the informal curriculum. Hafferty also defines a third curriculum, the 'hidden curriculum' [4, 6], as "a set of influences that function at the level of organizational structure and culture" [4]. Repeated discrepancy between what students perceive that they need to know [for examination purposes] and the stated course objectives can potentially lead to a 'local culture' whereby a hidden curriculum is created. The hidden curriculum highlights the importance of congruency within the cybernetic model.

Figure 1. The Cybernetic Model.

In an attempt to share the intended curriculum with students and minimise the effect of alternative curricula, at the University of Calgary 'core documents' are given to students to provide emphasis on course material and to guide learning. In addition to course objectives, most core documents also include a detailed examination blueprint. This blueprint is used to guide both the selection of course content and the end of course examination [7]. The relative influence of the components of the intended curriculum, represented by course objectives and examination blueprint, on students' preparation for the end of course examination is unknown. Similarly, the influence of the informal curriculum, represented by content taught in lectures and small groups, on examination preparation is also unknown. Knowledge of which components influence examination performance would be valuable in increasing the efficiency and effectiveness of curricular design.

The first objective of this study was to describe the frequency with which students utilized components of the intended and informal curricula to guide preparation for the first year undergraduate renal course examination. This course has a total of 108 hours of instruction, with close to 50% of these hours spent in small group teaching that closely follows the relevant lectures. The second objective was to study the relationship between the use of these components and examination performance. It is our belief that publishing the examination blueprint reduces the effect of the hidden curriculum, such that the process of evaluation becomes an ally rather than an enemy of the intended curriculum. We therefore hypothesized that publishing the components of the intended curriculum in the core document (the course objectives and the examination blueprint) would lead students to use these components in their examination preparation and, consequently, to greater success in the examination.


Methods

On the day prior to the end-of-course certifying examination, first year medical students in the renal course were given a questionnaire that asked about their preparation for the upcoming examination. Students were asked to indicate, using a 5-point Likert-type scale, the extent to which components of the intended curriculum (course objectives and the examination blueprint) and components of the informal curriculum (content emphasised during lectures and small groups) influenced their examination preparation. The responses for these four components were then dichotomised [8]. The positive responses ('agree' and 'strongly agree') were combined and given a value of 1. Negative responses were combined in a similar fashion and given a value of 0. Non-responses and neutral responses ('neither agree nor disagree') were coded as missing values.
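The dichotomisation described above can be sketched as a small helper function (an illustration only; the function name and string labels are ours, not taken from the study's instrument):

```python
def dichotomise(response):
    """Collapse a 5-point Likert-type response as described above:
    positive responses -> 1, negative responses -> 0,
    neutral and missing responses -> None (treated as missing)."""
    positive = {"agree", "strongly agree"}
    negative = {"disagree", "strongly disagree"}
    if response is None:  # non-response
        return None
    r = response.strip().lower()
    if r in positive:
        return 1
    if r in negative:
        return 0
    return None  # 'neither agree nor disagree' is coded as missing
```

Applied to each of the four components, this yields the 0/1 indicators that enter the regression model, with neutral and absent answers excluded.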

To study the relationship of these four variables to student performance, the students were divided into two groups: those with above-average scores and those with scores at or below average on the renal end-of-course examination. The class was split at the mean rather than at pass/fail because few students failed, and using failure as an end-point would have reduced the power of the study. Assuming a sample size of 80 students, comparing students above average with those at or below average provided sufficient power (>0.80) to detect a difference of ≥ 32% in utilisation of any of these four examination preparation strategies between the two groups. Fisher's exact test was used for two-sample comparisons of proportions. The four potential influences on examination preparation were entered into a multiple logistic regression model that also considered two-variable interaction terms. The dependent variable was whether or not the students had above-average examination scores, and the four individual components were considered as explanatory variables. A backward elimination procedure was used in which nested models were compared using the likelihood ratio test. All statistical tests were two-sided and a p value of <0.05 was considered statistically significant. All statistical analyses were performed using Stata 7.0 software [9].
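As a sketch of the two-sample comparison of proportions, a two-sided Fisher's exact test for a 2×2 table can be computed directly from hypergeometric probabilities (the study used Stata; this pure-Python version is illustrative only, and the function name is ours):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].
    The p value is the sum of hypergeometric probabilities of all tables
    with the same margins that are no more probable than the observed one."""
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d

    def p_table(x):
        # Probability of a table with x in the top-left cell, margins fixed
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)
```

For example, the balanced table [[3, 1], [1, 3]] gives p = 34/70 ≈ 0.49, far from the 0.05 significance threshold used here.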


Results

Eighty-one of one hundred students in the class completed the questionnaire. For the end-of-course examination the class mean (± SD) was 83.92 (± 7.80)% and the reliability coefficient (Cronbach's alpha) was 0.74. The proportion of students using the individual components for examination preparation is shown in Table 2 and ranged from approximately 35% for objectives to 93% for content emphasised during small group sessions. There was no difference in the proportions of students using the examination blueprint, content emphasised during lectures, and content emphasised during small groups. The proportion of students using course objectives was, however, significantly lower than for the other three combined (p < 0.001).

Table 2 Prevalence of variables influencing examination preparation

Comparing students above and those at or below the mean examination score, there was no difference in the proportions of students using the examination blueprint (85 versus 87.9%, p = 1.0), content emphasised during lectures (90.3 versus 94.3%, p = 0.7) and content emphasised during small groups (94.6 versus 90.3%, p = 0.6) to guide examination preparation. There was a trend towards reduced objectives use by above average students (22.2 versus 42.9%, p = 0.08). By multiple logistic regression there were no significant interactions between variables. The only variable associated with examination performance was the use of objectives, which was associated with reduced odds of having an above average score for this examination. These data are shown in Table 3.
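For readers following the odds-ratio arithmetic behind Table 3, an unadjusted odds ratio with a Wald 95% confidence interval can be computed from a 2×2 table as follows (a generic sketch with purely illustrative counts; the reported 0.27 [0.07, 0.97] is an adjusted estimate from the logistic regression model and is not reproduced here):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for the 2x2 table [[a, b], [c, d]]:
    OR = (a*d)/(b*c), with Wald CI
    exp(ln OR +/- z * sqrt(1/a + 1/b + 1/c + 1/d))."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Purely illustrative counts (not the study's data):
or_, lo, hi = odds_ratio_ci(10, 20, 5, 40)
```

An odds ratio whose confidence interval excludes 1 (as the adjusted 0.27 [0.07, 0.97] does) is conventionally read as statistically significant at the 0.05 level.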

Table 3 Relationship between variables influencing examination preparation and the odds of obtaining an above average examination score


Discussion

The first objective of this study was to describe the relative influence of components of the intended and informal curricula on examination preparation in a group of first year medical students. As shown in Table 2, course objectives were used by a significantly smaller proportion of students to guide examination preparation than the examination blueprint, content emphasised during lectures, and content emphasised during small groups. The second objective was to study the relationship between the use of components of the intended and informal curricula and examination performance. As shown in Table 3, the use of course objectives to guide examination preparation was associated with reduced odds of above-average performance (or, equivalently, increased odds of at-or-below-average performance), while use of the other three components was not significantly associated with students' performance.

Given the emphasis placed on the creation of objectives at all levels of education, the finding that course objectives were downplayed by students when preparing for examinations was unexpected. Several possible explanations were considered, the first being that the objectives were incongruent with the evaluation process. This was considered unlikely as the objectives and the final examination were matched using an examination blueprint. Nevertheless, to address this possibility, an analysis was performed on the end-of-course questionnaire, on which students provide feedback about both the course and the certifying examination. Of the 61 (out of 100) students who responded to the statement "the final evaluation reflected the course objectives", none disagreed, suggesting that objectives were not disregarded because of incongruence with the examination. Note, however, that while 61 students responded to this statement, only 28 actually used the objectives.

The second explanation considered for the low utilization of objectives was that the objectives were incongruent with the material actually taught. To address this, all the course preceptors (who were given the list of course objectives prior to the course) were asked whether they used the course objectives as a guide in the preparation of the learning experiences. Of the 19 (out of 32) preceptors who responded, 16 (84%) said that they were aware of the objectives, although only 7 (37%) prepared their learning experiences based on the objectives. The low frequency of preceptors utilizing course objectives in their preparation for lectures and small groups creates the potential for an informal curriculum. Despite course objectives also being downplayed by preceptors, there appeared to be congruence between the learning experiences and the evaluation process as only 4% of students disagreed with the statement that "the exam tested material actually taught" and 2% disagreed with the statement that "the exam tested important aspects of subject matter". The observation of congruence between the material taught and the evaluation process suggests that the informal (or delivered) curriculum did not significantly veer from the path of the intended curriculum, i.e., the material taught matched the course objectives.

The third explanation considered for the low utilization of objectives was that they were poorly written. This was considered an unlikely explanation as the course objectives were essentially the same as those adopted by the Medical Council of Canada and written by the same person who has considerable expertise in this area [10].

Having failed to demonstrate incongruence between objectives and the other pillars of education, we conclude that the most likely explanation for objectives being disregarded is that they had become redundant in the face of other, 'stronger' influences on examination preparation. As 'actions speak louder than words', it is not surprising that the proportion of students using components of the informal curriculum was greater than the proportion using objectives in examination preparation (which is not a problem if the two are congruent). It is interesting to note, however, that the proportion of students using the examination blueprint was no different from the proportion using components of the informal curriculum. This observation suggests that publishing the examination blueprint may be a better way of guiding learning by the intended curriculum than publishing the course objectives.

The second finding of the paper was an apparent negative impact of using objectives to guide examination preparation. The explanation for this is not immediately clear. As discussed above, the observed congruence between objectives and the evaluation process suggests that the objectives themselves were not misleading. As such, the more likely explanation for the association between the use of objectives and at-or-below-average performance is reverse causality. A possible interpretation of these findings is that most students did not use objectives in examination preparation because they did not need to use them. Weaker students, on the other hand, may still have been unsure as to what they should have learned by the end of the course. Using objectives at this late stage might, therefore, be an indicator, rather than a cause, of poor examination preparation.

There are some important limitations to this study that should be considered. The study sample was a single class during a single undergraduate course and, as such, the findings may not be generalizable. Another important consideration is the fact that not all courses have an examination blueprint and it is possible, if not probable, that objectives have a greater influence on examination preparation when they represent the only published component of the intended curriculum. Similarly, the congruence between the material taught and the evaluation process may have afforded students the luxury of disregarding the objectives, which may not be the case if the informal curriculum veers significantly from the intended curriculum. Lastly, this study was limited in that exit interviews or focus groups were not conducted to explain why students did not turn to the objectives as much for guidance. These sorts of investigations would be of clear benefit in future similar studies.


Conclusions

The results of this study suggest that, in preparing for the renal course examination, students were influenced at least as much by features of the informal curriculum (content emphasised in small groups and lectures) as by features of the intended curriculum (course objectives and the examination blueprint). This is an important reminder to medical educators that, once the intended curriculum is set, careful attention to the informal curriculum is required if the initial objectives are to be met. With regard to the components of the intended curriculum, the examination blueprint was used by the majority of students, whereas a minority employed objectives to guide examination preparation. The strategy of disregarding objectives in examination preparation appeared to be effective, suggesting that well prepared students had no need to review the objectives at this stage. This also highlights, for course design, the importance and potential power of an examination blueprint.

We would guard against concluding from these results that objectives are redundant, as we believe that all worthwhile educational experiences are initially driven by objectives. Instructional objectives serve two important purposes: first, to guide course design and assessment; and second, to communicate expectations to students. It may be that in the final preparatory stage for examinations, when these two purposes have already been served, students turn their attention to other important features of a course, such as the examination blueprint. The notion that the more successful students were able to disregard objectives when preparing for their examination may be a testimony to the congruence between the pillars of education in a well-designed course.


References

1. Suter E, Mandin H, Small P: The sciences in the education of physicians. Basic Science Educator. 1998, 8: 7-9.

2. Newble D, Entwistle N: Learning styles and approaches: implications for medical education. Medical Education. 1986, 20: 162-175.

3. Curry L: Cognitive and learning styles in medical education. Academic Medicine. 1999, 74: 409-413.

4. Hafferty F: Beyond curricular reform: confronting medicine's hidden curriculum. Academic Medicine. 1998, 73: 403-407.

5. Prideaux D: ABC of learning and teaching in medicine: Curriculum design. BMJ. 2003, 326: 268-270. 10.1136/bmj.326.7383.268.

6. Roberts L, McCarty T, Obenshain S: Comprehensive performance examination gives insight into the "hidden curriculum". Academic Medicine. 1999, 74: 597.

7. Bordage G, Brailovsky C, Carretier H, Page G: Content validation of key features on a national examination of clinical decision-making skills. Academic Medicine. 1995, 70: 276-281.

8. Jamieson S: Likert scales: how to (ab)use them. Medical Education. 2004, 38: 1217-1218. 10.1111/j.1365-2929.2004.02012.x.

9. Stata: Statistical Software for Professionals. []

10. Mandin H, Dauphinee W: Conceptual guidelines for developing and maintaining curriculum and examination objectives: the experience of the Medical Council of Canada. Academic Medicine. 2000, 75: 1031-1037.





Corresponding author

Correspondence to S Coderre.


Competing interests

The author(s) declare that they have no competing interests.

Authors' contributions

- KM conceived of the study, participated in its design, coordination and final manuscript draft

- SC participated in manuscript revisions, and drafted the final manuscript

- DM and TL participated in study design and coordination

- HM participated in a detailed revision of the initial manuscript draft

- WW participated in the study/survey design, and manuscript drafting

Table 1 The renal course blueprint


Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



Cite this article

McLaughlin, K., Coderre, S., Woloschuk, W. et al. The influence of objectives, learning experiences and examination blueprint on medical students' examination preparation. BMC Med Educ 5, 39 (2005).
