Open Access

The development and implementation of a curriculum to improve clinicians' self-directed learning skills: a pilot project

  • Dawn MT Bravata1, 2,
  • Stephen J Huot2,
  • Hadley S Abernathy3,
  • Kelley M Skeff6 and
  • Dena MC Bravata4, 5, 6
BMC Medical Education 2003, 3:7

DOI: 10.1186/1472-6920-3-7

Received: 15 April 2003

Accepted: 22 October 2003

Published: 22 October 2003

Abstract

Background

Clinicians need self-directed learning skills to maintain competency. The objective of this study was to develop and implement a curriculum to teach physicians self-directed learning skills during inpatient ward rotations.

Methods

Residents and attendings from an internal medicine residency were assigned to intervention or control groups; intervention physicians completed self-directed learning curricular exercises.

Results

Among the 43 intervention physicians, 21 (49%) completed pre- and post-curriculum tests, and 10 (23%) completed the one-year test. Immediately after exposure to the curriculum, the proportion of physicians defining short- and long-term learning goals increased [short-term: 1/21 (5%) to 11/21 (52%), p = 0.001; long-term: 2/21 (10%) to 15/21 (71%), p = 0.001]. There were no significant changes post-curriculum in the quantity or quality of clinical question asking. The physicians' mean self-efficacy (on a 100-point scale) improved for their ability to develop a plan to keep up with the medical literature (59 vs. 72, p = 0.04). The effects of the curriculum on self-reported learning behaviors were maintained from the immediate post-curriculum test to the one-year post-curriculum test [short-term learning goals: 1/21 (5%) pre-, 11/21 (52%) immediately post-, and 5/10 (50%) one year after the curriculum (p = 0.0075 for the pre- vs. one-year comparison); long-term learning goals: 2/21 (10%) pre-, 15/21 (71%) immediately post-, and 7/10 (70%) at one year (p = 0.0013 for the pre- vs. one-year comparison)]. At one year, half of the participants reported changed learning behaviors.

Conclusions

A four-week curriculum may improve self-directed learning skills.

Keywords

Medical education, physician learning

Background

Clinicians are expected to remain current with the medical literature to maintain their clinical competence. With over 2 million biomedical articles published annually and a medical literature that has grown increasingly complex, physicians struggle to remain informed of the many new therapies and diagnostic tools that relate to their practices. [1-3] Despite investing $3 billion a year, the most widely used continuing medical education (CME) programs such as lectures, short conferences, and written information have rarely been shown to change clinical practices or improve patient outcomes. [4-11] The CME programs that are most likely to promote the adoption of new behaviors are those that use multiple educational strategies including physician-specific feedback, practice-based educational programs, physician participation in the design of the educational interventions, contact with local opinion leaders, and self-directed programs (problem-based learning directed by the physician, in which the patient serves as the impetus for the learning experience). [4, 12-17]

Adult learning theory posits that adults learn best when they are required to address problems (problem-based as opposed to subject-based learning). [18, 19] Furthermore, learning is maximized when it is self-directed so that adults study material that is most relevant to them. [18, 19] Clinicians engage in self-directed learning by first identifying a clinical problem, then pursuing the learning task, next acquiring the new knowledge or skill, and finally practicing the new knowledge or skill. [20] A recent survey of housestaff practices demonstrated that residents engage in self-directed learning activities less than eight hours per week, an amount considered inadequate to prepare housestaff for self-directed learning after training. [21] Although medical educators have described the importance of improving house officers' self-directed learning skills, no consensus exists regarding the best methods for providing such training. [22]

In this paper, we describe a curricular intervention that employs several educational and administrative modalities to teach attending and resident physicians the skills necessary for self-directed learning. The curriculum was designed to incorporate principles of adult-learning theory. The curriculum contains exercises to enhance skills for assessing learning needs, developing flexible short- and long-term learning plans, and asking and answering clinical questions efficiently. We sought to characterize the effects of this teaching program on physicians' (1) self-reported learning behaviors, (2) capacity for asking and answering well-constructed clinical questions, and (3) self-efficacy for performing essential self-directed learning behaviors.

Methods

Design

This study was designed as a prospective cohort trial of the effects of a self-directed learning curriculum on physicians' learning behaviors. The study protocol received Human Investigations Committee approval, and all participants provided written informed consent.

Subjects and Setting

Resident and attending physicians from a university internal medicine residency program assigned to one-month ward rotations at a Connecticut community teaching hospital between September 1997 and April 1998 were enrolled in the study. Physicians assigned to one of the two internal medicine ward teams at the hospital were designated as the intervention team. Interns and residents were assigned randomly to either the intervention or non-intervention team. In general, attending physicians were assigned so that the general internal medicine attendings served on the intervention team. Ward teams consisted of one attending physician and two resident-intern pairs. Each resident-intern pair admitted new patients every fourth day. Attending physicians made morning work rounds with the housestaff five to seven days per week and conducted formal attending rounds three times a week. Participation in the study was voluntary.

Curriculum

The curriculum consisted of five components: (1) performing a learning needs assessment, (2) using appropriate learning resources, (3) developing efficiency in reading medical journals, (4) developing and supporting a learning plan, and (5) asking clinical questions. In addition, each participant was asked to maintain a clinical-question diary. The curriculum physically consisted of a bound volume containing: an introduction, a sample calendar for the ward rotation, a consent form, a pre-test, components 1 through 5, a post-test, an evaluation form, and selected references. Accompanying the bound curriculum was a pocket-sized clinical-question diary. A brief description of each component follows.

Learning-Needs Assessment

The goal of this component of the curriculum was to guide participants in identifying specific learning needs. Based on published information about learning contracts and self-formulated learning plans, [18, 19] we developed exercises to help participants identify areas of medical knowledge or clinical skills in which they needed improvement or practice. [23, 24] Participants were asked to articulate specific short- and long-term learning needs based on questions from their practices, board certification or in-service examinations, comments from their colleagues, and their general interest. The physicians were asked to define specific time frames for their various learning goals.

Learning-Resources Exercise

The goal of this component of the curriculum was to have participants identify optimal resources for answering clinical questions. Participants examined the usefulness of common learning resources and were asked to expand the types of learning resources that they employed in answering clinical questions. They were asked to list all possible sources of medical information, identify the sources that they used most commonly, and rate each source in terms of relevance and convenience (using a 10-point scale, where 10 was the most convenient or most relevant). They then graphed each source of information on a grid, with relevance on the x-axis and convenience on the y-axis to determine whether sources clustered (e.g., low relevance clustered with high convenience).
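The quadrant logic of the grid exercise above can be sketched in a few lines of Python; the resource names and ratings below are hypothetical illustrations, not data from the study.

```python
# Sketch of the learning-resources grid exercise: each resource is rated
# on 10-point scales for relevance and convenience, then assigned to a
# quadrant of the grid. Resource names and scores are invented examples.

def quadrant(relevance, convenience, cutoff=5):
    """Classify a resource by which half of each 10-point scale it falls in."""
    rel = "high relevance" if relevance > cutoff else "low relevance"
    conv = "high convenience" if convenience > cutoff else "low convenience"
    return f"{rel}, {conv}"

# Hypothetical ratings: (relevance, convenience)
ratings = {
    "pocket handbook": (4, 9),
    "original journal article": (9, 3),
    "colleague down the hall": (6, 8),
}

for name, (rel, conv) in ratings.items():
    print(f"{name}: {quadrant(rel, conv)}")
```

Plotting such classifications makes clustering visible at a glance, e.g. whether the sources a physician actually uses sit mostly in the low-relevance, high-convenience quadrant.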

Prior research has demonstrated that physicians use a variety of learning resources;[25, 26] that convenience drives many physicians' choice of learning resources;[9] and that as learners become more skillful in using a wider variety of resources, they find that some resources are more convenient to use than they had previously thought. [10] Participants were encouraged to maximize both convenience and relevance in their choice of learning resources, and to also consider situations where they might emphasize relevance over convenience.

Journal-Reading Exercise

The goal of this component of the curriculum was to help the clinicians optimize their journal-reading time and habits. Surveys have indicated that practitioners prefer journal-reading to all other continuing-education activities. [9] Research has shown that physicians can benefit more from their medical reading if they learn skills and habits that permit them to select the most clinically useful articles. [27] This exercise used a group-learning format where members of the team were asked to teach their colleagues techniques to identify appropriate journals, to scan potentially relevant articles, to set up and maintain a filing system of critical references, and to set reasonable goals for keeping up with the medical literature. Participants were asked to consider the problems associated with using journals as the primary source for clinical information, the circumstances under which sources other than original journal articles (e.g., textbooks or colleagues) might serve their learning needs, the difference between their goals for library-reading and for home-reading, and the criteria by which they should include or exclude particular journals from routine reading. This curricular component was developed based on the hypothesis that by explicitly stating ideal journal-reading habits participants would become more aware of their own behaviors and improve the efficiency of their journal-reading.

Learning-Plan Development and Support

The goal for this component of the curriculum was to have participants complete a learning contract to formulate a realistic and specific personal plan for meeting the learning goals that they identified in the needs assessment. The participants were asked to define both short-term (e.g., during this ward rotation) and longer-term (e.g., during this academic year) learning plans, including: group learning activities (e.g., attending rounds), self-directed learning activities (e.g., using the learning portfolio), teaching activities (e.g., work rounds), and scanning activities (e.g., textbook reading). Participants shared their goals with the other members of the team to facilitate planning of group-learning experiences that maximized the learning of their colleagues. The whole team worked together to help each individual articulate learning goals and devise learning plans. For example, if a resident stated a plan to focus on the cardiovascular physical examination during the coming month, then the attending and other resident physicians agreed to watch for interesting cardiovascular physical examination findings in the patients cared for by the team, to emphasize such findings during rounds, and to encourage that resident to teach the other physicians about such signs. During this curricular component the participating physicians were practicing two important self-directed learning skills: identifying a specific learning task (usually based on a clinical question) and completing the task (usually obtaining the answer to the question). [20]

Asking Clinical Questions

The goal of this component of the curriculum was to have participants practice constructing questions that incorporate features of effective clinical questioning: (1) the specific patient or problem being addressed, (2) the intervention or exposure being considered, (3) a comparison to another intervention or exposure, and (4) the clinical outcome of interest. [2] Participants were asked to use this format (e.g., "Do patients with acute stroke who receive thrombolytic therapy have an increased rate of in-hospital mortality compared to similar patients who do not receive such therapy?") on work rounds, during attending rounds, and when using their question diaries.
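The four-part question template above is mechanical enough to express as a small helper; this is an illustrative sketch (the function name and argument names are invented), filled in with the thrombolytic-therapy example from the text.

```python
# Sketch of the four-part clinical-question format: patient/problem,
# intervention, comparison, and outcome of interest. The helper name
# and its parameters are hypothetical, for illustration only.

def build_question(patient, intervention, comparison, outcome):
    return (f"Do {patient} who {intervention} have {outcome} "
            f"compared to similar patients who {comparison}?")

q = build_question(
    patient="patients with acute stroke",
    intervention="receive thrombolytic therapy",
    comparison="do not receive such therapy",
    outcome="an increased rate of in-hospital mortality",
)
print(q)
```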

Clinical Question Diary

The goal of this component of the curriculum was to have participants generate and record in their individual pocket-size diary at least one clinical question for each patient they admitted. They were also asked to record the learning resources used to answer the question. We modeled the question diary after the paper version of the question portfolio of the Maintenance of Competence Program (MOCOMP) of the Royal College of Physicians and Surgeons of Canada. MOCOMP is a self-directed learning program in which participating physicians maintain a personal portfolio of the questions that they would like to answer, the situations that stimulated the questions, and the educational resources that they used to answer the questions. [27-29] A random survey of physician users of MOCOMP indicated that 49% of questions that participants entered into their log led them to learn something that resulted in a change in their clinical practice. [28] Other authors have demonstrated that learning is more durable when clinical questions and subsequent learning are linked to the patient who stimulated the question. [13] The use of learning portfolios has been shown to improve clinicians' ability to identify and meet their own learning needs. [30]

Data Collection

Participating physicians were given test instruments (described below) on the first day of their ward rotation (pre-test) and on the last day (post-test). One-year follow-up test instruments were given in April 1999 to those intervention physicians who were rotating on the Internal Medicine wards at that time. At the end of each ward rotation, the clinical question diaries were collected from the participants.

Curriculum Implementation

Prior to beginning their ward rotation, the attending physicians on the intervention teams were given a 20-minute orientation that reviewed the objectives of the study, introduced the curriculum, provided specific suggestions for the implementation of the curriculum into the ward rotation, reviewed the importance of completing all the evaluation instruments, and provided an opportunity to ask questions. Self-directed learning curricular exercises occurred during the initial 10 to 15 minutes of the first six attending rounds of each rotation. Participants used their remaining attending rounds to practice these skills by addressing specific patient-derived questions.

Specific educational and organizational methods were used to facilitate implementation and use of the curriculum. The educational methods employed to teach these self-directed learning skills included: individual study (e.g., physicians read individually on topics related to patients on the service), group study (e.g., the team performed the learning-resource exercise together), and the use of attending physicians as role models. In addition to these educational methods, the curriculum provided an organizational structure for the ward-month experience, as well as administrative tools that facilitated review and evaluation of learning experiences (e.g., the learning plan). The learning diary served as a record of all patients admitted during the month, obviating the need for other redundant record keeping methods that had historically been used.

In addition, participating physicians were asked to adopt several important attitudes: to think of themselves as learners; to consider their ongoing professional development as a responsibility to themselves, their patients, and their colleagues; to discard the notion that physicians should have all the answers and to embrace situations where gaps in their medical knowledge generate learning opportunities; and to value helping colleagues achieve their learning goals.

Curriculum Evaluation

The five components of the curriculum were evaluated with 30-item pre-, post-, and one-year follow-up test instruments to assess the participants': (1) self-reported learning behaviors, (2) abilities to generate clinical questions from a clinical scenario and to describe a strategy for answering those questions, and (3) self-efficacy to perform specific self-directed learning behaviors. The test instruments were identical with the exception of the clinical scenarios. They were developed specifically for the current study and were pilot tested for face validity and clarity on five physicians who were not participating in the study. Two of the authors, blinded to the physicians' group assignment and other data, rated the questions that the physicians generated in response to the clinical scenarios according to the four criteria of good clinical questions defined above. The raters were trained during the pilot phase to achieve excellent inter-rater reliability. We assessed self-reported learning behaviors with eight short-answer questions and self-efficacy for self-directed learning behaviors with 12 questions rated on a scale of 0 to 100, where 0 indicates no confidence and 100 indicates extreme confidence. [31] The clinical question diaries were collected from all participants to evaluate whether they had been used.

Statistical Analysis

Microsoft Excel 5.0 (Microsoft Corporation, Redmond, Washington, 1995) was used for database management and statistical analyses. Means and standard deviations were calculated for continuous variables. Differences in participants' responses on pre-tests, post-tests, and one-year follow-up tests were assessed with non-paired two-tailed Student's t-tests, Chi-square tests, or Fisher's exact test; statistical significance was accepted at p < 0.05.
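The proportion comparisons reported in the Results (e.g., 1/21 vs. 11/21 physicians with short-term learning goals) can be checked with a two-sided Fisher's exact test. The sketch below uses only Python's standard library; it is not the authors' Excel analysis, just a way to reproduce the order of magnitude of the reported p-values.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table [[a, b], [c, d]].

    Sums, over the hypergeometric distribution with the observed margins,
    the probabilities of all tables as extreme as or more extreme than
    the observed one.
    """
    n = a + b + c + d
    row1 = a + b          # size of the first group (e.g., pre-test, N = 21)
    col1 = a + c          # total "successes" across both groups
    denom = comb(n, row1)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    probs = {k: comb(col1, k) * comb(n - col1, row1 - k) / denom
             for k in range(lo, hi + 1)}
    p_obs = probs[a]
    # Include every table whose probability is <= that of the observed table.
    return sum(p for p in probs.values() if p <= p_obs * (1 + 1e-9))

# Short-term learning goals: 1/21 pre-curriculum vs. 11/21 post-curriculum
p = fisher_exact_two_sided(1, 20, 11, 10)
print(round(p, 4))  # on the order of the p = 0.001 reported in Table 1
```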

Results

Subjects and Response Rates

The intervention group consisted of 43 physicians: 37 residents and six attendings. All of the eligible residents and attendings in the intervention group participated in the curriculum. Of the 43 participants, 21 (49%) completed pre-tests and 21 (49%) completed post-tests. The physicians who used the curriculum but did not submit pre- or post-tests reported participating fully in the curriculum. Ten of the 43 (23%) were present on the wards one year after the curriculum, and all 10 completed the one-year follow-up test.

Short-Term Effects of the Curriculum: Pre-Test vs. Post-Test Results

To evaluate the short-term effects of the curriculum, participants' pre-curriculum test results were compared to participants' post-curriculum test results.

Self-Reported Learning Behaviors

The proportion of intervention physicians who had defined short-term goals increased after exposure to the curriculum: 1/21 (5%) to 11/21 (52%), p = 0.001 (Table 1). Similarly, the proportion of physicians with long-term learning goals increased after participating in the curriculum: 2/21 (10%) to 15/21 (71%), p = 0.001 (Table 1). No statistically significant differences were observed in any of the other self-directed learning behaviors (Table 1).
Table 1

Effects on Self-directed Learning Behaviors: Pre-Test, Post-Test, and One-Year Post-Test Results

Self-Reported Self-directed Learning Behavior                Pre-curriculum   Post-curriculum   One-Year     P-value*
                                                             (N = 21)         (N = 21)          (N = 10)
Defining short-term learning goals: N (%)                    1 (5)            11 (52)           5 (50)       0.001
Defining long-term learning goals: N (%)                     2 (10)           15 (71)           7 (70)       0.001
Hours spent reading per week: mean ± SD                      2.6 ± 3.1        3.1 ± 2.5         4.6 ± 3.1    0.6
Hours would like to spend reading per week: mean ± SD        7.7 ± 4.0        7.0 ± 4.0         11.2 ± 5.2   0.6
Percent of reading goals achieved per week: mean ± SD        27% ± 22%        38% ± 21%         44% ± 21%    0.1
Number of clinical questions asked per week: mean ± SD       19.1 ± 17.3      17.2 ± 22.9       13.7 ± 8.4   0.8
Percent of clinical questions answered per week: mean ± SD   35% ± 28%        50% ± 33%         48% ± 28%    0.1

*These p-values are for differences between the pre-test and the post-test results.

Clinical Questions

There were no significant changes in either the quantity or quality of clinical questions that participants generated in response to the clinical cases (mean number of questions per subject: 3.6 pre-curriculum versus 3.1 post-curriculum, p = 0.05). There were no changes post-curriculum in the proportions of questions that made a comparison to another intervention, that identified a clinical outcome of interest, or that failed to meet any of the criteria of a well-constructed clinical question.

Self-Efficacy for Self-directed Learning Activities

The physicians' mean self-efficacy improved from pre- to post-curriculum for their abilities to develop a plan to keep up with the medical literature (59 pre- versus 72 post-curriculum, p = 0.04) but was not statistically significant for the other behaviors (data not shown).

Clinical Question Diary

All of the participants in the intervention group used the clinical question diaries. All of the diaries demonstrated that participants asked clinical questions and identified the learning resource used to answer the questions. The most positive feedback about the clinical question diaries came from first-year residents, who found them helpful in remembering the many clinical questions that arose during the day and useful in organizing and focusing their medical reading. One attending commented that the diary was a helpful tool for self-reflection. The most negative comments came from the senior residents, who stated that the diaries increased the documentation burden of their busy ward rotation (despite the fact that the diaries replaced the previously used method of recording all patients admitted during the month, and therefore should not have increased the need for documentation).

Durability of Curriculum Effects: Post-Test vs. One-Year Post-Curriculum Results

The effects of the curriculum on self-reported learning behaviors appeared to be maintained from the immediate post-curriculum test to the one-year post-curriculum test (Table 1). For example, the proportion of physicians with short-term learning goals started at 1/21 (5%) pre-curriculum, rose to 11/21 (52%) immediately post-curriculum, and remained at 5/10 (50%) one year after the curriculum (p = 0.0075 for the pre- vs. one-year comparison). Similarly, the proportion of physicians with long-term learning goals was 2/21 (10%) before the curriculum, increased to 15/21 (71%) post-curriculum, and remained at 7/10 (70%) one year later (p = 0.0013 for the pre- vs. one-year comparison).

Overall Change In Learning Behaviors

One-half of the participants surveyed at 1-year reported that they had changed their learning behaviors as a result of their participation in the curriculum. Comments included, "It increased my motivation to attempt answering clinical questions," "I research clinical questions more frequently," and "I am better at goal setting." The half who said they had not changed their learning practices wrote, for example: "I have continued to use the same resources as prior to participation," "Time on [the] rotation was limited, [I was] unable to fully participate, time constraints prevented full attention to curriculum goals," and "having [had] several mentors of evidence-based medicine prior to going through the curriculum, I already had a handle on how to answer questions, the challenge has been actually doing it."

Discussion

We developed a curriculum comprising multiple educational and structural components to teach physicians self-directed learning skills. We found that the curriculum effectively increased the number of short- and long-term learning goals that the participating physicians articulated and improved participants' self-efficacy for their ability to develop a plan to keep up with the medical literature.

A major limitation of this study was the low response rate. As a result, the final sample size was small, limiting our ability to detect statistically significant differences. We have no information to suggest that physicians who completed their evaluation materials were different from those who did not with regard to their participation in the curriculum.

An additional important limitation of this study is the lack of comparison to a control group. Although this study was originally designed to include a comparison between physicians on the intervention team and non-intervention (control) teams, we were not able to complete this comparison because too few of the control physicians returned their test instruments to make the analysis valid.

Our study demonstrated the difficulties inherent in conducting educational research during residency training. The magnitude of these difficulties was surprising given the academic nature of the residency program, the university appointments of the attending physicians, and the general environment that has historically emphasized the need for scholarship and has encouraged participation in clinical research on the part of the faculty and the housestaff. The attending physicians were encouraged to take ownership of this curriculum and this study, and their ideas and feedback were solicited. The curriculum contained all the necessary materials and extra copies were made readily available. Participants were given numerous personal reminders about the need to complete the evaluation materials. As mentioned above, the learning diaries were accepted as replacements for documentation that was required of the residents. Given the reality of the busy pace of inpatient medicine and the many competing needs of physicians on ward teaching services, a suggestion to improve response rates for future studies is to have study authors administer each of the surveys at assigned times in lieu of regularly scheduled educational activities such as noon conference.

Another limitation of the current study is that we relied upon physician self-reports. While self-efficacy must, by its nature, be assessed through self-report surveys, future studies should employ direct observation of self-directed learning behaviors.

Incorporating self-directed learning techniques, as described in this report, may enhance the effectiveness of existing residency program curricula. As an educational technique, self-directed learning should be explicitly integrated into residency program curricula. Self-directed learning is grounded in adult learning theory, which suggests that adult learners can identify their learning needs, find solutions to problems, base learning on experience, and self-direct their education. Self-directed programs for physicians use problem-based learning, in which the patient becomes the impetus for the learning experience, and require the learner to reflect on his or her own clinical practice. [15, 16] Shin and colleagues reported that a medical school that employed a self-directed, problem-based curriculum produced graduates who adhered more closely to published clinical practice guidelines than did graduates of a traditional medical school. [4, 15] Given the evidence that practice patterns established during residency persist after graduation, and other research demonstrating that learning behaviors taught during medical school affect practice, medical educators should emphasize self-directed learning skills during residency training.

The first steps in the self-directed learning process are the definition of learning needs and the articulation of learning goals. Research has shown that physicians consistently overestimate their knowledge and underestimate gaps in that knowledge. [12] Recognition of these gaps tends to elicit a variety of responses: anxiety, guilt, and, under the best circumstances, a desire to fill them in. [19] Our curriculum helped clinicians to create personalized learning plans and encouraged them to see every patient encounter as a potential learning opportunity. We found that the proportion of physicians with defined learning goals increased after participating in the curriculum.

The most important facets of the curriculum are its strong foundation in adult learning theory, its use of multiple educational methods, its incorporation of non-educational components, and its linkage to direct patient-care activities. Important topics for future research include evaluations of similar curricula over longer periods, in other medical arenas (e.g., ambulatory rotations), in different medical settings (e.g., non-academic environments, where there are fewer competing educational opportunities and where the burden on physicians to maintain their competence is likely to be greater), and in other medical specialties (e.g., psychiatry or surgery).

Conclusions

Our research findings demonstrate that it is feasible to implement a curriculum in self-directed learning during a four-week internal medicine ward rotation, and that this curriculum may improve some components of self-directed learning.

Declarations

Acknowledgement

We thank the faculty and housestaff of the Internal Medicine Primary Care residency program at Yale University and St. Mary's Hospital, Waterbury, Connecticut for their enthusiastic participation. Dr. Dawn Bravata is supported by a Career Development Award from the Department of Veterans Affairs Health Services Research & Development Service.

Authors’ Affiliations

(1)
Clinical Epidemiology Research Center (CERC), VA Connecticut Healthcare System
(2)
Department of Internal Medicine, Yale University School of Medicine
(3)
Convent of the Sacred Heart School
(4)
VA Palo Alto Healthcare System
(5)
Center for Primary Care & Outcomes Research, Stanford University School of Medicine
(6)
Department of Internal Medicine, Stanford University School of Medicine

References

  1. Bernier C, Yerkey A: Cogent Communication: Overcoming Information Overload. 1979, Westport, CT: Greenwood Press
  2. Michaud G, McGowan J, van der Jagt R, Dugan A, Tugwell P: The introduction of evidence-based medicine as a component of daily practice. Bulletin of the Medical Library Association. 1996, 84: 478-481.
  3. Rafuse J: Evidence-based medicine means MDs must develop new skills, attitudes, CMA conference told. Canadian Medical Association Journal. 1994, 150: 1479-1481.
  4. Haynes R, Davis D, McKibbon A, Tugwell P: A critical appraisal of the efficacy of continuing medical education. JAMA. 1984, 251: 61-64. 10.1001/jama.251.1.61.
  5. Manning P: Continuing medical education. The next step. JAMA. 1983, 249: 1042-1045. 10.1001/jama.249.8.1042.
  6. Hotvedt M: Continuing medical education: Actually learning rather than simply listening. JAMA. 1996, 275: 1637-1638. 10.1001/jama.275.21.1637.
  7. Sibley J, Sackett D, Neufeld V, Gerrard B, Rudnick K, Fraser W: A randomized trial of continuing medical education. New England Journal of Medicine. 1982, 306: 511-515.
  8. Scotti M: In reply to "Continuing medical education: Actually learning rather than simply listening" (letter). JAMA. 1996, 275: 1638.
  9. Davis D, Thomson M, Oxman A, Haynes R: Changing physician performance. A systematic review of the effect of continuing medical education strategies. JAMA. 1995, 274: 700-705. 10.1001/jama.274.9.700.
  10. Newble D, Whelan GEJ: Physicians' approaches to continuing education. Australian & New Zealand Journal of Medicine. 1990, 20: 739-746.
  11. Davis D: The science and practice of continuing medical education: A study in dissonance. ACP Journal Club. 1993, 118: A-18.
  12. Greco P, Eisenberg J: Changing physicians' practices. New England Journal of Medicine. 1993, 329: 1271-1274. 10.1056/NEJM199310213291714.
  13. Allery L, Owen P, Robling M: Why general practitioners and consultants change their clinical practice: A critical incident study. BMJ. 1997, 314 (22): 870-874.
  14. Thomson M, Oxman A, Haynes R, Davis D, Freemantle N, Harvey E: Local opinion leaders to improve health professional practice and health care outcomes. Cochrane Database of Systematic Reviews. 1998, 4.
  15. Shin J, Haynes R, Johnston M: Effect of problem-based, self-directed undergraduate education on life-long learning. Canadian Medical Association Journal. 1993, 148: 969-976.
  16. Brigley S, Young Y, Littlejohns P, McEwen J: Continuing education for medical professionals: A reflective model. Postgraduate Medical Journal. 1997, 73: 23-26.
  17. Grimshaw J, Eccles M, Walker A, Thomas R: Changing physicians' behavior: what works and thoughts on getting more things to work. Journal of Continuing Education in the Health Professions. 2002, 22 (4): 237-243.
  18. Knowles M: The Modern Practice of Adult Education: From Pedagogy to Andragogy. 1983, Cambridge: Prentice Hall
  19. Knowles M: The Adult Learner: A Neglected Species. 1990, Gulf Publishing Company, 4th edition
  20. Slotnick H: How doctors learn: physicians' self-directed learning episodes. Academic Medicine. 1999, 74 (10): 1106-1117.
  21. Dinkevich E, Ozuah P: Self-directed learning activities of paediatric residents (letter). Medical Education. 2003, 37 (4): 388-389.
  22. Sparling L: Enhancing the learning in self-directed learning modules. Journal for Nurses in Staff Development. 2001, 17 (4): 199-205.
  23. Regan-Smith M: Teachers' experiential learning about learning. International Journal of Psychiatry in Medicine. 1998, 28: 11-20.
  24. Manning PR, Clintworth WA, Sinopoli LM, Taylor JP, Krochalk PC, Gilman NJ, Denson TA, Stufflebeam DL, Knowles MS: A method of self-directed learning in continuing medical education with implications for recertification. Annals of Internal Medicine. 1987, 107 (6): 909-913.
  25. Thompson M: Characteristics of information resources preferred by primary care physicians. Bulletin of the Medical Library Association. 1997, 85 (2): 187-192.
  26. Verhoeven A, Boerma E, Meyboom-de Jong B: Use of information sources by family physicians: a literature survey. Bulletin of the Medical Library Association. 1995, 83 (1): 85-90.
  27. Manning P, Petit D: The past, present, and future of continuing medical education. Achievements and opportunities, computers and recertification. JAMA. 1987, 258: 3542-3546. 10.1001/jama.258.24.3542.
  28. Parboosingh J, Gondocz S: The Maintenance of Competence Program of the Royal College of Physicians and Surgeons of Canada. JAMA. 1993, 270: 1093. 10.1001/jama.270.9.1093.
  29. Parboosingh J: The Maintenance of Competence (MOCOMP) Program. Canadian Journal of Cardiology. 1993, 9: 695-697.
  30. Wilkinson TJ, Challis M, Hobma SO, Newble DI, Parboosingh JT, Sibbald RG, Wakeford R: The use of portfolios for assessment of the competence and performance of doctors in practice. Medical Education. 2002, 36 (10): 918-924. 10.1046/j.1365-2923.2002.01312.x.
  31. Albright CL, Farquhar JW, Fortmann SP, Sachs DP, Owens DK, Gottlieb L, Stratos GA, Bergen MR, Skeff KM: Impact of a clinical preventive medicine curriculum for primary care faculty: results of a dissemination model. Preventive Medicine. 1992, 21 (4): 419-435.
Pre-publication history

The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1472-6920/3/7/prepub

Copyright

© Bravata et al; licensee BioMed Central Ltd. 2003

This article is published under license to BioMed Central Ltd. This is an Open Access article: verbatim copying and redistribution of this article are permitted in all media for any purpose, provided this notice is preserved along with the article's original URL.
