Our study comprised three pre-defined phases: 1) curriculum development, 2) development and validation of the competence test, and 3) pilot testing of the training courses.
1) Curriculum development
Based on the original concept of the five-day EBM courses for international participants held at the EBM centre of Oxford University in the late nineties, we designed and piloted EBM courses for non-medical students and health professionals without academic training [24, 25]. Feasibility, acceptability and knowledge acquisition were demonstrated for these courses, and we felt confident that the concept could be adapted for medical laypersons who act as patient or consumer representatives and professional counsellors. The specific needs of our target groups were elicited by reviewing the available experience [21, 22]. In addition, we performed two pilot courses involving 25 participants. The results of the two pilot courses indicated that only minor changes to the training course were necessary. We also tested a 2 × 3-day format with a one-month break between the two parts.
Underlying theories and concepts are presented in the following paragraphs. Additional file 1 displays an overview of the final structure, specific objectives, topics, materials, and methods of the EBM training courses.
The courses were adapted for patient and consumer representatives using the theoretical framework of critical-constructive teaching methods of Klafki, which focuses on students' abilities for self-determination and participation. Klafki's model promotes systematic reflection on the aims and intentions of instruction as a prerequisite for the curriculum development process.
Evidence-based medicine as a problem-solving strategy
We applied learning strategies developed for adult teaching. We assumed that people would be strongly internally motivated if the course addressed their personal experiences as patients or patient counsellors. Therefore, participants were given as much opportunity as possible to work and discuss in small groups in order to bring in their own experiences. Our teaching concept was based on cognitive learning and teaching principles in which the teacher demonstrates methods to solve problems and enables participants to transfer these methods to their own fields. We used and adapted the core elements of EBM, which comprise the following steps: 1) ask an answerable question; 2) identify appropriate sources of relevant information and perform a systematic literature search; 3) critically appraise selected publications on key elements; 4) communicate study results to patients and consumers. Additional teaching sessions specifically targeted 1) the basics of statistics, 2) consumer information and the media, 3) risk communication, 4) clinical testing of new drugs and drug approval, and 5) the role of patient representatives in institutional review boards.
We used various topics of controversy in medicine and health care to demonstrate the relevance and the general principles of EBM. Hormone therapy in (post-)menopausal women [29–31] was used to exemplify the fallacies that result from relying on epidemiological studies and surrogate parameters rather than on evidence from high-quality controlled trials and patient-relevant outcomes. Screening for colorectal cancer was used to teach quality criteria for diagnostic tests, the evaluation of screening programmes, and communication about the benefits and risks of screening interventions. A meta-analysis on the effects of homoeopathy was critically appraised to exemplify the limitations and strengths of systematic reviews and meta-analyses. Two weeks prior to the commencement of the courses, all participants received a handbook of study materials for advance preparation. The handbook comprised about 60 pages with publications, vocabulary lists, glossaries, work sheets and supplementary information.
2) Development and validation of the competence test
A systematic literature search identified several evaluation instruments for EBM programmes [34–38]. Since these instruments targeted courses for physicians and did not evaluate the skills required to communicate study results, we judged them unsuitable for medical laypersons and patient representatives. We therefore developed a new questionnaire to assess knowledge and skills, based on theoretical concepts and teaching materials developed for students and health care professionals. Five areas of evaluation reflecting the core competencies were defined: 1) "question formulation", including the competency to outline the design, target population, intervention, control, and relevant outcome parameters of a clinical study (prevention of myocardial infarction by vitamin E was used as an example); 2) "literature search", including the competency to define relevant search terms and to perform a search in the medical literature database PubMed; 3) "reading and understanding", including the competency to identify the study aim, number of participants, duration and location of the study, study and control interventions, and primary endpoints; 4) "calculation", including the competency to calculate the event rates reported in controlled trials, the absolute and relative risks of a given event, the risk reduction or risk increase caused by the intervention examined, and the number needed to treat or the number needed to harm using the 2 × 2 table; 5) "communication of study results", including the competency to outline general aspects of evidence-based patient information and to express numbers in lay terms as meaningful, understandable, patient-oriented statements. The questionnaire comprised 19 items. Answers were scored as 0, 0.5 or 1; possible scores ranged from 0 to 19.5. Content validity was checked by an external expert in EBM who had not been involved in item construction.
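The effect measures covered in area 4 all follow directly from a 2 × 2 table. As a minimal sketch (the trial numbers below are invented for illustration and are not an item from the questionnaire), the calculations participants were asked to perform look like this:

```python
# Sketch of the effect measures taught in area 4, computed from a
# hypothetical 2 x 2 table (all numbers invented for illustration).
def effect_measures(events_treat, n_treat, events_ctrl, n_ctrl):
    """Return event rates, relative risk, absolute risk reduction,
    relative risk reduction, and number needed to treat/harm."""
    eer = events_treat / n_treat   # experimental event rate
    cer = events_ctrl / n_ctrl     # control event rate
    rr = eer / cer                 # relative risk
    arr = cer - eer                # absolute risk reduction (>0) or increase (<0)
    rrr = arr / cer                # relative risk reduction
    # NNT if arr > 0, NNH if arr < 0
    nnt = 1 / abs(arr) if arr != 0 else float("inf")
    return {"EER": eer, "CER": cer, "RR": rr, "ARR": arr, "RRR": rrr, "NNT": nnt}

# Hypothetical trial: 10/100 events under treatment, 20/100 under control.
m = effect_measures(10, 100, 20, 100)
# ARR = 0.10, RRR = 0.50, NNT = 10
```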
We pilot-tested the questionnaire for wording and usability with four students at the University of Hamburg. Reliability and item properties of the competence test were determined within the two EBM pilot courses involving 25 participants.
To demonstrate the validity of the competence test, we investigated its sensitivity to change in EBM competency in a group of undergraduate students of Health Sciences and Education. All students had been non-medical health professionals before their university studies. The content and methods of the students' EBM course were comparable to the curriculum of the training for patient and consumer representatives. We asked the students to fill in the questionnaire before and after the EBM course. We considered a training effect of five score points as relevant. The sample size was calculated assuming 90% power, a 5% alpha error, and a standard deviation of 5.9 score points; the latter value was taken from the piloting of the competence test. Based on these assumptions, a group of 17 participants was required. Values were compared by paired t-test.
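The reported sample size can be reproduced with a standard closed-form approximation. The exact procedure used in the study is not stated, so this sketch assumes the normal-approximation formula for a paired t-test with Guenther's small-sample correction:

```python
from math import ceil
from statistics import NormalDist

def paired_n(delta, sd, alpha=0.05, power=0.90):
    """Approximate sample size for a paired t-test detecting a mean change
    `delta` when the differences have standard deviation `sd`.
    Uses the normal approximation plus Guenther's correction:
    n ~= ((z_{1-a/2} + z_{1-b}) * sd / delta)^2 + z_{1-a/2}^2 / 2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided alpha quantile
    z_b = NormalDist().inv_cdf(power)          # power quantile
    n = ((z_a + z_b) * sd / delta) ** 2 + z_a ** 2 / 2
    return ceil(n)

# With the study's assumptions (delta = 5, SD = 5.9, 5% alpha, 90% power):
paired_n(5, 5.9)  # -> 17, matching the reported requirement
```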
A total of 22 consecutive students completed the questionnaire before and after their participation in the EBM course. An additional group of 21 students took part in the after-course assessment only. Test results were rated by two independent researchers with high interrater reliability (kappa 0.97). The mean score of the 22 students increased from 4.8 (SD 1.2) before to 13.5 (SD 3.7) after the course (p < 0.0001), indicating the validity of the instrument. The total after-course sample of students (n = 43) reached a mean score of 14.4 (SD 3.3).
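Interrater reliability was expressed as kappa. Cohen's kappa for two raters compares observed agreement with the agreement expected by chance; a minimal sketch (the ratings below are invented for illustration, not the study data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items (nominal categories)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Invented example: two raters scoring ten items as 0, 0.5 or 1.
a = [0, 0.5, 1, 1, 0, 0.5, 1, 0, 0.5, 1]
b = [0, 0.5, 1, 1, 0, 0.5, 1, 0, 1, 1]
k = cohens_kappa(a, b)  # roughly 0.85: one disagreement in ten items
```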
3) Pilot testing of the training courses
The target group of our EBM programme consisted of professional counsellors, members of self-help groups in Germany [40, 41], and professional patient advocates. We invited persons who belonged to one of these groups and who had expressed willingness to develop skills to critically appraise scientific literature and to use their new competencies on behalf of patient interests.
Recruitment strategies comprised announcements through newsletters, mailing lists, flyers, newspaper publications and self-help networks. Participation was free of charge. The courses took place at the University of Hamburg. The programme was accredited by three German federal states as paid five-day educational leave, enabling participants in full-time state employment to join the course. Some participants used their annual leave to join the programme. We offered 10 courses in a one-week format from Monday to Friday and four courses in a 2 × 3-day format from Thursday to Saturday.
Formative and summative elements of evaluation were combined. Formative evaluation was used to improve programme performance; evaluation sheets on the teaching quality and content of the course modules were distributed daily. Summative evaluation of the programme aimed to determine 1) whether participants were able to understand and acquire the methods of EBM; 2) whether they regarded the adoption of EBM methods as a personal learning goal; 3) whether they could transfer the methods into their own area; and 4) whether the subgroups (laypersons, mainly self-help group members, professional counsellors, and professional patient advocates) differed in educational background, learning goals, and implementation of the gained knowledge and skills.
We also performed a group-based evaluation. The perceived benefits and deficits of the course programme were discussed in groups and surveyed using Metaplan. The content of the Metaplan cards was part of the summative evaluation and was used for the content analysis.
To assess acceptability, we developed a purpose-based assessment instrument. We aimed to find out whether 1) participants were enthusiastic about adopting EBM methods; 2) our programme met the individual learning goals of the participants; and 3) any subgroups differed in their evaluation of the programme. Baseline personal learning goals were assessed by telephone interviews two to three days before each course, assigning the answers to the main categories of learning goals identified during the pilot courses. Nine main categories turned out to be meaningful to participants: (1) "research skills", (2) "critical appraisal skills", (3) "communication skills", (4) "advanced education", (5) "understanding of EBM", (6) "networking", (7) "empowerment", (8) "implementation", and (9) "others". These categories were used to assess acceptability. Participants were asked to evaluate every module of the main course in relation to their personal learning goals using visual analogue scales ranging from 0 to 100 percent. Differences between target groups were tested by unpaired t-test.
To estimate the increase in EBM competencies, we used the validated competence test. Participants were informed about the pseudonymised data analysis and given the option to withdraw from the study at any time. The questionnaire was completed at the end of the course. We chose not to perform a before-after test since the questionnaire took about four hours to complete. Instead, we compared the test results with those of the university students in Health Sciences and Education (see above), who had completed the comparable training. We used an unpaired t-test, expecting no significant difference between these two groups.
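The unpaired comparison can be sketched as Welch's t-test, which does not assume equal variances in the two groups; this is an assumption, since the variant actually used in the study is not reported. The sketch computes the test statistic and the Welch–Satterthwaite degrees of freedom on invented scores, leaving the p-value lookup to a statistics package:

```python
from statistics import mean, variance

def welch_t(x, y):
    """Welch's unpaired t statistic and approximate degrees of freedom
    for two independent samples (no equal-variance assumption)."""
    vx, vy = variance(x), variance(y)  # sample variances (n - 1 denominator)
    nx, ny = len(x), len(y)
    se2 = vx / nx + vy / ny            # squared standard error of the difference
    t = (mean(x) - mean(y)) / se2 ** 0.5
    # Welch-Satterthwaite approximation for the degrees of freedom.
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df

# Invented competence-test scores for two small groups.
t, df = welch_t([14, 15, 13, 16], [13, 14, 15, 14])
```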
Evaluating long-term implementation
We assessed the long-term implementation of EBM skills using semi-structured telephone interviews six months after the course. We asked participants to comment on areas of successful implementation, barriers to implementation, and further needs for implementing the acquired skills. Notes from the interviews were categorised into two types of implementation: 1) use of critical appraisal skills; 2) activation of participants to take part in health care decision making. The first type covers five potential levels of implementation:
Level 0 (no implementation): participant reported no practice of EBM skills;
Level 1 (minor implementation): participant reported a change in attitude and limited attempts to critically evaluate patient information or expert-based opinions;
Level 2 (fair implementation): participant reported use of selected skills such as literature search, critical appraisal of patient information and scientific literature;
Level 3 (implementation of major components): participant reported having developed a question that could be answered by a systematic literature search, and having performed a literature search or critically appraised an original study;
Level 4 (almost complete implementation): participant reported applying almost all elements of the EBM methodology and had produced patient information material or a teaching programme, or had developed teaching modules.
Telephone interviews conducted six months after the intervention with participants of the two pilot courses were used to construct the categories for content analysis. In a first step, two raters independently generated categories; disagreement was resolved by discussion.
Summative analysis of group-based feedback
Group-based feedback from all courses was analysed using qualitative content analysis methods.