Previous studies have reported that new doctors’ preparedness varied with the medical school attended, but did not explain why students of one medical school were better prepared than those of another. Our findings indicate that the factors needed to support preparedness for practice are opportunities for learning on the job and a role that enables engagement in supervised clinical practice. Previous literature did not identify that the amount of learning on the job influenced preparedness.
As before, communication has been identified as a strong area of perceived preparedness [18–20, 25, 26, 30]. However, our data suggest that these new doctors quickly ‘got out of their depth’ when dealing with more complex communications, such as dealing with patients and relatives following bad news, a finding also reported elsewhere [10, 29]. Some clinicians felt that no amount of rehearsal could fully prepare graduates for the real thing, and that in the past, when students spent more time on the wards and so had greater exposure to these situations, including complex communications, graduates were better prepared. There was a limit to the extent to which certain aspects of work could be learned or rehearsed in a classroom setting and then transferred to the clinical setting. These included topics identified under the qualitative theme ‘managing the duties of a doctor’, such as working on call, acute care and prioritising work. It was these areas, together with prescribing, in which F1s were identified as less prepared, and it was recognised that these skills are best learned on the job, indicating that, in terms of preparation for practice, other forms of learning are not a substitute for real ward experiences. These areas of weakness have also been reported elsewhere [28, 29, 43].
The data analyses led to the identification of one core category, ‘on-the-job learning’, that pulled the other categories together and accounted for variation within categories. It highlighted the importance of exposure to real practice, in the real context, with all the complexity and crowdedness of the clinical environment. There has been an assumption that students can learn effectively away from the clinical environment and can be signed off as competent in alternative settings such as a classroom or a simulated environment. Our findings stress that learning outside the clinical environment may not be equivalent to real clinical experience; there are routines, procedures and knowledge that, it is felt, can only be imparted and experienced in the clinical environment. Our findings indicate that better preparedness for practice is achieved through greater exposure to clinical practice that involves supervised practice. Supporting evidence for this is found elsewhere: for example, students from those medical schools that have a full year of practice following final exams self-rated themselves highest on preparedness [3, 45].
The current generation of junior doctors in the UK generally start work with less on-the-job experience than earlier generations. There are several reasons for this. Most obvious is the demise of the student ‘locum’ post, in which undergraduates in the past would work as junior doctors; changes in practice and concerns over patient safety put an end to this. The Foundation Programme is more closely supervised than earlier posts. There are also changes in the in-patient profile: patients are more severely ill and have shorter hospital stays, giving students less opportunity to see them, as patients tend to be receiving “management” (e.g. x-rays or surgery) and so are less available for students to examine. The organisation of the clinical workforce has changed. Teams have moved from the stability of ‘firms’ to transient teams that are constantly changing due to shift work, shorter placements and the dispersal of patients across many wards. The increase in student numbers has also increased the competition for access to learning opportunities with patients. A consequence of these changes is that the graduates in our study reported an absence of a role during clinical placements. They have been outsiders, in the role of passive observers. Dornan argues that apprenticeship has come under severe strain. However, apprenticeship still enables the learning of tacit knowledge, which is best achieved through modelling in practice.
This paper has attempted to relate the findings of the study to the work of Lave and Wenger on communities of practice, with emphasis on apprenticeship learning and working within a community of practice [35, 47]. The apprenticeship model of legitimate peripheral participation highlights problems when final year students continue to observe rather than being supported in supervised practice, thus hindering their preparedness for practice as a new doctor.
Lave and Wenger argue that learning must be understood with respect to practice as a whole. Above, we have suggested why students do not get access to practice as a whole, the main driver for change being a concern with patient safety. This focus on patient safety has led to a move away from learning on the job and into other contexts, such as simulation. In the final year, students need to be better prepared for practice by taking on more legitimate tasks in preparation for the workplace. The novice student is initially ‘peripheral’ to the team, but with increasing competence, particularly in the final year, the student needs to participate in practice and acquire the skills that support the more independent status expected of an F1 doctor. This would involve the final year student having a clear role and becoming part of the team.
Wenger (p74) states that “being included in what matters is a requirement for being engaged in a community’s practice”. Wenger argues that both participation and reification (the processes and objects that have meaning within the community of practice, p59) are required to negotiate meaning, and makes a distinction between the two (p55). This helps us to understand that participation is not enough; there is a need to engage with processes. The junior doctors found processes such as managing paperwork, prescribing and prioritising patients as challenging as participation, which at times involved being the first to respond to an acute scenario and deciding what actions to start as well as whom to call. Wenger (p164) highlights that participation and non-participation also contribute to the formation of professional identity, stating that non-participation also serves to define identity and role. Certain tasks and responsibilities were expected of the junior doctors, but at times the expectations became much broader when no senior doctors were able to respond quickly.
Wenger describes three dimensions of practice as the property of a community: mutual engagement, joint enterprise and a shared repertoire. Mutual engagement requires “not only our competence but also the competence of others” (p76), which is particularly meaningful in the context of starting work as a junior doctor. Joint enterprise is the negotiation of a shared activity, and a shared repertoire includes “routines, words, tools, ways of doing things…” (p83). It is clear that accessing the joint enterprise and shared repertoire is challenging when less training is provided on the job.
We know from the interviews that students were signed off as competent to perform certain tasks, but these tasks were frequently performed in simulated contexts. The competency literature tends to assume that competencies are generalisable, but Eraut comments that there is little evidence to support this. Eraut highlights the importance of identifying the domain in which an individual is deemed competent by considering context, conditions and situations.
The transition from newcomer to expert is illustrated in the five-stage model of Dreyfus and Dreyfus, in which the novice arrives with little situational judgement and, with increasing exposure and experience, gains tacit knowledge. Competence is reached at stage three, when the learner is able to cope with crowded, busy contexts, again reflecting the importance of learning in practice. Eraut stressed that newly qualified professionals survive by learning to reduce their cognitive load through prioritising and routinising their work; this frees up more time for thinking and for interacting with others. Moreover, Eraut stresses the importance of working alongside others in practice, which provides the opportunity to gain different types of knowledge about the workplace, including tacit knowledge. Lastly, Eraut also suggested that the confidence to proactively seek learning opportunities is an important factor in workplace learning. Our data highlighted the importance of students having a role in the team in supervised practice, to enable them to learn about the duties and responsibilities of a new doctor in advance of starting work. Similarly, Cantillon and MacDermott argue that real practice and responsibility drive learning.
Our study compared three medical schools, selected because they were diverse. We expected to find differences in preparedness but did not. With similar amounts of on-the-job training (28–32 weeks) but different curricular approaches and student intakes, students’ preparedness, or lack of it, was the same: the missing element was participatory learning opportunities in the workplace.
This paper has focused on perceptions of preparedness to start work rather than on an external, objective measure of preparedness. However, perceptions are important because they guide and influence performance, confidence and the ability to do the job. In this study, the perceptions of the trainees were very similar to those of experienced clinicians.
There were slight differences in sampling due to the time constraints for recruitment, with Medical School 3 forced to take a more opportunistic approach. Nonetheless, the Medical School 3 sample did represent a demographic cross-section and a range of MTAS scores.