
“We just don’t have the resources”: Supervisor perspectives on introducing workplace-based assessments into medical specialist training in South Africa

Abstract

Background

South Africa (SA) is on the brink of implementing workplace-based assessments (WBA) in all medical specialist training programmes in the country. Although competency-based medical education (CBME) has been in place for about two decades, WBA offers new and interesting challenges. The literature indicates that WBA has resource, regulatory, educational and social complexities; implementing it would therefore require a careful approach to a complex challenge. To date, insufficient exploration of WBA practices, experiences, perceptions, and aspirations in healthcare has been undertaken in South Africa or Africa. The aim of this study was to identify factors that could impact WBA implementation from the perspectives of medical specialist educators. The outcomes being reported are themes derived from reported potential barriers and enablers to WBA implementation in the SA context.

Methods

This paper reports on the qualitative data generated from a mixed methods study that employed a parallel convergent design, utilising a self-administered online questionnaire to collect data from participants. Data was analysed thematically and inductively.

Results

The themes that emerged were: Structural readiness for WBA; staff capacity to implement WBA; quality assurance; and the social dynamics of WBA.

Conclusions

Participants demonstrated impressive levels of insight into their respective working environments, producing an extensive list of barriers and enablers. Despite significant structural and social barriers, this cohort perceives the impending implementation of WBA to be a positive development in registrar training in South Africa. We make recommendations for future research and for medical specialist education leaders in SA.


Background

There have been some significant international shifts in the education of medical specialists over the past few years [1, 2]. These include the adoption of competency-based medical education (CBME), increasing utilisation of workplace-based assessment (WBA), and the incorporation of WBA into systems of programmatic assessment in the context of CBME. For registrars (medical specialists-in-training), most of the educational contact between registrar and consultant (supervising medical specialist) happens ‘at the bedside’, and very little in the classroom. Continuous assessment of workplace performance following an iterative, standardised in-service process has the potential to bring assessment of clinical competence from an artificial context into the real world of clinical medicine, without compromising patient safety [3]. In South Africa (SA), the intended incorporation of WBA into medical specialist training programmes necessitated a review of current knowledge, practice and perceptions of WBA among those who would be implementing it.

WBA involves the assessor observing the trainee’s performance in the real world of clinical practice, providing feedback and ‘thus fostering reflective practice’ [4]. It ‘encompasses a wide range of assessment strategies’ [4] that collect and record information about trainees’ performance in the clinical setting. This information is then used to provide developmental feedback in formative assessments and make judgements in summative assessments. WBA is regarded as a valid and reliable means of assessment in health sciences education [5,6,7]. The reliability of WBA is established through adequate sampling, meaning that multiple encounters need to be observed by the assessor [8, 9]. Since human observation and interpretation are central features of WBA, the ‘assessment literacy’ [10] of the assessor—which includes knowing ‘what to look for, how to interpret, where to draw the line between satisfactory and unsatisfactory performance’ [1]—is critical. Robust and reliable decisions are reached by collating and evaluating sufficient information (data points) over a variety of assessment episodes, using information garnered along the way about trainee strengths and weaknesses to guide learning before a final decision is made [11]. International (mostly global north) experience has shown that when WBA is effectively implemented, assessment of competency is enhanced, and less emphasis is placed on the role of high-stakes exit assessments, with all the variables that accompany this type of examination [11, 12]. WBA facilitates trainee learning by aligning real-world clinical experience, training programme content, expected competencies, and assessment methods, providing feedback during or after observations, and using formative assessments to guide trainee learning towards desired outcomes [5]. As such, it is a form of ‘assessment for learning’ [13, 14].
Feedback and instruction become intertwined as the process of feedback does more than just report on student correctness or error; it becomes the site of further guidance and instruction [15]. Hattie reports that the average effect size for feedback in school-level education, based on 12 meta-analyses, was 0.79 (twice the average) [15]. In a SA medical education context, Burch et al. [16] reported that bedside feedback increased registrars’ confidence to undertake blinded patient encounters without consulting patient records prior to interviewing and examining the patient, with most students in the study recognising the learning value of feedback in terms of information-sharing, motivation, and learning behaviour. Veloski et al.’s [17] systematic review demonstrated an overwhelmingly positive impact of feedback on clinician performance, with the effect significantly influenced by the source and duration of the feedback.

A comprehensive WBA framework takes place in a socially situated space (the health facility), with clearly defined actions (well defined learning outcomes, standardised workplace formative assessments and feedback) and actors with specific roles to play (capacitated, engaged staff and students) [5, 18]. Its implementation is deeply influenced by the context in which it is practised, being grounded in the social realities of the workplace [19, 20]. Given the centrality of feedback to the process of WBA, it becomes apparent that the institutional culture and relationship between supervisors and registrars are key factors that influence the assessment outcomes [21, 22]. Student engagement in the process of WBA is also integral to its success, and attention should be paid to the social nature of learning [18]. Medical education and learning are embedded in, and shaped by, the social context in which they take place [23] and the power relations between consultant and registrar [24]. Becoming a doctor, as Lave and Wenger argue in their social learning theory, entails making the socially situated journey from being a ‘legitimate peripheral participant’ to being a fully-fledged member of a ‘community of practice’ [25, 26] thus acquiring the identity shared by other members of the community [23] and becoming a new kind of person. The socio-cultural dynamic, whether it is contextual or interpersonal, must be understood if WBA is to be an effective educational approach.

In SA, the social space and the interpersonal interactions are vulnerable to dysfunction [27], as demonstrated by the South African student movements in the recent past, and one cannot assume that the relationships in clinical and educational spaces are functional and healthy [28,29,30]. Issues of discrimination and barriers related to racism, sexism, and favouritism have also been found to have a negative impact on specialist training programmes in South Africa [31]. Any attempt at implementing WBA in this context would need to take this reality into consideration [32]. Changing workplace practice and developing assessment literacy place a significant demand on resources. Recent work in a postgraduate training programme in South Africa highlighted the multiple demands made on training and supervision resources [33]. In developed countries, the development and implementation of WBA strategies have been resource intensive [34, 35], and given the realities of lower-middle income countries (LMICs), a local response based on local realities is needed.

The impetus for implementing WBA in SA medical specialist training programmes is growing. The Colleges of Medicine of South Africa (CMSA), as the examining body for medical specialists in South Africa, has called for the integration of WBA as a core practice in training programmes [36]. This call is supported by the SA Committee of Medical Deans (SACOMD), representative of all health science faculties in the country. This collective intent to incorporate WBA into the SA context raises an important research question. Given the resource challenges of implementing a comprehensive WBA framework, the paucity of data on WBA in LMICs, and the social complexity alluded to above, what is the state of readiness of training programmes in SA to implement WBA? To answer this question, a rapid situational analysis was performed, aimed at generating data reflective of local SA realities. The key outcomes reported are a quantification of current knowledge and practices, and qualitative perceptions of programme managers and clinical supervisors of potential barriers or enablers to the successful implementation of WBA. We report on the latter, qualitative, outcome in this paper.

Methods

This was a cross-sectional, mixed methods observational study using a parallel convergent design. This is an appropriate design for this type of study as we collected the two types of data simultaneously, with the intention that they would converge post-analysis to provide a comprehensive overview of the phenomenon being studied and inform WBA design and implementation strategies [37]. In this study, the quantitative component provided data measured against a set of objective questions based on the literature, while the qualitative component allowed participants to express their perceptions based on their subjective interpretations of their own experiences and knowledge.

The setting encompassed all South African universities offering medical specialist training programmes. This included nine health sciences faculties, spanning all medical specialties and sub-specialties. The official language for this training is English. All training sites are accredited by the Health Professions Council of South Africa (HPCSA) and funded by the National Department of Health of South Africa. It should be noted that WBA has not yet been formally adopted at any of these training sites, so participants’ knowledge and exposure to WBA praxis is reflective of the pre-implementation phase that the country is in.

Participants were drawn from all participating institutions (nine health sciences faculties in South Africa), using a purposive and snowball sampling method. The inclusion criteria were: currently a clinical supervisor of registrars OR manager of a registrar-training programme. This means that all participants were medical doctors with specialist registration, as this is a requirement of these positions. We did not stipulate a minimum or maximum time employed in the current position. No exclusion criteria were applied. Participants were recruited by collaborators from their own institution, either directly by telephone or email, or via institutional mailing lists. We estimated that a sample of two hundred and sixteen (N = 216) respondents would represent about 80% of the training programmes in SA, which would constitute an acceptable representation of this population.

Data was collected using a self-administered online questionnaire (Appendix A). This novel questionnaire was developed by the research team, using the literature cited above to identify key knowledge and practice elements of WBA. Two open-ended questions were also included. It was scrutinised individually by a panel of medical educational researchers and practitioners who provided email feedback on content and face validity. After these comments were incorporated, the research team met and reached consensus on the finalised tool. Minor changes were incorporated at this stage: two questions on WBA practice were added, and items were separated into knowledge and practice domains. The questionnaire was then loaded onto the Google online platform as a fillable form, with the introductory section being the informed consent component. This online version was piloted with nine participants who were not part of the study sample. No changes were made to the tool after this pilot study.

Responses from the completed questionnaire were automatically uploaded to a Google Sheet and downloaded as a Microsoft Excel spreadsheet. The qualitative data was manually extracted from this spreadsheet for analysis. Using Braun and Clarke’s (2006) six-step guide to thematic analysis [38] and an inductive approach, the qualitative data was iteratively read with the questions in mind, after which a set of codes was generated by a member of the research team (ED) trained in qualitative research. The codes were categorised according to their content, and where these categories were coherent, themes emerged. The themes were discussed with the lead author (TR), who also has qualitative research experience and who reviewed the analysis process to ensure that the trustworthiness criteria as defined by Lincoln and Guba were met [39].
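As a minimal illustration of the bookkeeping behind this inductive workflow (code labels, respondent IDs and the code-to-theme mapping below are hypothetical examples for illustration, not the study’s actual codebook), the progression from initial codes to candidate themes can be sketched as:

```python
# Sketch of the code-to-theme collation step in inductive thematic
# analysis (after Braun & Clarke's six-step guide). All codes and the
# mapping below are hypothetical illustrations, not the real codebook.
from collections import defaultdict

# Step 2: initial codes assigned to open-ended response excerpts.
coded_excerpts = [
    ("R138", "lack of time"),
    ("R3", "staff shortages"),
    ("R44", "staff shortages"),
    ("R110", "supervisor buy-in"),
]

# Steps 3-4: coherent categories of codes become candidate themes.
code_to_theme = {
    "lack of time": "Structural readiness and support",
    "staff shortages": "Structural readiness and support",
    "supervisor buy-in": "Structural readiness and support",
}

# Collate the respondents supporting each candidate theme for review.
themes = defaultdict(list)
for respondent, code in coded_excerpts:
    themes[code_to_theme[code]].append(respondent)

for theme, respondents in sorted(themes.items()):
    print(f"{theme}: {len(respondents)} coded excerpts")
```

Steps 5 and 6 of the guide (refining, naming and reporting themes) remain interpretive work done in discussion between researchers; the sketch only shows how coded excerpts accumulate under each theme so that the supporting evidence can be reviewed.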

Results

A total of one hundred and sixty-six (n = 166) individuals representing forty-four different specialty or sub-specialty training programmes and nine health sciences faculties in SA completed the online questionnaire. This represents 76% of the intended sample size (N = 216). Eighty-five respondents (51.2% of the sample) self-identified as programme convenors/managers, with the balance self-identifying as supervisors. Figure 1 indicates the relative experience within these roles.

Fig. 1 Distribution of respondents by years of experience in medical specialist training

Participants were asked to respond in text to two open-ended questions: “What are your experiences/perceptions of factors in your clinical/academic environment that are/will be barriers to the success of WBA?” and “What are your experiences/perceptions of factors in your clinical/academic environment that are/will be enablers to the success of WBA?”. These responses yielded four themes. These themes were ‘Structural readiness and support’, ‘Staff capacity’, ‘Quality assurance’ and ‘Social dynamics of WBA’.

Structural readiness and support

The structural issues raised by respondents related to time, staff shortages, equipment deficiencies, perceived lack of stakeholder buy-in, and the technology needed for WBA implementation.

The time needed for workplace training, weighed against clinical demands and perceived staff shortages, emerged as a central concern.

“The biggest problem is time. In an environment with limited lecturer/student ratio pre-and post-graduate, it is impossible. The clinical demand on consultants/supervisors is preventing intense WBA on a daily basis.” (R138).

Respondents related this perceived lack of time to staff shortages leading to exigent clinical workloads and unfavorable ratios between consultants and registrars:

“Correct, adequate and accurate WBA need trained staff to be available and involved. Staff are currently overwhelmed with workload. Adding what I would feel is appropriate and fair WBA in all aspects—operating, patient assessment, patient presentations, theory etc. will need more resources—specifically more staff.” (R3).

Many respondents felt that there simply weren’t enough supervisors available to train and assess registrars as per the perceived demands of WBA in the current environment:

“…aligning availability of supervisors and registrars… not enough supervisors for number of assessments needed.” (R44).

In small disciplines and departments, this shortage of supervisors would be exacerbated by the perceived demands of WBA:

“Small disciplines and departments, where there could be one consultant, and service delivery issues.” (R43).

However, despite the heavy workload and shortages, respondents identified enabling opportunities within these spaces. The busy clinical workplace offers opportunities for exposure to a wide range of patients and clinical encounters that constitute the basis of WBA.

“Clinical service delivery areas that are busy will offer [an] opportunity for registrar exposure [to] different conditions…” (R113).

“Sufficient work-based opportunities exist that can provide assessment opportunities if there is cohesion, clarity and communication.” (R72).

Respondents identified a need for all stakeholders to buy into WBA and for proper institutional support to make it work. Resistance or lack of support from supervisors was identified as a potential stumbling block:

“Not all supervisors buy into the process. Some still have very archaic ideas how to evaluate and support (not support) (sic.) registrars and this is very difficult to change.” (R110).

The need for institutional buy-in and support was also highlighted:

“Buy-in from the Department of the reliability of such assessments.” (R136).

“We will need practical support from our University—which is lacking quite often.” (R55).

They had clear ideas of the structural and institutional enablers required for WBA, though some of them are aspirational. Three key groups were identified whose support and buy-in to the implementation of WBA they perceived as critical: institutions and administrators, consultants, and registrars. They described the need for leadership coming from Deans and Heads of Department (HoDs) and the authoritative structures to support the implementation of WBA:

“…HoD buy-in, lots of energy on change management.” (R12).

“The support of the Deanery in appreciating the value of WBAs.” (R80).

“Provincial (government) buy-in mandating the assessments and providing the opportunities on the clinical platform.” (R86).

One respondent suggested “A single national co-ordinated process led by each CMSA college.” (R9). This is an important stakeholder, as the CMSA (Colleges of Medicine of SA) is the sole examining authority for medical specialists in the country.

Respondents identified a well-regulated environment as being an enabler of the successful implementation of WBA. The need for clear guidelines, regulation, and monitoring was identified.

“Clear guidelines from the CMSA, SACOMD and Universities. Making it a mandatory requirement for progression in training.” (R4).

The importance of registrar and consultant support and buy-in was also noted, with respondents reporting perceived buy-in from registrars, as shown by the following excerpts.

“Registrars thus far have really appreciated the feedback so there is a huge “buy-in” from trainees’ side.” (R62).

“The willingness by the trainers to learn and train others is a positive factor.” (R122).

The final structural issue identified by respondents was a lack of, and need for, adequate technological support to capture, store, and manage assessments.

“Then, another barrier—too often we are asked to do everything ourselves—develop the assessments, do them and submit them—this needs a good electronic system that is outsourced and well managed, so that reminders are sent, data is stored appropriately and technical issues are sorted by the team as opposed to the assessors struggling to submit the assessments.” (R3).

Again, as with other barriers, the respondents offered enabling factors that address the technology concern. Existing technologies in the WBA space were seen as enabling the implementation of WBA.

“The groundwork and experiences using (a commercial software programme) and (another commercial software programme) has already been laid and the supervisors already have experience with this.” (R10).

“An already existing e-portfolio and WBA.” (R49).

Staff capacity

Supervisor capacity was seen as central to WBA implementation. Many respondents stated that training of trainers/supervisors was both lacking and essential for an alternative system of assessment to work.

“The training of supervisors to be able to accomplish this.” (R23).

“Lack of training of supervisors/trainers…” (R32).

Respondents cited the general lack of supervisor training, variability of assessment skills, knowledge, and experience of WBA among supervisors, and the need to monitor the competence of clinical teachers.

“Supervisors who may not be keen to participate in the WBA (due to lack of knowledge/skills) and lack of resources.” (R137).

“Adequate training of and continuous monitoring of WBA competence of clinical teacher.” (R114).

Significantly, only one respondent identified feedback as an element that would require training if it were not to become a barrier to successful WBA: “…supervisor training especially on feedback.” (R33).

While lack of training was mentioned as a barrier, respondents reported positive attitudes towards implementing WBA, as highlighted by the following quotations demonstrating their commitment to ensuring that their registrars receive good training.

“The ability and willingness for pathologists in the unit to perform the WBAs on a monthly basis currently. The value of continuous WBAs are appreciated by both pathologists and registrars in preparing them for exit exams and professional practice. The support of the Deanery in appreciating the value of WBAs.” (R80).

“I think it will be welcomed by trainees as it will provide them with constant feedback on “how they are doing”. In principle both myself and co-supervisor are in agreement that WBA could be a very useful tool—so attitude is positive.” (R104).

Quality assurance: subjectivity and standardisation

Respondents noted the role that subjectivity might play in the assessment process, making it potentially unfair to registrars:

“Very small numbers of both registrars and supervisors […] That makes it very difficult to remain objective as we work very closely with fellows and develop a personal relationship.” (R104).

Others noted time constraints as a barrier to objectivity in assessments:

“Assessments are also subjective (hence you need more which requires more time). An assessment that is expressed as a score gives false re-assurance compared to feedback only.” (R22).

Some worried that the narrow gap in seniority between registrars and junior consultants could impact the reliability of WBA.

“Junior consultants assessing registrars (whom they are barely senior to) too leniently, leading to a drop in standards.” (R24).

Respondents also expressed the perception that there was “inconsistency and subjectivity of assessments between disciplines” (R117) and that this would hinder the successful implementation of WBA. Some suggested that WBA would “need to have multiple assessors to achieve reliability and validity” (R10). This was echoed by another respondent who pointed out that:

“WBA is also a reflection of the teacher, so the teacher should not be doing the examination. It needs to be non-biased, or at least include an examiner that was not involved in teaching that section, preferably an external examiner” (R162).

Respondents also commented on the need to standardise the WBA process and its component parts. Three areas in need of standardisation/agreement were identified: entrustable professional activities (EPAs), Standard Operating Procedures (SOPs), and benchmarking.

Respondents identified a general lack of, and need for, standardisation of and agreement on WBA and its components, which may be a complex task given the variations across service platforms.

“The WBA would need to be simple and standardized—for this to occur, they need to be developed appropriately by a task team.” (R3).

“We need to agree on standards and expectations of the registrars in each division, this may be difficult.” (R5).

To achieve such standardisation across training centres, respondents identified the need to standardise the curriculum and EPAs:

“Agreement around curriculum/blueprint.” (R116).

“Agreeing on EPAs and the frequency and style of assessment.” (R9).

They also identified a lack of benchmarking across different contexts, and the inconsistent use of clinical guidelines, as barriers:

“Benchmarking not standardized—supervisors having differing expectations….” (R39).

“Needs all the Universities to agree as well on certain practices/SOP’s/guidelines.” (R54).

Respondents further expressed concern about the lack of standardisation in clinical exposure between certain disciplines, and between different clinical platforms, possibly even within the same discipline.

“Large registrar numbers in a department tend to rotate quickly through the disciplines and move on before they are fully competent in that discipline. The WBA can be unfair to some registrars for that reason.” (R29).

“…some facilities have inferior equipment and infrastructure even just simple stuff like internet access is a problem at [a named] tertiary hospital.” (R105).

“…not all training centres have equivalent facilities for training in specific areas of the subspecialty…” (R83).

Social dynamics of WBA

Some respondents commented on interpersonal relationships between consultants and registrars as a barrier to successfully implementing WBA. Furthermore, some consultants were aware of the role that bias and favouritism could play in the success or failure of WBA.

“Some clinicians might not see potential in a registrar or may not like him/her personally and might act with bias.” (R19).

“Biases towards a particular registrar.” (R60).

“Lack of objectivity by supervisors. Favouritisms (sic).” (R118).

“Unrecognised bias (lack of self-awareness).” (R119).

“Personality clashes between supervisor and registrar may result in bias.” (R148).

The quality of the relationship between consultants and registrars, of central importance to WBA, was also noted as a potential barrier.

“Supervisors/trainees poor relationship.” (R52).

“Not all supervisors buy into the process. Some still have very archaic ideas how to evaluate and support (not support)(sic) registrars and this is very difficult to change.” (R110).

Alternatively, the closeness of the supervisor-registrar working relationship was seen as a potential enabling factor, based on respondents’ own past experiences.

“We already spend a lot of one-on-one time with our registrars and WBAs can very easily be incorporated in these sessions in a structured way.” (R17).

“Already close one-on-one engagement between supervisors and registrars, which will facilitate assessment.” (R34).

Discussion

In this paper we describe the qualitative findings from a mixed methods study in which we explored the perceptions of supervisors of postgraduate specialist trainees regarding barriers and enabling factors likely to impact upon the implementation of WBA in South Africa. Principally, the themes speak to the importance of context for the implementation of WBA. This context refers to the formal systems within which learning takes place, and the informal, cultural or interpersonal aspects of the training spaces.

Our findings indicate that there are substantive concerns about the lack of resources to implement WBA in the SA context. This finding is closely linked to the quality assurance theme, both representing aspects of WBA that require interventions at policy, governance and leadership levels. As in previous studies [31], systemic barriers and enablers related to the resource-constrained environment in which supervisors/trainers must operate. This included underfunding, understaffing, exigent clinical environments and caseloads—all of which led to a perceived lack of time to do the assessments that WBA requires—and inadequate infrastructure, including a lack of equipment, diagnostic platforms, and limited access to reliable internet connectivity in the workplace. Appreciating these concerns against the backdrop of the cost of implementing WBA in well-resourced contexts [34, 35] provides further opportunity for reflection, especially around the long-term sustainability of WBA in a LMIC. However, there is evidence from other resource-constrained environments such as Pakistan [40] that WBA can be adapted to resource-scarce contexts if the design and implementation process is sensitive to these contextual realities. This discussion emphasises the need for effective alignment and collaboration between decision-making structures, those with access to resources, and those tasked with implementing WBA in clinical spaces.

An interesting finding within the ‘staff capacity’ theme was the self-identified need for ongoing training and the seemingly high levels of motivation to adopt WBA among the respondents. This high level of motivation for change may represent a frustration with current practices, though this was not explored in this study. This level of motivation was replicated empirically in a SA study which documents a pilot WBA implementation in a general surgery training programme [31]. Respondents also perceived buy-in for WBA to be high both among supervisors and registrars, and despite the many challenges faced, all respondents stated a commitment to ongoing WBA practices. When viewed from the perspective of change management, a motivated and pro-active cohort, with a clear understanding of the task at hand, is critical to effective and sustainable implementation of new practices [41]. That leading structures such as the SACOMD and CMSA are fully in support of these efforts lends impetus to the high levels of motivation expressed amongst respondents. The convergence of intentional leadership and engaged, capacitated staff would bode well for the sustained implementation of WBA.

The supervisor-registrar relationship, mentioned by our respondents as the basis for the learning encounters in the workplace, must receive due consideration. Where these relationships are found to be dysfunctional, learning is materially impacted [27]. In line with existing literature [31] on the social factors that affect registrars in their training in South Africa, there was a recognition by respondents that the supervisor-registrar relationship is not always healthy or functional and may be characterised by bias, victimization, and favouritism. Proactively pursuing healthy relationships that affirm student competency as an educational imperative has been shown to enhance learning outcomes in postgraduate education [42]. With feedback being so central to clinical learning, as evidenced by multiple studies in this area [16, 17, 19, 20], and the supervisor-registrar relationship being the micro-platform for the delivery of effective feedback, the importance of functional relationships in WBA praxis is further enhanced. This praxis should therefore not only focus on modes of feedback, but should explicate the relationship as a platform for trustworthy engagement.

Limitations

The key limitation of the study is that we did not canvass the opinions of registrars regarding the barriers to, and enablers of, WBA in South Africa. This requires a separate in-depth study that will provide critical information from the perspective of trainees.

This exploratory study provided a superficial sense of supervisors’ perceptions of WBA implementation in a SA context. As such, deep inferences about the learning environment cannot be made from this dataset. Additionally, while a fairly good response rate was achieved, the perspectives of those supervisors who did not complete the survey are not known—these missing participants may hold perceptions of barriers and enablers that were not uncovered in this study.

A third limitation is that we collected data only via the online questionnaire, requiring respondents to type their responses. This may have limited the depth of their contributions compared with verbal methods such as interviews, which might have elicited richer responses.

Conclusion

We conducted an observational cross-sectional mixed methods study in a resource-constrained context and report the qualitative data here. Supervisors and convenors demonstrated good insight into their respective working environments, producing an extensive list of barriers and enablers. Future research should expand stakeholder engagement to include registrars, health facility managers and policy-makers; examine these stakeholders’ experiences of early WBA implementation; explore novel WBA practices that respond to low-resourced contexts; and investigate the social dimensions of WBA, including patient and community engagement.

We make the following recommendations to aid WBA implementation in South Africa:

  1. An intentional alignment between all decision-making bodies during the design and early implementation phase, as well as a consensus-based monitoring process.

  2. The financial and non-financial costs of technology, staff capacity-building and adapting institutional regulations should be made explicit.

  3. Social and interpersonal factors must be taken into consideration when initiating WBA practices within clinical spaces.

  4. A standardised monitoring and evaluation process should be implemented to document progress.

  5. A structured pathway towards staff capacitation should be developed, funded and implemented.

Data Availability

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

References

  1. Schuwirth LWT, van der Vleuten CPM. A history of assessment in medical education. Adv Health Sci Educ. 2020;25(5):1045–56.

  2. Tabish SA. Assessment methods in medical education. Int J Health Sci. 2008;2(2):3–7.

  3. Hauer KE, Ten Cate O, Boscardin C, Irby DM, Iobst W, O’Sullivan PS. Understanding trust as an essential element of trainee supervision and learning in the workplace. Adv Health Sci Educ Theory Pract. 2014;19(3):435–56.

  4. Yousuf Guraya S. Workplace-based assessment; applications and educational impact. Malays J Med Sci. 2015;22(6):6–10.

  5. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29(9–10):855–71.

  6. Moonen-van Loon JMW, Overeem K, Donkers HHLM, van der Vleuten CPM, Driessen EW. Composite reliability of a workplace-based assessment toolbox for postgraduate medical education. Adv Health Sci Educ. 2013;18(5):1087–102.

  7. van der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ. 1996;1(1):41–67.

  8. Norcini JJ, Blank LL, Arnold GK, Kimball HR. The mini-CEX (clinical evaluation exercise): a preliminary investigation. Ann Intern Med. 1995;123(10):795–9.

  9. Holmboe ES, Huot S, Chung J, Norcini J, Hawkins RE. Construct validity of the mini-clinical evaluation exercise (mini-CEX). Acad Med. 2003;78(8):826–30.

  10. Popham WJ. Assessment literacy for teachers: faddish or fundamental? Theory Pract. 2009;48(1):4–11.

  11. Wilkinson TJ, Tweed MJ. Deconstructing programmatic assessment. Adv Med Educ Pract. 2018;9:191–7.

  12. Yepes-Rios M, Dudek N, Duboyce R, Curtis J, Allard RJ, Varpio L. The failure to fail underperforming trainees in health professions education: a BEME systematic review: BEME Guide No. 42. Med Teach. 2016;38(11):1092–9.

  13. Shepard LA. The role of assessment in a learning culture. Educ Res. 2000;29(7):4–14.

  14. Burch V. The changing landscape of workplace-based assessment. J Appl Test Technol. 2019;20(2):37–59.

  15. Hattie J, Timperley H. The power of feedback. Rev Educ Res. 2007;77(1):81–112.

  16. Burch V, Seggie J, Gary N. Formative assessment promotes learning in undergraduate clinical clerkships. S Afr Med J. 2006;96(5):430–3.

  17. Veloski J, Boex JR, Grasberger MJ, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback and physicians’ clinical performance: BEME Guide No. 7. Med Teach. 2006;28(2):117–28.

  18. Al-Kadri HM, Al-Kadi MT, van der Vleuten CPM. Workplace-based assessment and students’ approaches to learning: a qualitative inquiry. Med Teach. 2013;35(sup1):31–8.

  19. Ramani S, Könings KD, Mann KV, Pisarski EE, van der Vleuten CPM. About politeness, face, and feedback. Acad Med. 2018;93(9):1348–58.

  20. Yang M, Carless D. The feedback triangle and the enhancement of dialogic feedback processes. Teach High Educ. 2013;18(3):285–97.

  21. Bing-You R, Ramani S, Ramesh S, Hayes V, Varaklis K, Ward D, et al. The interplay between residency program culture and feedback culture: a cross-sectional study exploring perceptions of residents at three institutions. Med Educ Online. 2019;24(1):1611296.

  22. Erumeda NJ, Jenkins LS, George AZ. Perceptions of resources available for postgraduate family medicine training at a South African university. Afr J Prim Health Care Fam Med. 2022;14(1).

  23. Cruess RL, Cruess SR, Steinert Y. Medicine as a community of practice. Acad Med. 2018;93(2):185–91.

  24. Castanelli DJ, Weller JM, Molloy E, Bearman M. Trust, power and learning in workplace-based assessment: the trainee perspective. Med Educ. 2022;56(3):280–91.

  25. Wenger E. Communities of practice and social learning systems. Organization. 2000;7(2):225–46.

  26. Lave J, Wenger E. Situated learning: legitimate peripheral participation. Cambridge: Cambridge University Press; 1991.

  27. Bagwandeen CI, Singaram VS. Effects of demographic factors on provision of feedback in postgraduate medical education. S Afr J High Educ. 2017;32(1).

  28. Bezuidenhout A, Cilliers FVN. Burnout, work engagement and sense of coherence in female academics in higher-education institutions in South Africa. SA J Ind Psychol. 2010;36(1).

  29. Khine AA, Hartman N. Strategies in overcoming racial and socio-cultural differences in the learning environment of post-graduate medical specialty training in South Africa. MedEdPublish. 2018;7:62.

  30. Thackwell N, Swartz L, Dlamini S, Phahladira L, Muloiwa R, Chiliza B. Race trouble: experiences of black medical specialist trainees in South Africa. BMC Int Health Hum Rights. 2016;16(1):31.

  31. Singaram V, Baboolal S. The use, effectiveness and impact of workplace-based assessments on teaching, supervision and feedback across surgical specialties. J Surg Educ. 2023;80(8):1158–71. https://doi.org/10.1016/j.jsurg.2023.05.012.

  32. Mash B, Edwards J. Creating a learning environment in your practice or facility. S Afr Fam Pract. 2020;62(1).

  33. Erumeda NJ, George AZ, Jenkins LS. Evaluating postgraduate family medicine supervisor feedback in registrars’ learning portfolios. Afr J Prim Health Care Fam Med. 2022;14(1).

  34. Pereira EA, Dean BJ. British surgeons’ experiences of mandatory online workplace-based assessment. J R Soc Med. 2009;102(7):287–93.

  35. van der Vleuten CPM, Schuwirth LWT, Driessen EW, Govaerts MJB, Heeneman S. Twelve tips for programmatic assessment. Med Teach. 2015;37(7):641–6.

  36. Sathekge MM. Work-based assessment: a critical element of specialist medical training. S Afr Med J. 2017;107(9):728.

  37. Doyle L, Brady A-M, Byrne G. An overview of mixed methods research—revisited. J Res Nurs. 2016;21(8):623–35. https://doi.org/10.1177/1744987116674257.

  38. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101. https://doi.org/10.1191/1478088706qp063oa.

  39. Nowell LS, Norris JM, White DE, Moules NJ. Thematic analysis: striving to meet the trustworthiness criteria. Int J Qual Methods. 2017;16(1):1609406917733847.

  40. Jafri L, Siddiqui I, Khan AH, Tariq M, Effendi MUN, Naseem A, et al. Fostering teaching-learning through workplace based assessment in postgraduate chemical pathology residency program using virtual learning environment. BMC Med Educ. 2020;20(1):383.

  41. Breckenridge JP, Gray N, Toma M, Ashmore S, Glassborow R, Stark C, et al. Motivating change: a grounded theory of how to achieve large-scale, sustained change, co-created with improvement organisations across the UK. BMJ Open Qual. 2019;8(2):e000553.

  42. de Kleijn RAM, Mainhard MT, Meijer PC, Pilot A, Brekelmans M. Master’s thesis supervision: relations between perceptions of the supervisor–student relationship, final grade, perceived supervisor contribution to learning and student satisfaction. Stud High Educ. 2012;37(8):925–39. https://doi.org/10.1080/03075079.2011.556717.

Acknowledgements

This study was made possible by a University Capacity Development Grant from the Department of Higher Education, Republic of South Africa.

Funding

This study was supported by the Department of Higher Education and Training, Republic of SA, via the University Capacity Development Grant.

Author information

Contributions

TR designed the study, co-wrote the literature review, wrote the methods section including designing the data collection tools, participated in and reviewed the data analysis, and provided editorial comment on the final manuscript. ED co-wrote the literature review, analysed the data, wrote the first draft and prepared the final manuscript for author review. LJ, RC and VS reviewed, commented, provided editorial guidance and suggested additional references. VB reviewed, provided editorial and conceptual guidance and contributed to the discussion. All other authors facilitated data collection at their respective facilities.

Corresponding author

Correspondence to Tasleem Ras.

Ethics declarations

Ethics approval and consent to participate

This study complied with the prescripts of the Declaration of Helsinki and the Department of Health (South Africa) 2015 ethics guidelines. All participants were medical professionals who provided voluntary informed consent before inclusion in the study. Participants were protected through strict anonymisation of data, protection of their personal information, maintenance of confidentiality throughout the process, and a guarantee that they could withdraw from participation at any time without risk. Identifying elements were removed from the dataset by the project lead before it was shared with the rest of the project team. Ethics approval was granted by the Faculty of Health Sciences Human Research Ethics Committee at the University of Cape Town (HREC reference number: 459/2022).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Ras, T., Stander Jenkins, L., Lazarus, C. et al. “We just don’t have the resources”: Supervisor perspectives on introducing workplace-based assessments into medical specialist training in South Africa. BMC Med Educ 23, 832 (2023). https://doi.org/10.1186/s12909-023-04840-x
