
Using mixed methods evaluation to assess the feasibility of online clinical training in evidence based interventions: a case study of cognitive behavioural treatment for low back pain



Background

Cognitive behavioural (CB) approaches are effective in the management of non-specific low back pain (LBP). We developed the CB Back Skills Training programme (BeST) and previously provided evidence of clinical and cost effectiveness in a large pragmatic trial. However, practice change is challenged by a lack of treatment guidance and training for clinicians. We aimed to explore the feasibility and acceptability of an online programme (iBeST) for providing training in a CB approach.


Methods

This mixed methods study comprised an individually randomised controlled trial of 35 physiotherapists and an interview study of 8 physiotherapists. Participants were recruited from 8 National Health Service departments in England and allocated by a computer generated randomisation list to receive iBeST (n = 16) or a face-to-face workshop (n = 19). Knowledge (of a CB approach), clinical skills (unblinded assessment of CB skills in practice), self-efficacy (reported confidence in using new skills), attitudes (towards LBP management), and satisfaction were assessed after training. Engagement with iBeST was assessed with user analytics. Interviews explored acceptability and experiences with iBeST. Data sets were analysed independently and jointly interpreted.


Results

Fifteen (94 %) participants in the iBeST group and 16 (84 %) participants in the workshop group provided data immediately after training. We observed similar scores on knowledge (MD (95 % CI): 0.97 (−1.33, 3.26)) and self-efficacy to deliver the majority of the programme (MD (95 % CI): 0.25 (−1.7, 0.7)). However, the workshop group showed a greater reduction in biomedical attitudes to LBP management (MD (95 % CI): −7.43 (−10.97, −3.89)). Clinical skills were assessed in 5 (33 %) iBeST participants and 7 (38 %) workshop participants within 6 months of training and were similar between groups (MD (95 % CI): 0.17 (−0.2, 0.54)). Interviews highlighted that, while initially sceptical, participants found iBeST acceptable. A number of strategies were identified to enhance future versions of iBeST, such as including more skills practice.


Conclusions

Combined quantitative and qualitative data indicated that online training was an acceptable and promising method for providing training in an evidence based complex intervention. With future enhancement, the potential reach of this training method may facilitate evidence-based practice through large scale upskilling of the workforce.

Trial registration

Current Controlled Trials ISRCTN82203145 (registered prospectively on 03.09.2012).



Background

Low back pain (LBP) is one of the largest challenges facing public health systems in the western world [1]. Cognitive behavioural (CB) approaches are recommended for the management of non-specific LBP [2]. The 2009 National Institute for Health and Care Excellence (NICE) guideline for non-specific LBP [3] fell short of making strong recommendations for a CB approach due to a lack of evidence. However, since the publication of the NICE guidance an additional eight randomised controlled trials (RCTs) have reported effect sizes that support use of a CB approach to manage LBP [4]. Thus, it is now widely accepted that LBP should be managed with a programme that utilises a CB approach. Moreover, the UK National Spinal Taskforce has identified the provision of such programmes to be the most serious gap in current LBP service provision that should be urgently addressed [5].

Implementing new evidence-based approaches to care often requires clinicians to learn new skills and change their consultation behaviours [6, 7]. For allied health professionals, this means learning how to use a CB approach in clinical practice such as the optimal dosage, delivery mode, and combination of treatment components. Addressing this, the CB Back Skills Training programme (BeST) follows a set structure and provides clinicians with detailed guidance in a manual about how to deliver a group-based programme to patients [8–11]. BeST is underpinned by the broad CB approach literature and provides an evidence based approach in sufficient detail to allow implementation [12]. In the original pragmatic trial, we provided a 2-day face-to-face training workshop, along with a detailed manual and materials to support the programme sessions. We are now seeking to achieve wide-scale implementation beyond the pragmatic trial in which it was initially evaluated.

This translation and implementation of research products into the clinical setting provides a number of challenges for researchers. Research teams are often small, and providing an implementation strategy scalable to national and international demand, without the financial underpinning provided by research grants, is almost impossible. Hence, we have developed an online training programme (iBeST) to disseminate BeST materials and provide training in a CB approach. This scalable method places less demand on resources, is not geographically constrained, and offers greater flexibility to the learner [13]. However, due to a paucity of research on the use of online methods for delivering training in psychologically informed treatments to allied health professionals, the feasibility and acceptability of this method needed to be explored as part of a staged implementation plan toward providing national and international access to the BeST training and intervention materials.


In line with the Medical Research Council’s guidance for the development of complex interventions, this study aimed to explore the feasibility and acceptability of training physiotherapists with iBeST prior to a larger scale evaluation of effectiveness [14]. Therefore, using a face-to-face workshop as a gold-standard reference, we wanted to explore the potential effect of iBeST on learning outcomes, as well as ascertaining physiotherapists’ acceptance of and satisfaction with iBeST. Secondary objectives included examining physiotherapists’ use of and experiences with iBeST and monitoring uptake of a CB approach (BeST) in clinical practice.



Methods

A mixed methods evaluation, consisting of an exploratory randomised controlled trial and individual semi-structured interviews, was conducted between May 2013 and December 2013. Ethical and governance approval was granted from the University of Warwick’s Biomedical and Scientific Research Ethics Committee (reference number 244-10-2012).


Participants

Participants were volunteers who responded to an email request distributed through research network mailing lists and managerial staff in NHS Trusts. While the BeST programme can be delivered by nursing, allied health, and psychological professions, we concentrated on physiotherapists since they provide the majority of LBP care in the UK NHS [15]. Eligible participants were NHS physiotherapists managing a LBP patient caseload, based in Warwickshire or neighbouring counties, with access to the Internet. We did not exclude participants based on any prior training or current practice behaviours.

Brief description of the BeST programme

The BeST programme is underpinned by a CB approach and consists of an initial individual session of 60 min, and six group sessions of 90 min with 5 to 10 patients per group. It uses patient-specific needs to guide goal setting and treatment planning. It provides education about persistent pain, the importance of regular exercise, the relationship between activity levels and pain, and the role of unhelpful thoughts and behaviours in the maintenance of LBP. It teaches patients a range of skills including problem solving, goal setting, baseline setting, relaxation, thought challenging, planning for flare ups, activity pacing, and activity progression. Additionally, patients collaboratively negotiate a tailored exercise programme to do at home. Each group session follows the same structure, and begins with agenda setting and a review of homework, covers 1–2 session topics, has a 10 min break halfway through, and ends with homework setting.

The randomised controlled trial (Current controlled trials ISRCTN82203145)

Participants giving their informed consent were randomised to receive iBeST or a face-to-face workshop according to a computer generated random number sequence that was stratified by centre. The allocation sequence was concealed in sealed opaque envelopes and was held offsite and administered by an external, independent researcher. Participant blinding was not possible due to the nature of the interventions.
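As a minimal sketch, a computer-generated, centre-stratified allocation list of the kind described above could be produced as follows. The centre names, block size, and seed here are hypothetical illustrations, not the trial's actual parameters.

```python
import random

def make_allocation_list(centres, per_centre, seed=2012):
    """Generate a balanced allocation sequence stratified by centre.

    Illustrative sketch only: within each stratum (centre), equal numbers
    of each arm are listed and then shuffled, so allocation stays balanced
    within centres while remaining unpredictable for any one participant.
    """
    rng = random.Random(seed)
    allocations = {}
    for centre in centres:
        arms = ["iBeST", "workshop"] * (per_centre // 2)
        rng.shuffle(arms)
        allocations[centre] = arms
    return allocations

# Hypothetical example: 8 centres, 6 allocation slots per centre.
alloc = make_allocation_list([f"centre_{i}" for i in range(1, 9)], per_centre=6)
```

In practice such a list would be prepared before recruitment and, as in this trial, held offsite in sealed opaque envelopes administered by an independent researcher.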

Description of interventions

Apart from the mode of delivery, we took care to ensure that both training methods were the same, including: the knowledge content, skills training, and training resources (therapist manual, session narratives and crib sheets, patient workbook, and additional information sources).

Face-to-face workshop

Participants randomised to the workshop attended for two days of face-to-face training, replicated from the original BeST trial [9], at the University of Warwick in May 2013. In brief, the training consisted of PowerPoint presentations, video clips, role-play scenarios, discussion and feedback. Participants were issued with a training pack that contained all slides, the therapist manual, and patient workbook. They also had access to a website where they could download additional paperwork only.


iBeST

Development: iBeST content was produced in Adobe Captivate and hosted within the virtual learning environment Moodle (Perth, Australia). Constructivism, which states that learners actively construct knowledge through gaining understanding, and that new knowledge can only be built upon current understanding [16], was the predominant theory underpinning the organisation of online content. Course features included self-directed reading, reflective practice, skill rehearsal, multiple-choice questions, formative tests with feedback, interactive exercises, a discussion forum, and multimedia. There were 10 core modules to complete.

Procedures: The course was designed to take an equivalent learning time to that of the workshop (10 h). Participants could pace the course to their own preference and did not have to complete it over a set time (i.e., two days). Participants were emailed a username, password, and start-up guide. The course was accessible 24 h/day, and we requested that participants complete it within 6 weeks. Following course completion, participants maintained programme access.

We encouraged participants in both groups to implement the BeST programme after completing the training; however, this was not enforced.

Outcome measures


We collected baseline demographics, including gender, job title, time worked in profession, age range, degree of experience with a CB approach, and training preference before randomisation.

Outcome measure timings

All outcomes, excluding the assessment of clinical skills, were collected from participants immediately after they had completed the training. For those in the workshop comparison, this meant completing questionnaires before leaving the workshop venue. For participants in the iBeST group, this meant completing the questionnaires online within one week of finishing all modules. Since an assessment of clinical skills required the participants to set up the intervention in their clinical practice, we allowed a 6-month time frame from completing the training within which to do so. Thus, the exact time of assessment for this outcome post-training varied between participants.

Learning outcomes

We aimed to assess two aspects of knowledge: (i) theoretical knowledge of a CB approach and (ii) procedural knowledge of how to deliver a CB approach in clinical practice. Since no validated or specific questionnaire was available to assess this, we developed a bespoke multiple-choice questionnaire (scale 0–31, lower score indicates lower knowledge). Questions to assess theoretical knowledge of a CB approach were derived from the background teaching of the training and focused on the CB model and its applications. To assess procedural knowledge, questions concerned how aspects of a CB approach could be delivered in relation to the BeST programme.

For participants who delivered a CB approach in clinical practice, we audio recorded a single treatment session and assessed clinical skills with the 15-item Cognitive Therapy Scale-Revised-Pain (CTS-R-Pain; scale 0–6, lower score indicates lower skill level) [17]. This tool has been specifically modified to measure competency in the use of a CB approach among non-psychology specialists. Hansen et al. [18] found the tool to have high internal consistency (Cronbach’s α = 0.99) and good inter- and intra-rater reliability (intra-class correlation coefficient for intra-rater reliability = 0.92 (0.79, 0.97)). One rater (unblinded) with training in a CB approach (HR) scored all recordings and a senior blinded rater with comprehensive experience in a CB approach (ZH) doubly assessed 25 % of recordings.

In addition to assessing knowledge and clinical skills, we assessed participants’ attitudes, self-efficacy, and satisfaction as recommended in Kirkpatrick’s training evaluation model [19–21]. Attitudes and beliefs towards the management of persistent LBP were assessed pre- and post-training with the 31-item Pain Attitudes and Beliefs Scale for Physiotherapists (PABS-PT). The use of a CB approach aligns with psychosocial attitudes and beliefs towards the management of LBP. The PABS-PT is well validated and has two subscales, biomedical (14 items) and psychosocial (6 items), which are rated with a Likert scale (range 1–5) [22]. A lower score on the biomedical scale indicates lower biomedical orientation and a lower score on the psychosocial scale indicates lower psychosocial orientation. Self-efficacy to deliver (i) the single BeST individual session and (ii) the six group sessions was assessed with two Likert scales (scale 0–10, lower score indicates lower self-efficacy). The six group sessions formed the majority of the BeST programme. We assessed training satisfaction (acceptability) with a custom single-item measure (scale 1–5, lower score indicates lower satisfaction).

To assess therapists’ engagement with the iBeST programme we examined the number of training slides visited, the time spent on each slide, and the number of resources (links/downloads) accessed (score range: 1–3; 3 indicated higher engagement). To interpret the scores we labelled participants who scored in the upper tertile of engagement scores as having higher engagement. Use of the BeST programme in practice was assessed by whether the participant implemented the BeST programme within a 6-month timeframe in their clinical setting.
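The upper-tertile labelling described above can be sketched as follows. The cut-point rule and tie handling are assumptions for illustration; the study reports only that the upper tertile was labelled as higher engagement.

```python
def label_engagement(scores):
    """Label the upper tertile of composite engagement scores as 'higher'.

    'scores' maps participant id -> engagement score (here, the 1-3
    composite of slides visited, time per slide, and resources accessed).
    Cut-point and tie handling are illustrative assumptions.
    """
    ordered = sorted(scores.values())
    cutoff = ordered[(2 * len(ordered)) // 3]  # smallest score in the top third
    return {pid: ("higher" if s >= cutoff else "other")
            for pid, s in scores.items()}

# Hypothetical analytics for six participants.
labels = label_engagement({"p1": 1.2, "p2": 1.8, "p3": 2.1,
                           "p4": 2.4, "p5": 2.7, "p6": 3.0})
```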

Sample size

This study aimed to explore the feasibility and acceptability of iBeST. Sample sizes of 24–50 participants have been recommended for assessing the feasibility of an intervention [23, 24]. Therefore, we considered a sample size of 30 participants to be sufficient to explore the feasibility and the potential effect of iBeST on learning outcomes. Allowing for a 15 % dropout rate, a total of 35 physiotherapists were recruited.
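The recruitment arithmetic is simple inflation of the target sample. The paper does not state the exact rounding rule used; the sketch below assumes the target is inflated by the dropout rate (a stricter alternative divides by 1 minus the rate, giving 36).

```python
import math

target = 30      # analysable sample judged sufficient for feasibility (24-50 recommended)
dropout = 0.15   # anticipated attrition

# Assumed rule: inflate the target by the dropout rate and round up.
recruit = math.ceil(target * (1 + dropout))  # reproduces the recruited total of 35
```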

Statistical analysis

We used the equivalent of intention to treat analysis, including all eligible randomised participants who provided follow up data. For continuous outcomes, between group differences were explored using Student’s t-test [25]. Mean change in PABS-PT scores was adjusted for baseline score using an analysis of covariance (ANCOVA) [26]. Effect sizes were calculated with Hedges’ g with adjustment for small sample bias. Categorical outcomes were analysed using Fisher’s exact test for association [26]. Statistical significance was set at 0.05 and all effect estimates were provided with 95 % confidence intervals [25]. Descriptive statistics were used to report learner analytics.
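The Hedges' g calculation with small-sample bias adjustment can be sketched in a few lines. This is a standard-library illustration of the formula, and the example scores are made up, not trial data.

```python
import math

def hedges_g(x, y):
    """Standardised mean difference with Hedges' small-sample bias correction.

    Cohen's d (pooled-SD standardisation) multiplied by the approximate
    correction factor 1 - 3 / (4*df - 1), where df = n_x + n_y - 2.
    """
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # sample variance, group x
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)   # sample variance, group y
    sd_pooled = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    d = (mx - my) / sd_pooled                        # Cohen's d
    correction = 1 - 3 / (4 * (nx + ny) - 9)         # small-sample factor
    return d * correction

# Hypothetical knowledge scores for two small training groups.
g = hedges_g([24, 26, 28, 27, 25], [22, 23, 25, 24, 21])
```

The correction matters precisely in studies of this size, where uncorrected Cohen's d overstates the effect.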

Exploratory analyses

Evidence suggests that training preference may impact on user engagement and satisfaction [27]. Therefore, in a pre-specified sub group analysis we stratified our results according to training preference (received preference/had no preference versus did not receive preference). Additionally, we explored whether engagement with iBeST impacted on learning (higher engagement versus all others). Analyses were summarised descriptively (mean and standard deviation) and were not subject to statistical testing.

The interview study

Face-to-face semi-structured interviews were completed with iBeST-trained physiotherapists to explore their experiences of using iBeST. We aimed to interview all participants trained with iBeST in the RCT (n = 16).

The interview guide

The initial interview guide was informed by two recent systematic reviews [28, 29] and a theoretical framework of online learning strategies to enhance health care professionals’ learning experience [30]. The guide was modified as the analysis of interviews progressed to ensure it was responsive to the data and to enable exploration of emerging themes. One researcher (HR) completed all interviews, which were audiotaped, transcribed verbatim, and anonymised for analysis.

Interview data analysis

Interview transcripts were analysed using an inductive thematic analysis drawing upon constructivist grounded theory (open coding and constant comparison). Open coding identified the range of concepts used by participants and resulted in categories, ensuring identified themes were grounded in the data [31]. Constant comparison and close attention to deviant cases facilitated assessment of relationships between categories [32]. Transcripts were coded by the lead author (HR) and emerging codes were discussed with a senior researcher (EW), who independently coded three transcripts.

During analysis it became apparent that data relating to theme three (training impact) were congruent with Kirkpatrick’s training evaluation model, so we considered and contextualised this theme in relation to this model [19].

Data integration

Integration of quantitative and qualitative data is often cited as the heart of mixed methods research [33]. Both data sets were analysed concurrently, independently of each other, and were jointly interpreted. After assessing complementarity of the data sets, qualitative data were used to illuminate and expand upon quantitative outcomes to achieve a more comprehensive and meaningful understanding of the feasibility and acceptability of iBeST [34].


Results

RCT findings

A minimum of 235 health care professionals received the study advertisement through research network mailing lists (n = 220) and direct contact with NHS Trust physiotherapy managers (n = 15). Of these, 58 responded and 35 were recruited into this study from 8 NHS Hospital Trusts; they were subsequently randomised to receive iBeST (16 therapists) or a face-to-face workshop (19 therapists). Participant flow is provided in Fig. 1. Eighty-nine percent of participants completed training and provided follow up data (n = 31; iBeST: 15/16; workshop: 16/19). Twelve (34 %) therapists implemented the BeST programme in practice and thus were able to provide data for the clinical skills assessment.

Fig. 1

Randomised controlled trial participant flow

While the workshop group had a higher proportion of males, all remaining baseline characteristics from the randomised sample were broadly similar (Table 1). Although preferences for face-to-face training were well matched across the groups, there were small differences in those with no preference and with an online preference. However, the overall small sample size makes these difficult to evaluate. Additionally, due to the small numbers in some of the cells, for example ‘Training in CBT (answer: no)’, a difference of one person between the groups makes the proportion appear higher.

Table 1 Baseline characteristics of participants by allocation

Outcomes are reported in Table 2 and summarised below.

Table 2 Mean difference in outcome measures between groups


Knowledge

We found no statistically significant difference between training groups on knowledge (Mean Difference (MD) 0.97, 95 % CI −1.33, 3.26). Scores ranged from 19.5 to 30 in the iBeST group, and from 20 to 30.5 in the workshop group.

Clinical skills

As shown in Fig. 1, this outcome could only be assessed in the participants that delivered the BeST programme in clinical practice. Of these participants, we found no statistically significant difference between training groups (MD 0.17, 95 % CI −0.2, 0.54).


Self-efficacy

Table 2 shows that participants trained with iBeST reported lower self-efficacy to deliver the single BeST individual assessment session (MD 1.73, 95 % CI 0.43, 3.03). However, self-efficacy to deliver the majority of the BeST intervention, the six group sessions, was similar in both groups (MD 0.25, 95 % CI −1.7, 0.7). The score range for self-efficacy (individual assessment session) was reported as 4.4 to 9.7 in the workshop group, and 1.5 to 9.7 in the iBeST group.


Attitudes and beliefs

There was a large and statistically significant between group difference observed in the PABS-PT biomedical subscale in favour of the workshop (MD −7.43, 95 % CI −10.97, −3.89), indicating that workshop participants held less of an orientation towards a biomedical treatment approach after training. We observed a small between group difference in favour of the workshop on the PABS-PT psychosocial subscale, suggesting a greater psychosocial orientation after training, although this difference was not statistically significant (MD: 3.35, 95 % CI: −0.19, 6.89).

Training satisfaction

We observed a statistically significant between group difference of nearly 1 point on the 5-point scale in favour of the workshop group (MD 0.95, 95 % CI 0.52, 1.39). The majority of iBeST users were ‘satisfied’, and the majority of workshop participants were ‘very satisfied’.


Use of the BeST programme in practice

Twelve of thirty-five therapists (34 %) delivered the BeST programme in their practice (iBeST 5/16, 31 %; workshop 7/19, 37 %) at five of eight sites (62 %). Every site had participants from both training groups and we saw no significant difference in the proportion of therapists delivering the BeST programme by training method (p = 0.41). Participants delivering the intervention were more senior than those not delivering the intervention, being older (p = 0.013) and having worked for longer in their profession (p = 0.001; 95 % CI −22.0, −4.0).

iBeST engagement

The mean time spent using the online course was 6 h 48 min (range 1:32 to 15:49 (h:mm)). Overall, compliance with the online programme was high. Half the participants (n = 8) completed 100 % of the course. Three participants accessed less than 50 % of each module. No participants used the online discussion forum.

Exploratory analyses

Participants allocated to their preference were the most satisfied in both training arms. Higher engagement with the course (according to the learner analytics) corresponded with higher knowledge scores (MD (95 % CI) 3.06 (0.08, 6.03)), greater self-efficacy to deliver the majority of the programme (MD (95 % CI) 1.97 (0.27, 3.67)), larger increases in PABS-PT psychosocial subscale score (MD (95 % CI) 2.26 (−1.1, 7.01)), and greater decreases in PABS-PT biomedical subscale score, as desired (MD (95 % CI) −3.51 (−9.26, 2.23)).

Interview study

The fifteen therapists who completed iBeST in the RCT (of 16 randomised) were invited to take part in an interview, eight of whom consented to be interviewed. Figure 2 details participant flow through the interview study.

Fig. 2

Interview study participant flow. Legend: *Attempts were made via email in the first instance and subsequently via telephone. Efforts to contact participants ceased after four attempts

The sample captured a range of characteristics including age, prior CBT training, satisfaction with the training, and prior training preference (Table 3). All interviewed participants were classified as having higher engagement with iBeST and were female. Data analysis revealed three overarching themes: (1) preconceptions of online learning, (2) reflections on training experience with iBeST, and (3) impact of training with iBeST. These themes are presented in Table 4 with sub-categories and supporting participant quotes, and are described below.

Table 3 Interview study participant characteristics
Table 4 Themes, sub-categories and quotes from interview participants

Preconceptions of online learning (prior to iBeST)

Participants were initially sceptical that iBeST could provide the training needed to deliver the BeST programme. This scepticism arose from negative past experiences with online training, from their professed learning style, and from the perceived nature of the skills required to deliver BeST (such as needing to practice the group format). For example, prior to the training, participant #226 thought it was “ridiculous to learn BeST online as you needed to be able to interact with people”.

Reflections on training experience with iBeST (during iBeST)

Therapists identified a number of barriers to engaging with iBeST that were categorised into external and internal factors. Externally, barriers included technical difficulties in gaining access to the online training programme due to out of date web browsing software on NHS Trust computers, distractions when working in the home environment, and trying to prioritise the training over other aspects of their work load. Internal barriers included sitting for long periods, self-discipline not to skip sections and ‘cheat’, lone working without the capacity for any face-to-face discussion, concentration on a computer screen, and their openness to change. For example, participant #337 noted that you could “take quite a lot of shortcuts” with an online course, skipping the course content and “going straight to the test”. Conversely, one therapist felt this training method allowed greater integration and application of learning within their clinical practice.

Impact of iBeST training

Therapist reactions

The majority of participants completing iBeST (n = 11 of 15; 73 %) were satisfied with the training, and found it engaging. For example, participant #257 said “I don’t normally go home and do any work but I didn’t find it a problem going home and keeping going because it was interesting.” For one participant (#258) who was unsatisfied with the training, iBeST was unable to provide the desired level of clinical skills practice. The remaining three participants were neither satisfied nor dissatisfied with iBeST.

Therapist learning

Participants referred to improvements in their knowledge of persistent pain and made reference to holding a better understanding of behavioural skills, such as pacing and goal setting. For example, participant #289 found goal setting “…quite interesting…because I don’t actually do goal setting…I’m probably getting better at that.” Participants also discussed learning of more practical skills. For example, participant #337 spoke about use of a facilitative delivery style:

“I think that’s certainly made me think about it differently, starting to think, “Well, these are the exercises we’d maybe like to do, but it’s up to you to choose where to start,” and I like that side of it.”

Participants reported varying degrees of self-efficacy to deliver the BeST programme. While two participants felt very confident to deliver BeST, “I think having had the training in it…I felt much more confident to deliver it” (#258), the majority were less confident, particularly around unfamiliar topics, such as delivering the initial individual session “…we don’t really feel confident doing the individual session because it’s so different…” (#289).

Of the participants that discussed their attitudes and beliefs towards the management of LBP, the majority reflected biomedical attitudes: “…I don’t necessarily think that we can just put them straight into that group…because obviously there’s going to be lots of muscle dysfunctions and joint stiffness…” #226.

Therapist behaviour

Participants were anxious about transferring new knowledge and skills to practice, particularly for cognitive-behavioural topics, such as thoughts and feelings. For example, discussing this topic, participant #289 said: “…it was obvious I was rubbish at it…and even once I knew I was wrong, I couldn’t necessarily see why.”


Discussion

Interpretation of results

This is the first study to explore an online method to build competency in physiotherapists’ use of a CB approach. The mixed methodology not only enabled us to quantify how iBeST performed across a range of learning outcomes; it also provided insight into contextual and unanticipated experiences, captured through interviews. This study did not find large or important differences in outcomes of knowledge and clinical skills, suggesting that iBeST may provide sufficient knowledge and skills training to deliver the BeST programme. However, we did observe large changes in attitudes towards the management of LBP among participants in the workshop group, which were not replicated by those trained with iBeST. Importantly, while participants trained with iBeST were not as highly satisfied as those in the workshop, the majority were still satisfied and found the training method acceptable, providing evidence for the continued use of iBeST. Through integration of qualitative and quantitative methods, we identified strategies that could enhance future versions of iBeST. In particular, we identified strategies that may help to (i) improve user satisfaction and engagement, (ii) improve self-efficacy to use a CB approach, and (iii) achieve a greater change in attitudes and beliefs similar to that observed in the face-to-face workshop.

Findings in relation to the literature

The similar results on knowledge and skills between both training methods are in line with the largest systematic review to date in this area [13], which reported that online methods were as effective as alternative forms of training for these outcomes across different health care professions and settings. Further, there has been a growing body of evidence advocating online training as a comparable, if not superior, method for delivering training to health professionals in complex interventions. For example, Dimeff et al. [35] randomised 150 psychologists to receive either a written manual, an online course, or a face-to-face workshop in Dialectical Behaviour Therapy. The online training resulted in a significantly greater gain in knowledge, with no other difference between the online and face-to-face training, and with both of these methods outperforming the manual. Similarly, Maloney et al. [36] evaluated the effectiveness of online versus face-to-face training for the prescription of falls prevention exercise. They classified this as a complex intervention since it incorporated a broad range of practical skills including decision making, hands-on skills, and high-level communication. They randomised a range of health care professionals (n = 135) to the two training arms and found no differences between the two groups, reporting comparable satisfaction, knowledge scores, and self-reported changes in clinical practice.

With regards to learner engagement, the literature suggests that more engaged users achieve better learning outcomes [37]. Our pragmatic division of engagement scores also supports this basic learning principle. This study expands current knowledge concerning user engagement, identifying barriers to engagement, such as prioritising training (due to increased flexibility) and self-discipline, that are applicable to all authors of online materials. In addition to the influence of these identified factors on engagement, the presence of face-to-face training itself may have influenced engagement and satisfaction with iBeST among learners perceiving the workshop to be more useful [29].


Limitations

This feasibility study is limited by its small sample, particularly for the assessment of clinical skills, and while the statistical results give an indication of potential effect, the study was not powered to determine effectiveness. The small sample size also makes it difficult to confirm that randomisation achieved a balanced distribution across the groups; however, the randomisation procedure was implemented without difficulty and there was no evidence of subversion. The quantitative results should therefore be interpreted cautiously, in the context of a feasibility study providing initial data from which to enhance and further evaluate iBeST. Another limitation is that the assessment of clinical skills from audio recordings was conducted by an unblinded rater; to reduce potential bias, 25 % of recordings were additionally assessed by a blinded rater. As with other learning courses, the knowledge test was bespoke and, while we assessed face validity with experts in the field, its other clinimetric properties are not known. Lastly, uptake of the BeST programme in clinical practice was low in both groups, indicating that neither form of training was adequate, in its current format, to support use in practice. However, since implementation was simply observed rather than specifically targeted, we consider it encouraging that at least a third of participants in both groups adopted the programme in clinical practice.

With respect to the interview study, we used a volunteer sample that may not have been fully representative of the whole sample: our data suggested that the volunteers were more engaged with the online programme than those who declined to take part. Additionally, all interviewees were female, further reducing the representativeness of the sample.

Implications and future work

The results from this evaluation suggest that online learning is a feasible and acceptable method for providing training on a large scale in evidence-based complex interventions such as the BeST programme. The mixed methods employed in this study identified areas where iBeST needs to be enhanced, and a relevant user group (physiotherapists) provided information on how it could be improved. In particular, qualitative data indicated that expanding the education on specific cognitive behavioural elements and including more skills-based practice would improve therapists’ self-efficacy and satisfaction. This finding is consistent with one of the few studies to explore physiotherapists’ use of a CB approach in clinical practice, which found that, despite a 4-day face-to-face workshop with ongoing mentorship, physiotherapists had difficulty adopting some cognitive behavioural elements of a CB programme for patients with osteoarthritis [38]. These results highlight the need to improve upon current strategies for teaching and supporting physiotherapists to deliver a CB approach, regardless of training method.

Future work should refine and further evaluate iBeST based on feedback from participants in this study. Additionally, evidence-based implementation strategies should be explored to ascertain how iBeST can be enhanced to support implementation and maximise its impact on clinical practice.


Conclusions

This evaluation suggests that iBeST is a viable option for widespread dissemination of the BeST training and programme materials, and that it warrants further refinement and evaluation. Using mixed methods enabled us to identify and explore key areas of iBeST that need enhancing. Given the urgent need to manage LBP with a CB approach, iBeST provides a promising avenue for building physiotherapists’ competency in LBP management on a large scale and tackling this growing public health concern.


Abbreviations

CB, cognitive behavioural; CI, confidence interval; Hr, hours; LBP, low back pain; MD, mean difference; Min, minutes; NHS, National Health Service; NICE, National Institute for Health and Care Excellence; PABS-PT, Pain Attitudes and Beliefs Scale for Physiotherapists; RCT, randomised controlled trial; UK, United Kingdom


References

  1. Balagué F, Mannion AF, Pellisé F, Cedraschi C. Non-specific low back pain. Lancet. 2012;379(9814):482–91.

  2. Airaksinen O, Brox J, Cedraschi C, Hildebrandt J, Klaber-Moffett J, Kovacs F, Mannion A, Reis S, Staal J, Ursin H. Chapter 4. European guidelines for the management of chronic nonspecific low back pain. Eur Spine J. 2006;15:s192.

  3. Savigny P, Watson P, Underwood M. Early management of persistent non-specific low back pain: summary of NICE guidance. BMJ. 2009;338.

  4. Richmond H, Hall AM, Copsey B, Hansen Z, Williamson E, Hoxey-Thomas N, Cooper Z, Lamb SE. The effectiveness of cognitive behavioural treatment for non-specific low back pain: a systematic review and meta-analysis. PLoS One. 2015;10(8):e0134192.

  5. Carvell J. The spinal services taskforce report “commissioning spinal services—getting the service back on track”. The Department of Health Spinal Taskforce; 2013.

  6. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7:50.

  7. Herschell AD, Kolko DJ, Baumann BL, Davis AC. The role of therapist training in the implementation of psychosocial treatments: a review and critique with recommendations. Clin Psychol Rev. 2010;30(4):448–66.

  8. Lamb SE, Hansen Z, Lall R, Castelnuovo E, Withers EJ, Nichols V, Potter R, Underwood MR. Group cognitive behavioural treatment for low-back pain in primary care: a randomised controlled trial and cost-effectiveness analysis. Lancet. 2010;375(9718):916–23.

  9. Lamb SE, Lall R, Hansen Z, Castelnuovo E, Withers EJ, Nichols V, Griffiths F, Potter R, Szczepura A, Underwood M. A multicentred randomised controlled trial of a primary care-based cognitive behavioural programme for low back pain. The Back Skills Training (BeST) trial. Health Technol Assess. 2010;14(41):1–253, iii–iv.

  10. Lamb SE, Mistry D, Lall R, Hansen Z, Evans D, Withers EJ, Underwood MR. Group cognitive behavioural interventions for low back pain in primary care: extended follow-up of the Back Skills Training Trial (ISRCTN54717854). Pain. 2012;153(2):494–501.

  11. Knox CR, Lall R, Hansen Z, Lamb SE. Treatment compliance and effectiveness of a cognitive behavioural intervention for low back pain: a complier average causal effect approach to the BeST data set. BMC Musculoskelet Disord. 2014;15:17.

  12. Hansen Z, Daykin A, Lamb SE. A cognitive-behavioural programme for the management of low back pain in primary care: a description and justification of the intervention used in the Back Skills Training Trial (BeST; ISRCTN 54717854). Physiotherapy. 2010;96(2):87–94.

  13. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA. 2008;300(10):1181–96.

  14. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.

  15. Maniadakis N, Gray A. The economic burden of back pain in the UK. Pain. 2000;84(1):95–103.

  16. Mayes T, De Freitas S. Review of e-learning theories, frameworks and models. JISC e-learning models desk study. 2004. p. 1.

  17. Hansen Z. The competence of physiotherapists to deliver a cognitive behavioural approach for low back pain. Warwick: University of Warwick; 2014.

  18. Hansen ZKJ, Lamb SE. Adaptation and psychometric testing of the CTS-R for use in chronic low back pain. Leeds: British Association of Behavioural and Cognitive Psychotherapies Annual Conference; 2012.

  19. Kirkpatrick DL. Techniques for evaluating training programs. Classic Writings Instructional Technol. 1996;1(192):119.

  20. Ajzen I. Theory of planned behavior. Handb Theor Soc Psychol. 2011;1:438.

  21. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7(1):37.

  22. Houben R, Gijsen A, Peterson J, De Jong P, Vlaeyen J. Do health care providers' attitudes towards back pain predict their treatment recommendations? Differential predictive validity of implicit and explicit attitude measures. Pain. 2005;114(3):491–8.

  23. Sim J, Lewis M. The size of a pilot study for a clinical trial should be calculated in relation to considerations of precision and efficiency. J Clin Epidemiol. 2012;65(3):301–8.

  24. Julious SA. Sample size of 12 per group rule of thumb for a pilot study. Pharm Stat. 2005;4(4):287–91.

  25. Bland M. An introduction to medical statistics. Oxford: Oxford University Press; 2000.

  26. Field A. Discovering statistics using IBM SPSS statistics. London: Sage; 2013.

  27. King M, Nazareth I, Lampe F, Bower P, Chandler M, Morou M, Sibbald B, Lai R. Conceptual framework and systematic review of the effects of participants' and professionals' preferences in randomised controlled trials. National Coordinating Centre for Health Technology Assessment; 2005.

  28. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Instructional design variations in internet-based learning for health professions education: a systematic review and meta-analysis. Acad Med. 2010;85(5):909–22.

  29. Wong G, Greenhalgh T, Pawson R. Internet-based medical education: a realist review of what works, for whom and in what circumstances. BMC Med Educ. 2010;10(1):12.

  30. Carroll C, Booth A, Papaioannou D, Sutton A, Wong R. UK health care professionals' experience of online learning techniques: a systematic review of qualitative data. J Contin Educ Health Prof. 2009;29(4):235–41.

  31. Charmaz K. Constructionism and the grounded theory method. In: Handbook of constructionist research. 2008. p. 397–412.

  32. Bernard HR, Ryan GW. Analyzing qualitative data: systematic approaches. Thousand Oaks: Sage; 2010.

  33. Bazeley P. Editorial: Integrating data analyses in mixed methods research. J Mix Methods Res. 2009;3(3):203–7.

  34. Bryman A. Barriers to integrating quantitative and qualitative research. J Mix Methods Res. 2007;1(1):8–22.

  35. Dimeff LA, Koerner K, Woodcock EA, Beadnell B, Brown MZ, Skutch JM, Paves AP, Bazinet A, Harned MS. Which training method works best? A randomized controlled trial comparing three methods of training clinicians in dialectical behavior therapy skills. Behav Res Ther. 2009;47(11):921–30.

  36. Maloney S, Haas R, Keating JL, Molloy E, Jolly B, Sims J, Morgan P, Haines T. Effectiveness of Web-based versus face-to-face delivery of education in prescription of falls-prevention exercise to health professionals: randomized trial. J Med Internet Res. 2011;13(4):e116. doi:10.2196/jmir.1680.

  37. Morrison C, Doherty G. Analyzing engagement in a web-based intervention platform through visualizing log-data. J Med Internet Res. 2014;16(11):e252.

  38. Nielsen M, Keefe FJ, Bennell K, Jull GA. Physical therapist–delivered cognitive-behavioral therapy: a qualitative study of physical therapists’ perceptions and experiences. Phys Ther. 2014;94(2):197–209.



Acknowledgements

We would like to acknowledge Bethan Copsey for her assistance in the interpretation of data and help with reviewing drafts of the manuscript, and Catherine Richmond who held the concealed allocation sequence and randomised therapists in the RCT.


Funding

This work was completed as part of a doctoral thesis that was funded by the West Midlands Strategic Health Authority. Further support for this work was provided by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) Oxford at Oxford Health NHS Foundation Trust. The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health. No funders had any involvement in the design, data collection, analysis or interpretation of the work.

Availability of data and materials

The quantitative data sets supporting the results of this article are included within the article and its additional files (Additional files 1 and 2). Qualitative data (interview transcripts) contain personal identifiable information and consent was not obtained to publicly share their transcripts.

Authors’ contributions

HR, SL, EW, ZH, DD contributed to the conception and design of this work. ZH delivered the face-to-face training. HR performed all data collection. HR, AH, SL, EW, and DD contributed to the analysis and interpretation of data. HR drafted the manuscript. AH, SL, and EW helped to draft the manuscript. All authors read and approved the final manuscript.

Authors’ information

HR has a background in physiotherapy and is currently an early career researcher. She has training in a CB approach and led the development of the online training software (iBeST) studied in this article. Her epidemiological stance as a researcher is pragmatic, believing in methodological pluralism with no allegiance to one particular school of thought. In line with this pragmatic stance, HR appreciates the strengths and weaknesses of both qualitative and quantitative approaches and advocates that the research question should determine the methods that are employed. Thus, while believing that it is essential to have objective methods to measure and quantify phenomena, she does not believe it is possible to study and understand phenomena with pure objectivity, free from any subjective influence.

Competing interests

One author trains therapists in the use of a CB approach (ZH). Five authors have published in the field of a CB approach for LBP (SL, ZH, EW, HR, AH).

Consent for publication

All participants provided consent for their data to be published.

Ethics approval and consent to participate

Ethical approval was granted by the University of Warwick’s Biomedical and Scientific Research Ethics Committee (BSREC; reference number 244-10-2012). All participants provided written informed consent, and verbal consent was additionally audio recorded prior to each interview.

Author information



Corresponding author

Correspondence to Helen Richmond.

Additional files

Additional file 1:

Randomised controlled trial baseline dataset.pdf. Baseline quantitative dataset for participants in the randomised controlled trial. (PDF 75 kb)

Additional file 2:

Randomised controlled trial follow-up dataset.pdf. Follow-up quantitative dataset for participants in the randomised controlled trial. (PDF 72 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Richmond, H., Hall, A.M., Hansen, Z. et al. Using mixed methods evaluation to assess the feasibility of online clinical training in evidence based interventions: a case study of cognitive behavioural treatment for low back pain. BMC Med Educ 16, 163 (2016).

Keywords
  • Low back pain
  • Cognitive behavioural
  • Online training
  • Implementation
  • Dissemination
  • Physiotherapy
  • Mixed methods
  • E-learning
  • Evidence-based practice
  • Psychological