Translating evidence-based guidelines to improve feedback practices: the interACT case study
© Barton et al. 2016
Received: 8 July 2015
Accepted: 26 January 2016
Published: 9 February 2016
There has been a substantial body of research examining feedback practices, yet the assessment and feedback landscape in higher education is described as ‘stubbornly resistant to change’. The aim of this paper is to present a case study demonstrating how an entire programme’s assessment and feedback practices were re-engineered and evaluated in line with evidence from the literature in the interACT (Interaction and Collaboration via Technology) project.
Informed by action research, the project conducted two cycles of planning, action, evaluation and reflection. Four key pedagogical principles informed the re-design of the assessment and feedback practices. Evaluation activities included document analysis, interviews with staff (n = 10) and students (n = 7), and student questionnaires (n = 54). Descriptive statistics were used to analyse the questionnaire data. Framework thematic analysis was used to develop themes across the interview data.
InterACT was reported by students and staff to promote self-evaluation, engagement with feedback and feedback dialogue. Streamlining the process after the first cycle of action research was crucial for improving engagement of students and staff. The interACT process of promoting self-evaluation, reflection on feedback, feedback dialogue and longitudinal perspectives of feedback has clear benefits and should be transferable to other contexts.
InterACT has involved comprehensive re-engineering of the assessment and feedback processes, using educational principles to guide the design and taking into account stakeholder perspectives. These principles and the strategies to enact them should be transferable to other contexts.
Keywords: Feedback, Assessment, Self-regulation, Online distance learning, Higher education, Medical education, Dialogue, Postgraduate, Action research
Guidelines for improving feedback practices abound. Nicol and Macfarlane-Dick  outline seven principles of good feedback, including facilitating reflection and self-regulation in learning and encouraging teacher-learner dialogue. Jisc  adapted these into six principles to guide assessment for learning design, one being that feedback leads to improvement and stimulates dialogue. Boud and Associates  have encouraged curriculum developers and teachers to consider an alternative set of seven guidelines for improving the educational effect of assessment: engaging the learner, promoting active involvement with feedback, and placing assessment for learning at the centre of subject and programme design. In a comprehensive systematic review, Evans  distilled twelve educational principles (EPs) of effective feedback and feed forward. The difficulty, however, lies in translating such guidelines into practice.
According to Price and colleagues  the current problem with assessment design is a result of oversimplification and poor decision-making. This is despite the availability of guiding principles for assessment and feedback design. For example, a study of teaching staff in a biochemistry course identified that instructors do not change their assessment practices despite more sophisticated design thinking . More worrying is a comprehensive report describing the assessment and feedback landscape in Higher Education in the UK as ‘stubbornly resistant to change’ . Dawson et al  argue that to reduce the gap between idealised and real assessment practices, the academic community needs to engage with real, contextualised assessment decision-making. We present the following case study with this in mind.
The original programme, at the Centre for Medical Education, University of Dundee, like many others in Higher Education , utilised a monologic information transmission approach to feedback. Providing written feedback on summative assignments at the end of a module meant that as academics we did not know whether our students read, understood and utilised the feedback. Academics spending hours crafting feedback that is not read or used is an inefficient use of time. We needed a better assessment approach that promoted engagement with feedback and encouraged students towards self-regulation [10, 11]. The overall approach we developed was based on students’ self-evaluation of their own work against assessment criteria, written feedback leading to supported reflection on feedback, and student-tutor dialogue .
In this paper we showcase knowledge translation in relation to dialogic feedback and explore the challenges and insights gained as a research team, which others working with programmatic assessment may find valuable. We present a case study demonstrating how a curriculum development team re-engineered an entire programme’s assessment and feedback practices, in line with good practice recommendations, using action research. A socio-constructivist perspective on feedback is increasingly being encouraged in the literature [11, 13], where learning occurs through student engagement and the development of new understandings through dialogue and participation . Stemming from our dissatisfaction with unidirectional written feedback practices, we adopted the perspective that feedback should be a communicative act and a social process . Nicol argues that ‘feedback should be conceptualised as a dialogical and contingent two-way process that involves co-ordinated teacher-student and peer-to-peer interaction as well as active learner engagement’ (, p. 503). The main purpose of feedback, we posit, is to develop students’ ability to monitor, evaluate and regulate their learning. The interACT (Interaction and Collaboration via Technology) project tackles the problems of monologic feedback transmission and the isolation felt by both students and assessors in an online distance learning programme in medical education .
The interACT project was implemented within the online Medical Education programme at the University of Dundee’s Centre for Medical Education. The programme enrols between 450 and 500 postgraduate students per year onto its 60-credit Certificate, 120-credit Diploma or 180-credit Masters courses. The courses are made up of 15-credit modules, of which the Certificate and the Diploma each have two core modules. The Masters consists of a further 12,000–15,000-word dissertation. In 2013, the majority of the students on the programme were medics; 54 % were male and 70 % were from the UK and EU.
Summary of the action research cycles, objectives and methods
Cycle One (01/09/11 to 30/08/12)
- Planning, problem identification (01/09/11 to 29/04/12): review of current processes and practice
- Action, development of materials and workflow (01/09/11 to 29/04/12): development and technical testing of interACT
- Monitoring of pilot roll out (30/04/12 to 30/08/12): piloting of interACT
- Reflection on pilot phase (30/04/12 to 30/08/12): evaluation and reflexivity

Cycle Two (01/09/12 to 31/08/13)
- Planning (01/09/12 to 31/12/12): revision of interACT
- Action (01/09/12 to 31/01/13): development of modifications to interACT
- Monitoring (01/02/13 to 31/05/13): implementation of revised interACT
- Reflection (06/06/13 to 31/08/13): evaluation and reflexivity
Cycle 1a: Planning, problem identification (01/09/11 to 29/04/12)
In the planning phase we conducted a document analysis to better understand the problems associated with assessment and feedback in our programme. The authors conducted a textual analysis of: relevant sections from external examiner reports (2006–2011); end of course evaluations (2006–2011); additional evaluation surveys conducted separately in 2010 and 2011 as part of a major curriculum review; and the Postgraduate Taught Student Experience survey. Using a narrative review approach , we identified five key problems in relation to assessment and feedback which we sought to address in the interACT project . These were: 1) inconsistency in the quality and quantity of feedback provided; 2) assessment design (e.g., over-reliance on essays, limited opportunities for formative assessment); 3) timeliness of the feedback; 4) lack of assessment and feedback dialogue; and 5) isolation of students and tutors.
Feedback should be dialogic in nature
Feedback is often viewed as something that is ‘given’ to a student to correct their errors, whereas it should be seen as a process of communication, an on-going, evolving dialogue [14, 19]. Simply ‘telling’ a student about their performance does not ensure they have listened (or read the feedback), understood or acted upon it. Feedback should be seen as a social act between individuals, imbued with power, identity and gender, and taking into account respective ideas, feelings and points of view . Feedback dialogues were specifically built into the interACT process.
Assessment design should afford opportunities for feedback to be used in future assignments
Feedback should not be viewed as a single occurrence but as a series of pedagogical opportunities within a programmatic approach, enabling evidence of learning from feedback to be documented and helping feedback to improve learners’ future work . Hattie and Timperley’s  model highlights feedforward, related to the question ‘where to next?’, as crucial for learning. Assessment sequencing, formulation of action plans for future work and articulation of how previous feedback informed the current assignment were embedded into the new process.
Students should be empowered to seek feedback from different sources
This principle fits in with capabilities for life-long learning where graduates are required to seek external, credible sources of data to inform their performance and progress . Boud and Soler  argue that sustainable assessment (i.e., assessment that promotes lifelong learning) encourages students to make conscious comparisons between their self-evaluations and judgements by teachers, peers and other stakeholders. Research has shown that students make more complex improvements to their work after receiving feedback from multiple sources . Seeking feedback from the tutor on a specific aspect of their work promotes active reflection on the quality of the work, encourages students to define learning goals and prompts the tutor to discuss specific aspects that may not be crucial to the assessment criteria but are important to the student.
Feedback should develop evaluative judgements and monitoring of own work
Learning is enhanced when learners are self-regulating, actively engaged in setting learning goals, selecting strategies for achieving these goals and monitoring their progress toward these goals . Reflecting on feedback and processing it through self-explanation has been shown to improve self-monitoring and evaluation . InterACT prompted students to self-evaluate their assignments before submission against the assessment criteria and in comparison to tutor feedback.
Cycle 1b: Action, development of materials and workflow (01/09/11 to 29/04/12)
With the four EPs and a better idea of the problems faced in our programme, we set out to design the interACT processes and materials. A full blueprint of the programme assessment was drawn up alongside a description of the standards, the criteria for each assessment and the individuals responsible for providing feedback . Assignments were sequenced using the ESCAPE tool . The tool is a simple timeline in which formative and summative assessments are represented by different colours to visualise the overall sequencing and flow of assignments across an entire programme. Thus the research team were able to analyse the mix of formative and summative assignments and how the assignments were sequenced to enable feedforward (EP 2). The overlap of this project with an ongoing curriculum review enabled the revision of the assessment structure to improve sequencing of assignments, to reduce the reliance on essays and to introduce more formative tasks, as recommended by several authors in this field [23, 28–30].
The reflection on feedback journal entry for each assignment asked students four questions:
- How well does the tutor feedback match with your self-evaluation?
- What did you learn from the feedback process?
- What actions, if any, will you take in response to the feedback process?
- What, if anything, is unclear about the tutor feedback?
The tutor is then automatically alerted via email when a student has completed their reflection on feedback journal entry for each assignment. The email contains a direct link to the student’s reflection allowing efficient continuation of the dialogue (EP 1).
Cycle 1c: Monitoring of pilot roll out (30/04/12 to 30/08/12)
The interACT process was introduced on the 30th April 2012 across the entire programme. Students enrolling on the programme received information about interACT in the induction module. They were educated about feedback and the reflection on feedback process through an outline of the research evidence and an explanation of how our strategies enacted the research principles. This was crucial for establishing buy-in from students and helping them get the most out of the process. Alongside this we introduced clear instructions and screencasts explaining the process to students and staff. For 4 months, engagement rates with the process were calculated and queries from students were collated to inform the development of an FAQ section to be included with the interACT instructions for students on Blackboard™.
Cycle 1d: Reflection on pilot phase (30/04/12 to 30/08/12)
After the 4-month introductory phase, which comprised the first of the action research cycles, the research team met to reflect on the implementation in terms of workflow and technical difficulties. A number of enhancements were made to the process, including: reducing the number of steps required of both students and staff; introducing automatic alerts allowing central management; and informing students by email when their assignments had been marked. The project learning technologist also recorded several screencasts (developed using Articulate screenr℠) to help direct students in the use of the reflective journal; these were included in the email. Since these changes were made, queries to the administrative team have dropped dramatically, particularly for the most common questions about using the reflective journal (e.g., how to upload an assignment).
Cycle 2a: Planning (01/09/12 to 31/12/12)
Figure 1 demonstrates the new model for assessment and feedback, within CME, which is intended to lead to meaningful student-tutor interaction. Re-structuring of the course assessment and feedback process included a reduction in the overall number of assignments, allowing additional time to be spent on feedback dialogue and formative tasks.
Cycle 2b: Action (01/09/12 to 31/01/13)
The interACT process was streamlined following further development and technical testing of revisions and improvements, and the revised longitudinal feedforward assessment was implemented across the Certificate, Diploma and Masters programmes. For example, an administrator now subscribes to student wikis so that automatic alerts of any changes to a wiki are generated and forwarded to the appropriate tutor.
Cycle 2c: Monitoring (01/02/13 to 31/05/13)
Aims of the evaluation, types of data and data collection methods:

- To evaluate engagement with the interACT process (cover page and feedback journal). Data: quantitative (feedback engagement survey). Method: feedback engagement audit.
- To evaluate the impact of interACT on student satisfaction and perceived value to their learning, as well as challenges and enablers to engaging with interACT. Data: quantitative (impact on workload, satisfaction with interACT); qualitative (perceptions of improvement in self-review ability, affective feelings of motivation or isolation, and recommendations for change). Methods: semi-structured interviews; online survey (via Bristol Online Survey); end of module evaluation report.
- To evaluate the impact of interACT on staff satisfaction and perceived value to their learning, as well as challenges and enablers to engaging with interACT. Data: qualitative (experiences with interACT, satisfaction and recommendations for change). Methods: semi-structured interviews; external examiner report.
- To evaluate the impact of interACT on administrative staff satisfaction and workload, as well as challenges and enablers to engaging with interACT. Data: qualitative (experiences with interACT, and the nature and number of questions received from students about assessment and feedback).
- To evaluate transferability of the interACT project to the wider HE community. Data: qualitative (feedback, shared ideas and experiences). Method: engagement with workshops/webinar.
Feedback journal entries of all students on the online course were examined and engagement rates with the reflective journal were calculated for three consecutive 4-month periods, one prior to the streamlining changes (4 months into the process) and two following these changes. This was conducted to establish the numbers of students who were participating in the non-compulsory reflection element of the interACT process and to gauge whether rates remained the same as students progressed throughout the programme or decreased due to time pressures, apathy or lack of perceived value.
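The periodised audit described above amounts to counting, for each 4-month window, the share of submitted assignments whose optional reflection was completed. A minimal sketch follows; the record format, period names and sample data are illustrative assumptions, not the project's actual dataset.

```python
from datetime import date

# Audit windows: one pilot period and two periods after the streamlining changes
# (boundaries taken from the evaluation periods described in the text).
periods = [
    ("pilot",         date(2012, 4, 30), date(2012, 8, 31)),
    ("post-change 1", date(2012, 9, 1),  date(2012, 12, 31)),
    ("post-change 2", date(2013, 1, 1),  date(2013, 4, 30)),
]

# Hypothetical records: (assignment submission date, reflection completed?)
submissions = [
    (date(2012, 6, 15), False),
    (date(2012, 7, 2),  True),
    (date(2012, 10, 5), True),
    (date(2013, 2, 20), True),
]

def engagement_rates(records, periods):
    """Share of submitted assignments with a completed reflection, per period."""
    rates = {}
    for name, start, end in periods:
        in_period = [done for d, done in records if start <= d <= end]
        rates[name] = sum(in_period) / len(in_period) if in_period else None
    return rates

print(engagement_rates(submissions, periods))
```

Comparing the rate for the pilot window with the two later windows is what indicates whether engagement held steady or decayed as students progressed.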
Students were invited via email from the project officer to participate in evaluation interviews about the interACT process. A purposive sample of seven current online distance students at different stages of their studies was selected, and in-depth semi-structured interviews were conducted by RA and KB to better understand their perceptions of and experience with the interACT process. Questions were asked about: the purpose of feedback; the perceived value of interACT; ease of use; time required; and suggestions to improve the design. Interviews were recorded and transcribed. Data analysis was informed by thematic framework analysis , starting with reading of the transcripts, negotiation of the thematic coding framework between the two researchers (KB and RA) and coding of the entire data set, followed by development of themes. The interviews were 25 min on average and ranged from 17 to 37 min in duration.
The themes identified in the interviews were used to inform the development of an online questionnaire, which was sent to all students on the online course who had completed the induction module (n = 487). The online questionnaire used Bristol Online Surveys™ and ran from 20th December 2012 to 31st January 2013. The questionnaire included four sections: number of assignments submitted with the interACT process and technical difficulties experienced; reflections on the cover page (positive and challenging aspects); reflections on the journal (positive and challenging aspects); and overall satisfaction with the process. The questionnaire was piloted with postgraduate research students and minor modifications were made to the wording. Students were emailed the link to the survey via their @dundee.ac.uk email address on 20th December with reminders being sent on the 11th and 25th January. Descriptive and thematic framework  analyses of the data were undertaken on the quantitative and open-ended questions respectively.
In addition, 10 staff members (eight tutors and two assessment administrators, all the staff involved with the online programme at the time) were interviewed by KB to ascertain their views on the interACT process in terms of their engagement with it and whether it encouraged dialogic feedback. Questions were asked on: the purpose of feedback; experiences of interACT; how worthwhile they felt the process was; and suggestions on how to improve engagement with students. Data analysis was informed by framework analysis , starting with reading of the transcripts, negotiation of the thematic coding framework and coding of the entire data set, followed by development of themes by KB and RA . The interviews were 35 min on average and ranged from 22 to 43 min in duration.
Cycle 2d: Reflection (01/06/13 to 31/08/13)
Data from Cycle 2 of the project were compared with data from Cycle 1 to determine the impact of the revisions to our assessment and feedback process. Expert advice was sought from members of the reference group regarding the findings of the project and future sustainability. The reflection process highlighted the need for an educational package to be developed for students and staff to improve assessment literacy (discussed below).
[Figure: Students engaging with the reflective journal feedback process, by module (Teaching and Learning; Principles of Assessment), across three periods: 30th April to 31st Aug 2012; 1st Sept to 31st Dec 2012; 1st Jan to 30th April 2013]
Responses from students were mostly positive, with students commenting on the value of the interACT process to their learning. The students reported valuing the structure provided by the cover page and the opportunity to reflect on what they had done relative to the assessment criteria. They also valued the opportunity for dialogue with staff about their work brought about by the reflective journal.
Purpose of feedback
The purpose of feedback is to help regulate my own learning decisions, as I understand it, and to give me some idea of where I am going in relation to where I am supposed to be going and help me adjust my decisions and my learning efforts to meet those goals. Interview Student 4
Structure of feedback process
It gives a structure to the feedback doesn’t it, otherwise you just ask ‘what do you think about the assignment?’ you’re just going to get ‘it was Ok’ from most people. Interview Student 7
Reflection and learning
[interACT] did force my reflective process in the end to have one last look without changing anything on my self-evaluation, it is not something that I think I would have naturally done. Interview Student 3
[With interACT] you are then a bit more mindful of what your feedback was from the first one [assignment] to any changes that you might make in the subsequent assessments. I suppose in a way before you would just think about your answer a bit and the feedback that you have got but not writing it and writing it down makes a difference. Interview Student 6
What I really like about it, is that it gives me a chance to request, to maybe direct a little bit the specificity of what kind of information I am going to get … I got quite a bit of specifics [feedback] about that which allowed me to change what I did. I think it gives me the chance to work on what I define. Interview Student 3
I have been able to engage in more of a dialogue with whoever has graded it, which has allowed a little more personality exchange and a little more support, when you feel that there is someone at the other end actually looking at what you are working so hard at and treating you as a person, because the challenge for both of us I think is that we don’t get to have face-to-face so there is no laughing over a cup of coffee. Interview Student 4
For things like style, format and language, I find that I am putting the same thing for all of them because I don’t think that there has been a big change in my style, format and language across the different essays. Interview Student 1
Fifty-four students (11 %) completed the questionnaire. Of these, 46 (85 %) thought that the instructions provided about the assignment submission process (cover page, uploading the marked assignment and reflecting in the journal) were clear. Respondents found that the process of self-review and reflecting on feedback was valuable for their learning (52 %), promoted assessment and feedback dialogue (48 %) and promoted self-evaluation (63 %). Students were evenly split on whether they agreed or disagreed that the process was time-consuming.
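As a sanity check on the rounded percentages above, the counts come from the survey itself (487 invited, 54 respondents, 46 finding the instructions clear); the small helper function is our own illustration, not part of the study's analysis.

```python
# Recomputing the reported descriptive statistics from the raw counts.
invited = 487             # students sent the questionnaire
respondents = 54          # completed questionnaires
clear_instructions = 46   # respondents who found the instructions clear

def pct(part, whole):
    """Whole-number percentage, rounded to the nearest integer."""
    return round(100 * part / whole)

print(pct(respondents, invited))             # response rate
print(pct(clear_instructions, respondents))  # instructions-clear rate
```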
The qualitative comments reiterated the interview findings, with students reporting the value of being able to ask for specific feedback, the opportunity for a final review before submission, the chance to express their views, and the structure given to submission and feedback expectations. Three challenges were reported with the process: one related to size and formatting, another to the repetitive nature of the reflection, and a third to a lack of staff engagement with the students’ comments.
Initially, I probably thought, thought this is going to be a lot of work, and it is a lot of work but I think it’s beneficial to the students so it is money well spent in that sense and I think the student does get a lot out of it. Interview Tutor 3
I have had some students comment this back to me, that it is really good for them to feel that they have a dialogue with the tutor. Interview Tutor 2
Purpose of feedback
Feedback has to go beyond just correcting students’ mistakes. I think it really does have to develop their evaluative judgements capacities. So their ability to evaluate their own work, to seek feedback about it, to monitor and judge and then to develop action plans to address any identified learning needs. So it has to be about helping them to self-regulate their learning. Interview Tutor 1
Feedback should be all about helping the student to identify what they’re doing well, to identify what they need to further develop and to support them in that development. Interview Tutor 2
I think it’s overall positive. What I think is good about them, I think it’s good that it does prompt self-evaluation. I think it’s good that it gets students to think about what sort of feedback they want because that’s what we are training them to do, that’s what they should be doing after they graduate is seeking feedback. That’s what we do to calibrate our performance. Interview Tutor 1
Structure of feedback process
It encourages you to provide lengthy feedback I think and certainly the way in which, because when I in a previous job we had cover sheets but they were tick boxes then we had to make three points for improvement and it was very narrow whereas this gives you much more options to be more personalised with the feedback. Interview Tutor 7
Reflection and learning
It’s certainly improved my feedback in terms of definitely quantity and hopefully quality as well so I’m giving more information now because I’m being constantly prompted. Interview Tutor 3
I also use it for closing the feedback loop so if they tell me something’s wrong or they don’t understand it then I will also use that to say ‘thank you for highlighting that, I have now changed it’. Interview Tutor 2
Yeah definitely I think that [it’s encouraging dialogue], I just even think that the fact that you’re able to go ‘yeah I agree with you’ or ‘well no actually this is better than you think it is’ is a dialogue that you would never have had, it’s all one direction feedback in the past. Interview Tutor 7
We need to include more guidance to the students on how to engage with the cover page … it would help them if they had some examples of good engagement and good self-review because one of the reasons we’re putting this in is to develop their self-review skills, [also] we need faculty training on that about how do we develop their self-review skill. Interview Tutor 2
I would say [I get] less [questions], I think just because the feedback is more thorough. Interview Administrator 2
It’s a lot easier for those ones that I have to do the resubmissions on because the feedback is already incorporated in the attached area. I’m not having to do any copy/pasting from the original. Interview Administrator 1
They also felt that the number of technical queries had sharply declined following the development of the screencasts, streamlining of the process and the introduction of the email alerting students to their assignment being marked.
Several key findings have emanated from this project. First, it is possible to translate educational guidelines and principles to inform practice, specifically for feedback in an online distance learning programme, although this is not without its challenges. Changes in the assessment and feedback processes, and specifically the tools (e.g., the cover page questions), resulted in positive changes in the way students and staff approached feedback. The cover page prompted an added level of evaluation against the assessment criteria and learning objectives and also encouraged students to seek specific feedback. The questions in the reflective journal prompted reflection on feedback, explication of actions for future assignments and student-tutor dialogue. The cover page prompted staff to structure their feedback practices in accordance with the assessment rubric and to tailor feedback to students’ specific requirements. The reflective journal provided instant feedback on what students had understood from the feedback and what remained unclear, hence closing the evaluation loop. Despite the additional time requirement with interACT, staff were keen to see the process embedded into the programme; overall efficiency was managed by reducing the total number of summative assessments (shifting effort from assessment of learning to assessment for learning).
Bloxham and Campbell  used interactive cover pages to promote feedback dialogue between students and tutors. They found that their students were not experienced enough to know what was expected and to initiate meaningful dialogue with their tutors. In our study, some students used the opportunity to seek feedback on specific aspects of their work very well, while others either ignored the question or sought feedback on ‘all of the assignment’. This is an area where students’ assessment literacy may be improved. As with Crimmins et al. , who developed a written, reflective feedback process for first-year undergraduate students, we found the dialogic process supportive of relationship building. Our students talked about ‘personality exchange’ and the opportunity to ‘interact freely’ with staff. This was important for us, as students and staff had reported feeling isolated. Furthermore, the influence of the educational alliance on feedback receptivity is a concept starting to be explored in the literature  and, based on our findings, is clearly something valued by students and staff. Engagement with the feedback in the cover page and reflective journal was a particular strength of the design.
Streamlining the process was key to improving student and staff engagement, highlighting the importance of getting buy-in and evaluation from all stakeholder groups. Despite the increased time it takes for staff to engage with the feedback dialogue, all staff wanted to continue with it. This is due to perceived improvement in their own feedback practices, closing the evaluation loop, perceived value for student learning and increased tutor satisfaction in engaging with learners and seeing their feedback used. The time required by the tutor to engage with interACT has been offset by reducing the overall number of assignments that students complete on the programme. Based on the findings from the staff interviews we would advocate for a further EP to underpin interACT, that the feedback process should be used to improve teaching [1, 2].
Although engagement has improved quantitatively, the project team would like to further improve the quality of students’ assessment literacy, evaluative judgement and feedback-seeking ability. We plan to tackle this through educational activities in the first core module of the programme and through feedback on the learners’ self-evaluations. Engaging in the process of providing peer review is also beneficial for developing evaluative judgement , and ways of embedding peer feedback into a non-cohort online distance learning programme are being investigated. However, the affordance of such a programme, where there are no arbitrary assessment deadlines, is that students can take the time and effort needed to reflect on and address their formulated action points before submitting the next assignment, hence strengthening the feed forward element . To make the most of the process, we plan to introduce a patchwork assessment  in which students identify how they have developed across the programme to meet the exit outcomes of the Certificate and Diploma programmes, using evidence from their reflective journal. In this way, we believe students will gain from further reflection on their feedback dialogues, and it will provide an efficient way to assess the reflective journal.
Strengths, limitations and future research
The strong theoretical underpinning, the multiple methods adopted and the repeated cycles of action research are strengths of this study. Participation in the student questionnaire was low, but this is perhaps not surprising given that our students are busy health professionals studying at a distance. Although enrolled on the course, some students may not yet have submitted any assignments when the questionnaire was released, so the reported response rate is potentially lower than the actual one. Changes in evaluative judgement were not measured as students progressed through the course; this would be an interesting area for future research, as would testing of the interACT process within other programmes. The project team are currently conducting an interactional analysis of the feedback dialogue excerpts to shed light on how these are constructed and negotiated, and how learning is mediated through feedback dialogue in an online context.
Conclusions
InterACT has involved comprehensive re-development of the assessment and feedback processes, using EPs translated from the literature to guide the design while taking stakeholder perspectives into account. These principles, and the strategies used to enact them, should be transferable to other contexts. The action research cycles have enabled further refinement of the interACT process, the development of FAQ sections to assist students, and the collection of examples of areas on which students ask for specific feedback. These examples could be used to provide group feedback to future students, thus increasing tutor efficiency. Evaluation findings have informed faculty development and will continue to be used to improve the interACT process and to support student and faculty development.
Acknowledgements
This work was funded by a Jisc Assessment and Feedback Strand A grant (2011–2014, £125K). The authors would like to acknowledge the contribution of the other members of the project team: Mr Grant Murray, Dr David Walker, Mrs Natalie Lafferty and Dr Lorraine Anderson, and the project Reference Group. We would also like to thank the staff and students from the Centre for Medical Education at the University of Dundee for their engagement with the development and evaluation of the interACT project.
Open Access
This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
References
1. Nicol D, Macfarlane-Dick D. Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Stud High Educ. 2006;31(2):199–218.
2. JISC. Effective assessment in a digital age: a guide to technology-enhanced assessment and feedback. Bristol: JISC; 2010.
3. Boud D and Associates. Assessment 2020: seven propositions for assessment reform in higher education. Sydney: Australian Learning and Teaching Council; 2010.
4. Evans C. Making sense of assessment feedback in higher education. Rev Educ Res. 2013;83(1):70–120.
5. Price M, Carroll J, O’Donovan B, Rust C. If I was going there I wouldn’t start from here: a critical commentary on current assessment practice. Assess Eval Higher Educ. 2010;36(4):479–92.
6. Offerdahl EG, Tomanek D. Changes in instructors’ assessment thinking related to experimentation with new strategies. Assess Eval Higher Educ. 2011;36(7):781–95.
7. Ferrell G. A view of the assessment and feedback landscape: baseline analysis of policy and practice from the JISC Assessment & Feedback programme. 2012.
8. Dawson P, Bearman M, Boud DJ, Hall M, Molloy EK, Bennett S, et al. Assessment might dictate the curriculum, but what dictates assessment? Teach Learn Inquiry: ISSOTL J. 2013;1(1):107–11.
9. Nicol D. From monologue to dialogue: improving written feedback processes in mass higher education. Assess Eval Higher Educ. 2010;35(5):501–17.
10. Price M, Handley K, Millar J. Feedback: focusing attention on engagement. Stud High Educ. 2011;36(8):879–96.
11. Nicol D. Assessment for learner self-regulation: enhancing achievement in the first year using learning technologies. Assess Eval Higher Educ. 2009;34(3):335–52.
12. Ajjawi R, Schofield S, McAleer S, Walker D. Assessment and feedback dialogue in online distance learning. Med Educ. 2013;47(5):527–8.
13. Boud D, Molloy E. Rethinking models of feedback for learning: the challenge of design. Assess Eval Higher Educ. 2013;38(6):698–712.
14. Higgins R, Hartley P, Skelton A. Getting the message across: the problem of communicating assessment feedback. Teach High Educ. 2001;6(2):269–74.
15. Yin RK. Case study research: design and methods. 5th ed. London: Sage; 2013.
16. Carr W, Kemmis S. Becoming critical: education, knowledge, and action research. Lewes: Falmer Press; 1986.
17. Bearman M, Dawson P. Qualitative synthesis and systematic review in health professions education. Med Educ. 2013;47(3):252–60.
18. Ajjawi R, Barton KL, Schofield SJ, Walker D. interACT - Interactive Assessment and Collaboration via Technology: using technology to promote feedback dialogue with online distance learners: baseline report. 2012.
19. Carless D. Differing perceptions in the feedback process. Stud High Educ. 2006;31(2):219–33.
20. Ajjawi R, Rees C. Theories of communication. In: Higgs J, Ajjawi R, McAllister L, Trede F, Loftus S, editors. Communicating in the health sciences. 3rd ed. South Melbourne, VIC: Oxford University Press; 2012. p. 15–23.
21. Hattie J, Timperley H. The power of feedback. Rev Educ Res. 2007;77(1):81–112.
22. Crommelinck M, Anseel F. Understanding and encouraging feedback-seeking behaviour: a literature review. Med Educ. 2013;47(3):232–41.
23. Boud D, Soler R. Sustainable assessment revisited. Assess Eval Higher Educ. 2015:1–14. doi:10.1080/02602938.2015.1018133.
24. Cho K, MacArthur C. Learning by reviewing. J Educ Psychol. 2011;103(1):73–84.
25. Roscoe R, Chi MH. Tutor learning: the role of explaining and responding to questions. Instr Sci. 2008;36(4):321–50.
26. Hamdy H. Blueprinting for the assessment of health care professionals. Clin Teach. 2006;3(3):175–9.
27. Russell M, Bygate D. Assessment for learning: an introduction to the ESCAPE project. Blended Learning in Practice. 2010(March):38–48.
28. Higgins R, Hartley P, Skelton A. The conscientious consumer: reconsidering the role of assessment feedback in student learning. Stud High Educ. 2002;27(1):53–64.
29. Sadler DR. Formative assessment and the design of instructional systems. Instr Sci. 1989;18(2):119.
30. Shute VJ. Focus on formative feedback. Rev Educ Res. 2008;78(1):153–89.
31. Schifferdecker KE, Reed VA. Using mixed methods research in medical education: basic guidelines for researchers. Med Educ. 2009;43(7):637–44.
32. Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Bryman A, Burgess RG, editors. Analysing qualitative data. London: Routledge; 1994. p. 173–94.
33. Bloxham S, Campbell L. Generating dialogue in assessment feedback: exploring the use of interactive cover sheets. Assess Eval Higher Educ. 2010;35(3):291–300.
34. Crimmins G, Nash G, Oprescu F, Liebergreen M, Turley J, Bond R, Dayton J. A written, reflective and dialogic strategy for assessment feedback that can enhance student/teacher relationships. Assess Eval Higher Educ. 2014;41(1):141–53.
35. Telio S, Ajjawi R, Regehr G. The “educational alliance” as a framework for reconceptualizing feedback in medical education. Acad Med. 2015;90(5):609–14.
36. Nicol D, Thomson A, Breslin C. Rethinking feedback practices in higher education: a peer review perspective. Assess Eval Higher Educ. 2013;39(1):102–22.
37. Winter R. Contextualizing the patchwork text: addressing problems of coursework assessment in higher education. Innov Educ Teach Int. 2003;40(2):112–22.
38. Ajjawi R, Boud D. Researching feedback dialogue: an interactional analysis approach. Assess Eval Higher Educ. 2015. doi:10.1080/02602938.2015.1102863.