Anglican Schools Partnership Effective Feedback

This pilot project focused on improving teachers’ understanding and use of effective feedback. Participating teachers incorporated feedback into their lessons to help pupils understand their learning goals and develop strategies to reach them. The project employed a cyclical action research design, through which teachers reviewed academic literature on effective feedback before developing ways to apply it in the classroom. The project took place over one school year and involved nine treatment and five comparator schools in the London Borough of Bexley. All pupils in Years 2-6 took part in the study.

Existing international research suggests that improving the quality of feedback in the classroom has the potential to improve learning significantly. However, studies also highlight the difficulties in improving feedback in practice, and there are few clear examples of how to improve feedback in English schools. This project sought to develop a way of improving feedback led by schools.

The pilot evaluation had three aims. First, to assess the feasibility and promise of an approach to improving feedback which required schools to review, understand and apply research findings, including academic papers. Second, to provide formative recommendations that could be used to improve the approach in the future. Third, to provide an initial quantitative assessment of the approach’s impact on academic attainment that could be used to inform any future trial.

Key Conclusions

The following conclusions summarise the project outcomes:

  1. Effective feedback has shown promise in previous studies, but this evaluation demonstrates that improving feedback consistently is challenging.

  2. The approach appeared to be most effective when training was communal and when objectives and methods were shared. It was least successful when teachers were unclear about the differences between different types of feedback, and when pupils were unable to set clear success criteria.

  3. Teachers often struggled to interpret, understand and apply findings from academic research.

  4. The study did not seek to assess impact on attainment in a robust way. However, the attainment data that were collected indicated some evidence of promise for pupils eligible for free school meals.

  5. One future step may be to develop the intervention into a more structured programme targeted specifically at low-achieving pupils and pupils eligible for free school meals. Greater support, including videos of model lessons, could be provided to participating teachers.

What is the impact?

The approach is feasible and there are some indications of promise. All nine schools completed the action research programme and at the end of the year many staff were receptive and enthusiastic about the approach. A number of good lessons with clear use of feedback strategies were observed. However, in common with existing studies on feedback, there was wide variation in the way that strategies to improve feedback were used.

Many teachers found it difficult to understand the academic research papers which set out the principles of effective feedback and distinguished between different types of feedback. For example, the literature on feedback draws an essential distinction between feedback targeted at the self (‘Great sentence; you are a superstar!’) and feedback which promotes self-regulation and independent learning (‘You have learned some adverbs today. Check if you could add some adverbs to improve your sentences.’). However, it was not clear in observed lessons that this distinction was consistently understood. Some teachers initially believed that the programme was unnecessary as they already used feedback effectively.

The pilot produced valuable formative information for a potential future project. In order to improve the consistency of the approach employed it is recommended that staff be provided with a large number of examples illustrating the variety of types of feedback. Video recordings of effective lessons could be used as a training resource. This approach would be likely to be more successful than one which required teachers to work from undigested evidence reports. The process evaluation also identified the need for more differentiation in the use of feedback, and a clearer explanation of the use of success criteria in lessons.

The estimated impact findings showed no difference between the intervention schools and the other primary schools in Bexley in terms of annual progress towards Level 4 at Key Stage 2 or in terms of value-added progress scores. However, due to the non-random nature of the comparison and the small number of schools involved, it is difficult to draw conclusions from this. The results should not be confused with those of a full trial. Pupils eligible for free school meals made more progress in participating schools than in comparison schools. However, these findings are based on much smaller numbers, so even greater caution is required.

Is the approach feasible? Yes. All schools completed the project.
Is there evidence of promise? Mixed. Many good lessons were observed, but teachers struggled to understand and use the evidence on feedback consistently across all schools.
Is the approach ready for a full trial? No. Further development is required to refine the approach and provide more support to make research accessible to teachers.

How secure is the finding?

The pilot was a large-scale and in-depth study of 2,000 children receiving the intervention in Years 2 to 6 in nine primary schools. The intervention also took place in the one secondary school in the same partnership, but its results are not part of this evaluation. A further 1,000 pupils in five schools acted as a partially matched comparator group. The process evaluation formed the bulk of the fieldwork, with the aim of providing formative evidence on all phases and aspects of the intervention, from cascading the training to evaluating the outcomes. Additional data were collected through observations, interviews with staff and researchers, and focus groups and a brief survey with pupils.

The impact study had a ‘before and after’ design, measuring the gains made in Key Stage scores using teacher assessment scores. Comparisons were made with results in five other local schools identified by the project lead; therefore the results must not be mistaken for those of a trial. The school-based research approach meant that causal influences could not be robustly identified; the quantitative component of the study primarily sought to provide an estimated effect size that could inform the design of future trials.


How much does it cost?

This is a whole-school intervention, involving 10 schools and around 4,000 pupils at a cost of around £88,000. The cost per pupil is approximately £22. This estimate includes the cost of delivering the intervention to nine primaries and to one secondary school not involved in the evaluation.
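The per-pupil figure follows directly from dividing the reported total cost by the number of pupils; a minimal check of the arithmetic, using only the figures stated above:

```python
# Per-pupil cost check using the figures reported in this section.
total_cost_gbp = 88_000   # approximate total project cost
num_pupils = 4_000        # approximate pupils across the 10 schools

cost_per_pupil = total_cost_gbp / num_pupils
print(cost_per_pupil)     # 22.0, consistent with the "approximately £22" reported
```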