EEF Blog: Generating evidence - how we’re getting to grips with the complexities of educational evaluation

Today, 9 June, the EEF is holding our annual evaluators' conference in London. Here, our head of evaluation, Triin Edovald, explores some of the issues and themes that will be addressed by the participants at this year's event.

The EEF has commissioned 139 evaluations of high-potential projects involving nearly 900,000 pupils across schools since its inception in 2011. This is a major achievement on the journey to raising the attainment of the poorest pupils across the country.

These rigorous evaluations are partnerships: between our grantees, who bring us their ideas to test out; the schools, colleges and nurseries which volunteer to try them out; and our panel of independent evaluators, whose pioneering use of randomised controlled trials (RCTs), with a linked implementation and process evaluation to understand the elements of successful intervention delivery, has generated such significant evidence on ‘what works’ to inform teachers and senior leaders.

Thanks to this collective endeavour bridging research and practice, the EEF is funding more RCTs in education than any other organisation globally. Indeed, in its first five years, the EEF has more than doubled the amount of available evidence from trials in education in this country, and has commissioned more than 10 per cent of all known trials in education around the world.

Today’s conference: Its aims

The EEF has held an annual conference for its evaluators since 2013. We are pleased to hold our fifth conference today, Friday 9th June, in London. Building on the success of our previous conferences, the event will bring together our evaluators to:

  • learn about the communication of evidence to non-specialist audiences;
  • explore opportunities to use the EEF's Data Archive for further research;
  • engage in wide-ranging discussions on evaluation challenges, including testing, analysis and reporting; and
  • meet a community of peers to learn from and share experiences with.

This year’s keynote presentation will be delivered by Michael Blastland – a freelance writer and broadcaster best known in statistical circles for starting More or Less on BBC Radio 4 – on the subject of communicating evidence to non-specialists.

Whole school and leadership programmes: Ways to test ‘what works’

The conference features a workshop on the evaluation of complex whole school and leadership programmes, guided by the forthcoming review on the topic undertaken by a group of researchers from the UCL Institute of Education, the Behavioural Insights Team and Education Datalab.

The driving force behind this review has been the fact that, to date, the EEF has found it difficult to evaluate the impact of whole school approaches. This is mainly because such approaches don’t lend themselves easily to RCTs – the more complex an intervention, and the longer the period over which its outcomes are to be measured, the more challenging an RCT is to run.

However, effective leadership is likely to be a key element of successful whole school approaches. As one of our aims is to actively encourage more high-quality applications focused on leadership, we want to learn more about the best ways to evaluate their whole school impact.

The workshop will focus on two key elements of the upcoming review: multi-stage evaluation protocols and quasi-experimental designs (QEDs) in impact evaluation. Evaluators will get a chance to learn how such protocols can help to address the issue of emergent outcomes in complex whole school interventions. There will also be an opportunity to find out more about the potential of QEDs to evaluate such interventions in those instances where RCTs are not feasible.

We hope the conference will be a day of learning and sharing, as we get to grips with the complexities of evaluating, robustly and fairly, some of the best and most innovative ideas of those working at the front-line in our schools, nurseries and colleges.