EEF Blog: “What next?” Generating evidence, scaling-up promising approaches

Emily Yeomans and Stephen Tall look at what happens next when an EEF-funded trial receives a successful report from its independent evaluator.

The EEF has today announced it's awarding almost £4m in grants to six projects which will both generate more evidence and ensure more schools have access to promising approaches to raising the attainment of disadvantaged pupils.

They each have something in common: all six have previously received grants from the EEF for trials to test their impact, with encouraging results. Why, then, are we re-granting for new, bigger trials? Wouldn’t it be better simply to fund their expansion now, rather than put them to a further test?

It’s a fair question, and the answer gets to the heart of the EEF’s approach to helping build the evidence of ‘what works’ in a way that’s genuinely useful for schools.

When the EEF funds the trial of a promising new programme or approach we categorise it under one of four headings:

  • Pilot studies: these are conducted in a small number of schools (eg, three or more), where a programme is at an early or exploratory stage of development. They’re independently evaluated through qualitative research to develop and refine the approach and test its feasibility in schools. Initial, indicative data will be collected to assess its potential to raise attainment.
  • Efficacy trials: these aim to see whether a programme can work under ideal or developer-led conditions in a larger number of schools (eg, 10 or more). They’re evaluated quantitatively to assess impact on pupils’ attainment, with a qualitative and process evaluation helping to understand the elements of effective practice.
  • Effectiveness trials: these aim to test whether a programme can work at scale in a large number of schools (eg, 40 or more), where the developers are no longer the only deliverers. They’re evaluated quantitatively to assess impact on pupils’ attainment, with a process evaluation identifying the challenges and solutions to roll-out. The cost of the intervention at scale will also be calculated.
  • Scale-up: this is when a programme which has been shown to work when rigorously trialled, and has the capacity to deliver at scale, is expanded to work across a larger area delivering to a larger number of schools. Though we will continue to evaluate its impact, this is now a lighter touch process.

The vast majority of the 121 projects we’ve funded to date are either efficacy or effectiveness trials.

All six of the projects announced today were initially efficacy trials – that is, they were first tested under ideal conditions, with close involvement from the developer, to see if the idea could work – which reported positive results of between one and three additional months’ progress for participating pupils, according to their independent evaluators.

So, the answer to the question, “What next?”, is that the EEF is now funding their progression to the next stage, an effectiveness trial.

This means each of the six programmes will now be delivered using a scalable model that has the potential to reach a large number of schools in multiple locations. What we are testing is whether the promising results we saw under ideal conditions persist when the programme is working under more typical, ‘real world’ conditions.

For some projects, the move from efficacy to effectiveness has required altering their delivery model, ensuring that the programme is not reliant on a small number of people delivering all of the training. This is the case, for example, with Science Oxford and their ‘Thinking, Doing, Talking Science’ (TDTS) programme. They will now work with a network of organisations to train TDTS trainers across England. This is a subtle, but important, change, meaning that if the effectiveness trial has a positive result there is the potential for even more schools to benefit in the future.

These six new effectiveness trials will involve over 900 schools and 48,000 pupils and will test one-to-one and whole-class approaches to teaching literacy, affordable Maths tuition, and a more practical and creative way of teaching science. Five of these projects are part of our North East Primary Literacy Campaign, a £10m programme of work to boost literacy levels for disadvantaged pupils, co-funded with Northern Rock Foundation.

Similarly, our batch of new projects last October included five re-grants for projects which had demonstrated promise when first trialled. In that case, one pilot study graduated into an efficacy trial: Visible Classroom 2015. Two others, Switch-on and Catch Up Numeracy 2015, are part of our Teaching Assistant Campaign focused on South and West Yorkshire.

The EEF’s aim is not only to generate evidence (all of which is then summarised within our Teaching and Learning Toolkit and its Early Years companion) – at least as importantly, we also want to ensure that evidence can be put to use within schools. That means supporting programmes with good evidence of positive impact to expand their reach: to find out if they can be delivered in more schools in a way which both maintains impact and is cost-effective.