EEF Blog: How the EEF identifies our Promising Projects (and what we do next)

Knowing which research results to take note of can be difficult for busy school leaders. New evidence is published all the time and, as well as the overwhelming volume, it can be hard to judge the relative quality of different programmes. In this blog, Emily Yeomans, our head of programmes strategy, explores how the EEF signposts our most interesting and secure findings to teachers, highlights three new programmes just added to our Promising Projects list, and discusses what we can learn from their key findings.

How do we generate evidence about programmes?

Schools often choose to implement an external programme to help improve a specific aspect of teaching and learning in their school. One of the EEF’s key aims is to ensure that senior leaders have good-quality evidence on which to base these decisions. Since 2011, we have funded independent evaluations of more than 190 high-potential projects, aiming to learn more about whether each programme holds promise for improving outcomes for young people.

How do we signpost to the most promising programmes?

Programmes which have shown initial promise when trialled by the EEF are added to our Promising Projects list. Currently, it includes 19 projects.

This list provides teachers and headteachers with a good starting point for selecting programmes that they might want to try in their schools. All of the projects on this list have demonstrated the potential to improve attainment for young people cost-effectively when independently and robustly evaluated through a randomised controlled trial (RCT).

What makes a programme promising?

For a programme to be included as an EEF Promising Project, it needs to satisfy three key criteria:

  1. Impact: the programme must have secured at least one month’s additional progress for participating young people compared to the control group of pupils (i.e., an effect size greater than 0.05 standard deviations);
  2. Cost: this positive impact should have been delivered cost effectively (which we define as meaning the programme costs less than £80 per pupil for each additional month’s progress); and
  3. Evidence: our trial needs to have achieved an EEF security rating of at least 3 ‘padlocks’ out of 5, meaning it has at least moderate security.

Which new EEF projects are being added to the list?

Three new programmes are being added to our Promising Projects list today:

  • 1stClass@Number – a programme which provides intensive support for pupils struggling with maths, delivered over 10 weeks by teaching assistants. In our trial, pupils in Year 2 who received the programme made +2 months’ additional progress.
  • onebillion – a programme consisting of two apps that are designed to support the acquisition of basic mathematical skills for pupils aged 3-6, delivered over 12 weeks and monitored by teaching assistants. In our trial, pupils in Year 1 made +3 months’ additional progress.
  • Improving Working Memory – a programme which aims to improve the working memory of children who were behind the class average in numeracy, delivered over 10 weeks by teaching assistants. In our trial, pupils in Year 3 made +3 months’ additional progress.

What general lessons can we learn from these three programmes?

These new additions to our Promising Projects list share a trio of interesting similarities:

  • all three are catch-up maths interventions for younger pupils struggling with numeracy;
  • all three are targeted and short in duration; and
  • all three are well-structured with a clear delivery model, led or overseen by teaching assistants.

As a batch of projects, therefore, they provide a useful insight into what might work well for pupils struggling with numeracy in Key Stage 1 and at the start of Key Stage 2.

They also further bolster the growing evidence base supporting the use of teaching assistants to deliver high-quality, structured interventions to pupils who are falling behind. 

The EEF’s guidance report, Making Best Use of Teaching Assistants, provides clear and actionable recommendations on how to put this evidence into practice. 

What should schools be aware of if choosing to implement these programmes?

We believe that, based on the current evidence, each of these three new Promising Projects should be of interest to primary schools looking to improve maths outcomes for their struggling pupils.

As ever, though, we also stress the following caution: there is no guarantee that the positive attainment impact reported by the independent evaluator in these EEF trials will necessarily be repeated in your school.

We recommend, therefore, that if you want your school to achieve similar outcomes when adopting any of these programmes you should consider:

  • Is your school ready to implement its approach with fidelity (i.e., as designed by the programme developer)?

The effects seen in our trials are the result of schools implementing the programme as well as they possibly can and following the necessary training. It’s important, therefore, to read the information in the evaluation report about what is needed to implement the programme effectively. Ask yourself (honestly!): What training will my teachers or teaching assistants need? Will we be able to provide pupils with the necessary resources, or setting, to undertake the intervention (for example a quiet area to work in small groups)? 

Help for senior leaders on how to prepare for and embed a new approach successfully can be found in our guidance report, Putting Evidence to Work: A School’s Guide to Implementation.

  • Can your school monitor the impact of the programme on free school meal (FSM) eligible pupils?

Two of the programmes showed smaller positive impacts for FSM-eligible pupils than for all pupils, and one project, onebillion, suggested a negative impact for these pupils in our trial. It’s important to stress that these findings are necessarily more tentative, as FSM-eligible pupils are a smaller sub-group of the total pupil population in our trials. Nonetheless, we report these results to ensure transparency, and we think it is important for schools to be aware of potential differential impacts for FSM-eligible pupils.

We explored this issue in greater depth in Jonathan Kay’s recent blog post, ‘Can a programme that works also widen the attainment gap?’

Schools choosing to adopt any of these programmes should, therefore, think carefully about how they will correctly identify those FSM-eligible pupils who are most likely to benefit from the programme and, then, how they will monitor these pupils’ progress during the programme.

The EEF intends to test the impact of all three programmes at a larger scale. One focus of doing so will be to investigate the impact of the programmes on a larger group of FSM-eligible pupils, so that we can be confident these programmes do not inadvertently widen the attainment gap.


Our Promising Projects list is, we hope, a good place for senior leaders to start when considering purchasing a new programme to improve pupil outcomes. 

However, there is still a need to plan its implementation carefully and to monitor the impact it has on your pupils, particularly those from disadvantaged backgrounds. By doing so, you are much more likely to achieve a similar positive impact to that seen in the EEF’s trials.