EEF Research Papers

The EEF Research Paper Series is a collection of open access research papers relevant to EEF evaluations and the wider research community.

The papers are not externally peer-reviewed. They are published electronically and are freely available online or through email distribution.

For each paper, we publish an 'EEF response', briefly setting out how we are using these findings to inform our future work.


Learning About Culture: Overarching Evaluators' Report

Authors: Jake Anders, Nikki Shure, Dominic Wyse (UCL Institute of Education); Kimberly Bohling, Alex Sutherland, Matthew Barnard, Johanna Frerichs (Behavioural Insights Team)

September 2021

This report sets out findings and lessons learned from an ambitious and innovative programme of work funded by the Education Endowment Foundation (EEF), the Royal Society of Arts (RSA), Arts Council England (ACE) and the Paul Hamlyn Foundation (PHF).

At the core of the 'Learning About Culture' programme are five school-based randomised controlled trials of arts-based education interventions delivered over four years, involving around 8,500 children in 400 state schools across England. These trials represent the biggest study of its kind ever undertaken in the UK and provide much-needed insight into both what works and how it works.

The projects have been independently evaluated by a collaboration between the UCL Institute of Education and the Behavioural Insights Team, which worked with the developer teams, the EEF and the RSA to design trials that would maximise what we can learn from the individual studies and reveal overarching findings from the trials relevant to evaluators, funders, arts-based education organisations and schools.


Review of EEF Projects

Authors: Sean Demack, Bronwen Maxwell, Mike Coldwell, Anna Stevens, Claire Wolstenholme, Sarah Reaney-Wood, Bernadette Stiell, Hugues Lortie-Forgues.

August 2021

This report presents findings from exploratory, descriptive meta-analyses of effect sizes reported by the first 82 EEF evaluations that used a Randomised Controlled Trial (RCT) or Clustered RCT impact evaluation design and were published up to January 2019, together with trial-level descriptive analyses of intervention cost-effectiveness and overall pupil-level attrition. The review used a theoretical framework derived from the literature, with five overarching themes to group explanatory variables: Intervention Characteristics; Implementation & Fidelity; Theory & Evidence; Context; and Evaluation Design. It is not appropriate to draw causal conclusions from the results of this review, since they are based on bivariate analyses. However, the review presents associations that may be a starting point for further investigation, and an innovative review framework that can support future research.

The publications include a full report, a short summary report and a technical annex.


EEF Implementation and Process Evaluation (IPE) Quality Pilot

Authors: Sean Demack, Bronwen Maxwell, Mike Coldwell, Anna Stevens, Claire Wolstenholme, Sarah Reaney-Wood, Bernadette Stiell, Hugues Lortie-Forgues.

August 2021

The report presents an Implementation and Process Evaluation (IPE) quality measure and pilots it using data coded in the review of EEF projects (Demack et al., 2021) from the 79 EEF trial reports that included IPEs and had been published up to January 2019.


Scale-up of EEF efficacy to effectiveness trials

Authors: Sean Demack, Bronwen Maxwell, Mike Coldwell, Anna Stevens, Claire Wolstenholme, Sarah Reaney-Wood, Bernadette Stiell, Hugues Lortie-Forgues.

August 2021

The aim of this review was to support EEF, other funders, developers and deliverers in scaling up interventions, by comparing effect sizes at the efficacy and effectiveness stages for the EEF effectiveness trials completed by 2019, and by conducting a qualitative analysis of interviews with developers and deliverers who have been involved in scaling up from an EEF efficacy trial to an effectiveness trial. The review discusses approaches and programme features that either support or hinder intervention scale-up to large numbers of schools.


Guidance for evaluators and funders on using GCSE performance as an outcome measure

Authors: Ben Smith, Stephen Morris and Harry Armitage

July 2021

This guidance is intended for the planning stage of trials – in particular, trials for which the primary outcome measure being considered is attainment at GCSE (especially GCSE English Language and Mathematics).


The effects of using examination grade as a primary outcome in education trials to evaluate school-based interventions

Authors: Ben Smith, Stephen Morris and Harry Armitage

July 2021

This paper aims to assess the impact of using GCSE grades as a primary outcome in educational evaluations and trials, compared to using marks.


Can we replicate the findings of EEF trials using school level comparative interrupted time series evaluations? Non-technical report

Authors: Sam Sims, Jake Anders, and Laura Zieger

June 2021

This report focuses on whether one particular non-experimental method can reproduce the results from experimental evaluations: the comparative interrupted time series (CITS) design. The basic idea is to compare the way in which outcomes in the treatment group deviate from trend after an intervention is introduced, relative to the way in which outcomes in the control group deviate from trend at the same point in time. Under certain assumptions, the difference between these deviations can be interpreted as the effect of the intervention.
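As a rough sketch of this logic (the notation here is ours, not taken from the report): write Y_T,post and Y_C,post for the observed post-intervention outcomes of the treatment and control groups, and write Ŷ_T,post and Ŷ_C,post for the values predicted by extrapolating each group's pre-intervention trend. Under the design's assumptions, the CITS estimate of the intervention effect is

    effect ≈ (Y_T,post − Ŷ_T,post) − (Y_C,post − Ŷ_C,post),

that is, the treatment group's deviation from its own pre-intervention trend minus the control group's deviation from its trend over the same period.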


Individual participant data meta-analysis of the impact of EEF trials on the educational attainment of pupils on Free School Meals: 2011–2019

Authors: Bilal Ashraf, Akansha Singh, Germaine Uwimpuhwe, Tahani Coolen-Maturi, Jochen Einbeck, Steve Higgins and Adetayo Kasim

May 2021

EEF response: This study investigates the impact of EEF-funded trials on pupils eligible for free school meals. Although a similar analysis is conducted within each individual evaluation, this report conducts a meta-analysis using data from 88 trials and over half a million pupils. The report contributes to the evidence about which types of approach may be effective at reducing the attainment gap, which in turn informs the choice of interventions that the EEF chooses to trial and scale up.


Developing a behavioural approach to knowledge mobilisation: Reflections for the What Works Network

Authors: Stephanie Waddell, Prof Jonathan Sharples

April 2020

In this What Works Network Strategic Fund project, we aimed to develop and pilot an approach to mobilising research evidence that was informed by the behavioural needs of users. It was based on the premise that by understanding the current state of practice, in addition to the current state of the evidence base, What Works centres could better address the gaps between the two. It focused on mobilising a joint piece of evidence-based guidance from the Early Intervention Foundation (EIF) and the Education Endowment Foundation (EEF) on social and emotional learning (SEL).


GCSE science as an outcome measure: the capacity of the Deeper Thinking intervention to improve GCSE science grades

Authors: Ben Smith, Andrew Boyle, Stephen Morris

EEF Research Paper Series, No. 004, February 2020

EEF response: National test results, such as GCSE grades, are an outcome measure for numerous EEF evaluations. However, any intervention aiming to improve grades must operate by improving students’ marks. This paper examines the impact an intervention could have on students’ marks in their GCSE Science exams, and the implications for a trial’s minimum detectable effect size (MDES).

In a first step, the authors scrutinised GCSE papers to determine how many marks could plausibly be gained due to the intervention. Mark distributions were then simulated to assess what mark gain was likely in practice. Finally, implications for MDES and sample size calculations were examined. 
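As a purely illustrative example of the final step (the numbers here are invented, not taken from the paper): if the standard deviation of marks on a GCSE Science paper were around 20 and the plausible mark gain from an intervention around 3 marks, the corresponding standardised effect size would be roughly d = 3/20 = 0.15. A simple two-arm, individually randomised trial powered at 80% with 5% significance to detect d = 0.15 would then need on the order of 2 × (2.8 / 0.15)² ≈ 700 pupils per arm, before any allowance for clustering, attrition or covariate adjustment; smaller plausible mark gains push the required sample size up rapidly.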

This work was conducted as part of the Deeper Thinking project and can inform MDES and sample size calculations for future EEF evaluations by considering impact in terms of marks.


Teachers' engagement with research: what do we know? A research briefing

Authors: Matt Walker, Julie Nelson, Sally Bradshaw, Chris Brown 

Research Brief, May 2019

This research briefing summarises findings from a nationally representative survey of schools and teachers, which investigated teachers’ research use. The survey was designed with reference to the principles adopted in an earlier 2014 study in which the National Foundation for Educational Research (NFER) and the Education Endowment Foundation (EEF) developed a Research Use Survey (RUS) (Nelson et al., 2017). The new survey took the most effective elements of the RUS, and augmented these with recent knowledge about research engagement and use. The findings are based on survey results from 1,670 teachers in England. The survey was administered between 19 September and 12 November 2017. 


Does the classroom level matter in the design of educational trials? A theoretical & empirical review.

Author: Sean Demack

EEF Research Paper Series, No. 003, May 2019

EEF response: This study examines some of the theoretical and empirical implications of accounting for the way pupils are naturally organised into ‘classes’, which are in turn clustered within schools. The EEF recognises the contribution made by this study and invites evaluators to consider accounting for ‘class’-level clustering in their proposed designs. The paper suggests the cases in which doing so may be most relevant. Because only a limited number of studies were included in this review, additional evidence is needed.
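A hedged illustration of why the class level can matter (the notation and model are ours, not taken from the paper): under a simple three-level variance-components model, in a school-randomised trial with c classes of m pupils per school, and intra-cluster correlations ρ_school at the school level and ρ_class at the class level, the design effect on the variance of the estimated impact is approximately

    DEFF ≈ 1 + (cm − 1)ρ_school + (m − 1)ρ_class.

A two-level design that ignores the class term therefore understates the variance (and overstates the effective sample size) whenever ρ_class is non-trivial, which is one reason for inviting evaluators to consider the class level in their designs.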


Properties of commercial tests in the EEF

Authors: Rebecca Allen, John Jerrim, Meenakshi Parameshwaran, Dave Thompson

EEF Research Paper Series, No. 001, February 2018

EEF response: This study analyses some of the properties of major commercial assessments used by the EEF in large trials. We have used the results of this study to update some of the assumptions fed into sample size calculations and inform the choice of appropriate and cost-effective commercial assessments to be used as pre-tests. 


Standard Deviation as an outcome of interventions: a methodological investigation

Authors: Peter Tymms, Adetayo Kasim

EEF Research Paper Series, No. 002, February 2018

EEF response: In line with its goal of narrowing the attainment gap in England, the EEF is considering methods to monitor the differential effects that interventions might have on pupils of varying levels of ability. A change in the dispersion of outcomes of the intervention group would be indicative of differential effects. Based on the results of this study, the EEF updated its Analysis Guidance to request evaluators to report the standard deviations by trial arm before and after the intervention. 


Measuring Teachers' Research Engagement: Findings from a pilot study

Authors: Julie Nelson, Palak Mehta, Jonathan Sharples, Calum Davey

Report and Executive Summary, March 2017

Despite recent policies to support evidence-informed teaching, and a number of important practical developments, we still do not know a great deal about the current extent or depth of evidence-informed practice across schools in England. This paper presents findings from a survey co-developed by the National Foundation for Educational Research (NFER) and the Education Endowment Foundation (EEF), which captured information about this issue in late 2014.

The survey was developed to provide a measure of research engagement across a series of projects, funded by the EEF, which aim to increase schools’ awareness, and use, of research evidence. The survey has also informed the EEF’s overall approach to scaling-up and mobilising evidence – a key priority for the organisation in the second five years of its life.

It suggests that, at that point, academic research was having only a small to moderate influence on decision-making relative to other sources, despite teachers generally reporting a positive disposition towards research. Additionally, it suggests that this positive disposition towards research, and perceptions of research engagement, were not necessarily translating into an increased conceptual understanding of research knowledge.