Funding research to improve science education - what have we learnt?
The EEF has worked with the Wellcome Trust since 2012 to improve science education. Here, Ben Simuyandi and guest author Asimina Vergou reflect on some of the lessons that we - and project developers - have learnt.
When the EEF funds a trial, people sometimes focus on whether the intervention ‘works’ or does not ‘work’. But a different way to think about a trial is as part of a process.
This process starts when educators or researchers identify an issue. They think about interventions that might address the issue. The work that we fund is one way to learn whether every part of an intervention works as expected, and what needs to be developed further.
The EEF and Wellcome have jointly funded work in three areas. First, how to use neuroscience in education. Second, how to close the science attainment gap. And third, how to improve science teacher retention.
We received around 200 proposals for approaches and interventions that could tackle these challenges, from which we funded 14. We aimed to trial approaches that were ready to be evaluated using a Randomised Controlled Trial (RCT). But we also wanted to look at promising approaches that might need further development before they were ready for a trial.
In this blog we discuss some of our lessons about how to design and trial an education research project. We also report on how some of the developers have used the findings of the trials.
Trialling at small scale to build the evidence base
Our work reinforced the value of funding pilots. Pilots offer many benefits: they can help a team set out why they believe an intervention works, and test whether there is any evidence to support this. We can test promising ideas to build evidence about an approach and to see whether those ideas are ready to be tried in more schools.
Together we funded six pilots. The evaluations found that some of the piloted ideas needed more work. Sometimes this was because the pilots were too similar to existing school practice. For example, evaluators suggested Deeper Thinking and the Science Self-Testing Toolkit may not be different enough from current practice to improve learning outcomes.
Researchers and educators that we funded spoke about the value of piloting. Teams can learn much from a pilot, even if it does not immediately lead to a larger trial. Later in the blog we will give examples of how some of the teams have used these findings.
Things to consider when working with more schools
If a project has good existing evidence or learns enough from a pilot, then it may be suitable to trial in more schools. Two pilots, Fit to Study and Spaced Learning, were developed into larger trials. The step up from a pilot to an RCT means moving from working in a few schools to working in 40 or 50 schools. This transition can lead to some challenges that project developers and funders need to consider.
One issue is whether enough schools are willing to try the project. For most trials, the delivery and evaluation teams were able to recruit enough schools through their networks. But there are some issues to consider. Teensleep originally struggled to recruit schools because it required them to change their timetables, even though schools were interested in sleep science. The agile team was able to redesign its work and run a pilot. Enough schools were recruited to trial Fit to Study, but design details were not shared with schools until close to delivery, and some dropped out because they were not prepared to carry out what was asked.
Teams must decide what level of monitoring and support they can provide to schools. In pilots, teams can work closely with schools, which helps them tackle issues as they come up. This is harder with more schools. Teams face the trade-offs they would encounter when delivering at scale, which is the eventual aim of many interventions: how much training, guidance and support to provide, for example.
For example, the ASCENTS evaluation found that clear information and consistent training contributed to its success. On the other hand, the Sci-napse evaluation found that some teachers did not receive the initial training, which was a barrier to fidelity across the participating schools. In addition, the evaluations of Stop and Think, Sci-napse and Fit to Study found that teachers did not complete lesson activities as expected: teachers found it difficult to change their lesson structure to accommodate these interventions. Delivery teams may have been able to respond to this if they had provided more support, but this support would have had a cost.
What happened next?
Pilots and trials are part of a learning process. Their legacy includes changing how educators and researchers think and act. They may also inspire new education and research projects.
This influencing work can be done in many ways. Grantees have run dissemination events for schools and for teachers across the country. Several grantees have written about what they have learned in peer-reviewed papers and book chapters. Some have presented at academic conferences.
Where interventions showed promise, grantees considered testing them at a larger scale. For example, the Project Unlocke (now called Stop and Think) team are redeveloping their software based on feedback. When this is complete they plan to run an effectiveness trial. The development work and trial are being funded by the EEF.
Interventions that did not show immediate impact or had delivery issues have provided useful insights. Lessons from these trials have been used in new projects. For example, the Teensleep team ran further sleep health studies in schools. Fit to Study results have informed several projects, including work on the impact of physical activity on academic performance in primary school.
The projects had a profound impact on the project developers. Researchers emphasised to us the need to work with teachers to understand how neuroscience can inform their classroom practice. Asimina has written separately about working with teachers.
Epilogue: the role of the funders
The role of the funders is critical in supporting the legacy and sustainability of the projects' findings.
For the EEF and Wellcome, this role took different forms. We organised a learning event to bring the grantees together to discuss their key lessons, supported further dissemination activities, and held further discussions about how to support and scale projects that showed evidence of promise.
After all, the role of the funder should go beyond the traditional model of providing funds to that of an advocate and communicator who enables the sharing of learning and stimulates reflection.
In joining forces, our ambition was not only to increase the available funding but also to combine our networks, expertise and dissemination capacities to produce useful evidence in the fields of science education, and of education and neuroscience.