Future Foundations Summer School
The Future Foundations Society CIC (Future Foundations) summer school programme is a literacy and numeracy catch-up intervention which provided extra schooling in the summer holidays. Pupils attending the four-week programme followed a specially designed curriculum involving regular literacy and numeracy lessons taught by trained primary and secondary school teachers. Lessons were supported by mentors and peer-mentors and were generally conducted in small teaching groups. Each afternoon, pupils participated in a variety of sports and enrichment activities. The programme took place in the summer of 2013 across three sites in London and the South East: Brighton, Enfield and Islington. It was targeted at pupils in Years 5 and 6 who were eligible for free school meals (FSM) and at pupils who had not achieved Level 4 in English or maths at the end of Key Stage 2.
In 2012, Future Foundations developed, organised and piloted the summer school on one site, in Enfield. Recommendations from the formative evaluation of this pilot, also funded by the Education Endowment Foundation, informed the development of the 2013 summer school.
A four-week academic summer school for Year 5 and 6 pupils.
The following conclusions summarise the project outcomes:
Attracting pupils to the summer school, and maintaining high attendance throughout the programme, was challenging.
As a result of the trial’s eventual size and the level of pupil dropout, the findings on the programme’s overall impact on English and maths are not definitive.
However, there is evidence of promise for English, particularly for FSM-eligible and Year 5 pupils, which may warrant further study.
The programme was relatively expensive. As a way of improving academic outcomes, alternative approaches delivered during the school year may provide similar benefits at a lower cost.
Future evaluations could explore whether apparent gains for progress in English continue into the secondary phase.
What is the impact?
The overall result on English outcomes was an effect size of +0.17. This can be interpreted as suggesting that, on average, pupils receiving the intervention would make approximately two additional months’ progress over the course of a year compared to similar pupils who did not. This is similar to pupils’ normal rate of progress in term time.
The overall result on maths outcomes was an effect size of 0. This can be interpreted as suggesting that, on average, pupils receiving the intervention would make no additional progress over the course of a year compared to similar pupils who did not.
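The report does not state how effect sizes are converted into estimated months’ progress. The sketch below uses a simple rule of thumb, months ≈ round(12 × effect size), which happens to reproduce the figures quoted here; this is an illustrative assumption, not the official EEF conversion table.

```python
# Illustrative conversion from effect size to "additional months' progress".
# The rule months = round(12 * d) is an assumption that reproduces the
# figures quoted in this report; it is NOT the official EEF conversion table.

def months_progress(effect_size: float) -> int:
    """Approximate additional months' progress for a given effect size."""
    return round(12 * effect_size)

# Headline figures quoted in the report:
print(months_progress(0.17))  # overall English: 2 months
print(months_progress(0.22))  # boys, English: 3 months
print(months_progress(0.09))  # girls, English: 1 month
print(months_progress(0.00))  # overall maths: 0 months
```

On this reading, the two-month English gain corresponds to an effect size a little under one fifth of a standard deviation.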
The evaluation identified different results for specific groups of pupils, though conclusions about groups of pupils are necessarily more tentative than the overall finding. Positive benefits were suggested in English for pupils eligible for FSM, who made two additional months’ progress on average, and for pupils in Year 5. In English, boys also appeared to benefit from the programme more than girls, making three additional months’ progress compared to one additional month’s progress for girls.
However, the intervention also appeared to have a negative impact on pupils eligible for FSM in maths. Understanding this negative impact is challenging, but it is possible that the teaching of maths was of a lower quality than that of English, leading to poor behaviour and disengagement, or that English requires more continuous work whereas maths is more resistant to summer learning loss.
The programme was implemented successfully on all three sites. Despite considerable efforts from the developers, a significantly smaller number of pupils attended the school than had been hoped for, with fewer than half of the target number of pupils signing up for the programme. A number of pupils also dropped out once the programme had started. These challenges were particularly apparent in the numbers recruited outside of London (only 19% of target in Brighton compared to 34% of target in Islington and 44% in Edmonton), suggesting that there may be problems were the programme to be rolled out, particularly to less densely populated areas.
It is clear that many pupils enjoyed their time at the summer school, and it is possible that the programme led to non-academic benefits for the pupils and their families. The programme was popular with parents, who appreciated the free provision of academic and enrichment activities over the summer holidays.
| Group | Number of pupils | Effect size | Estimated months' progress | Evidence strength |
|---|---|---|---|---|
| Boys only, English | 171 | +0.22 | +3 | N/A |
| Girls only, English | 139 | +0.09 | +1 | N/A |
| Boys only, maths | 167 | +0.01 | 0 | N/A |
| Girls only, maths | 139 | -0.05 | -1 | N/A |
How secure is the finding?
The evaluation was set up as a small-scale efficacy trial to test the impact of the summer school. Efficacy trials seek to test interventions in the best possible conditions to see if they hold promise, but do not seek to demonstrate that the findings hold at scale in all settings. To test this question, a future evaluation run on a larger scale in a wider variety of areas could be conducted.
The findings are based on a randomised controlled trial, with individual random allocation of pupils either to the summer school or to a control group which did not receive the intervention. The study was designed to involve 1,000 pupils. However, the problems with attracting pupils and keeping them in the project meant that the results are substantially weakened. The pool of pupils who fitted the original eligibility criteria for the programme proved too small; once this group had been exhausted, recruitment was extended to a wider range of schools.
Ultimately, 435 pupils signed up to the trial, which decreased the statistical power of the study, while the high levels of dropout increased the risk that the findings are biased, as the pupils who attended the summer school may have been systematically different from those who did not. As a result, the findings cannot be taken as definitive. This study suggests that it is challenging to assess summer schools using randomised designs, as many families are unwilling to wait to know whether their child will be selected to attend.
Though caution is essential, there is some promise from the results for progress in English that might be worth pursuing in the future. The headline finding for English has been confirmed by a number of alternative analyses, including regression modelling and a post-test-only analysis, which strengthens the case for further work in this area. This short-term evaluation does not assess the long-term impact of attending the summer school, though long-term outcomes will be measured through the National Pupil Database in the future.
The pre-existing evidence on the impact of summer schools is very mixed. A number of studies have been conducted, most commonly in the US, but these have often been methodologically weak, meaning that further study in this area is required.
How much does it cost?
The cost of the approach is estimated at £1,370 per pupil. This estimate includes administration, resources and activities (estimated at £350 per pupil), salary costs and training (£835) and food and transport (£185). Estimates are based on 256 pupils attending a school on a single site.