AAO Foundation Endowment fund

Evidence suggests that adapting teaching responsively to pupil assessment can be effective in improving students' learning. However, existing studies tend to be small-scale, leaving unanswered the question of how such formative assessment can operate when embedded as standard practice. In this study, we present the results of a randomized trial conducted in 140 English secondary schools. The intervention uses light-touch training and support, with most of the work done by teacher-led teaching and learning communities within schools. It is, therefore, well-suited to widespread adoption. In our pre-registered primary analysis, we estimate an effect size of 0.09 on general academic attainment in national, externally assessed examinations. Sensitivity analysis, excluding schools participating in a similar program at baseline, and complier analysis both suggest a larger effect size of 0.11. These results are encouraging for this approach to improving the implementation of formative assessment and, hence, academic attainment. Our findings also suggest that the intervention may help to narrow the gap between pupils with high and low prior attainment, although not the gap between pupils from disadvantaged backgrounds and the rest of the cohort.
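The effect sizes reported above (0.09 and 0.11) are standardized mean differences. As a minimal illustration of how such a figure is computed, the sketch below calculates Cohen's d from two groups of scores using the pooled standard deviation. The data are synthetic and chosen only for demonstration; they do not reproduce the trial's outcome measures or its analysis model (which adjusted for baseline attainment and clustering).

```python
# Illustrative sketch: computing a standardized effect size (Cohen's d)
# of the kind reported in the abstract. Data are synthetic.
import math

def cohens_d(treatment, control):
    """Standardized mean difference using the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    m1 = sum(treatment) / n1
    m2 = sum(control) / n2
    # Sample variances (denominator n - 1).
    v1 = sum((x - m1) ** 2 for x in treatment) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in control) / (n2 - 1)
    # Pooled standard deviation across both groups.
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Synthetic exam scores: treatment group slightly higher on average.
treatment = [52, 55, 60, 58, 57, 54, 61, 59]
control = [50, 53, 57, 55, 54, 52, 58, 56]
print(round(cohens_d(treatment, control), 2))
```

An effect size of 0.09 standard deviations is small at the individual level but, applied across a national cohort, can represent a meaningful shift in attainment.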
Since the 1990s, there have been repeated calls for the systematic use of randomised controlled trials (RCTs) to inform educational decision-making. The advent of the Education Endowment Foundation (EEF) – described as England's What Works Centre for Education – in 2011 has made this a reality in England: by 2020, over a third of English schools were involved in such trials. Despite much debate about the value and role of RCTs, less attention has been paid to one specific effect of such trials, a phenomenon we refer to as interventionisation. This article uses two examples, focused on language and literacy education and teacher professional development, to demonstrate how increased use of trials may work to 'interventionise' education by channelling the focus of innovation and development into tightly structured interventions and generating a series of narrowing effects. It argues that a broad view of research and a diversity of methodologies are needed not only to generate rich understandings of educational practice but also to develop and sustain educational provision that is fit for a dynamic world and responds to the challenges and opportunities presented in complex educational contexts.