Reanalysing education experiments with linked administrative data

This report from the Behavioural Insights Team presents results from new analyses of completed randomised experiments that tested educational interventions. The project team, led by Alex Sutherland, linked archived trial data from the Education Endowment Foundation to administrative data, demonstrating the benefits of this kind of analysis.


Why it matters

We cannot build knowledge of policy effectiveness without data, and policy cannot wait years for outcomes and analysis. Combining administrative data with randomised controlled trials brings both short- and long-term benefits. In the short term, it may allow trial teams to see whether and how participants progress within the systems they interact with: do they remain in education, and if so, how consistent is their attendance? Rather than waiting several years post-intervention to examine educational attainment, we can check whether a new intervention affects attendance in the short term, positively or negatively. In the long term, we can better understand whether policies designed for one specific outcome generate additional benefits, involve trade-offs between outcomes, or have no impact on other outcomes of interest.
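To make the workflow concrete, here is a minimal sketch of what such a linkage-and-reanalysis step might look like. The report does not publish its code, so everything below is illustrative: the column names (pupil_id, arm, still_enrolled, attendance_rate) and the inline toy data are hypothetical stand-ins for an archived trial file and an administrative extract, and the unadjusted difference in means is a deliberately simple estimator, not the analysis the project team used.

```python
import pandas as pd

# Hypothetical archived trial records: a pupil identifier and the
# randomised arm each pupil was assigned to.
trial = pd.DataFrame({
    "pupil_id": [101, 102, 103, 104, 105, 106],
    "arm": ["treatment", "control", "treatment",
            "control", "treatment", "control"],
})

# Hypothetical administrative records for the year after the
# intervention (e.g. enrolment status and attendance rate).
admin = pd.DataFrame({
    "pupil_id": [101, 102, 103, 104, 105, 106],
    "still_enrolled": [True, True, True, False, True, True],
    "attendance_rate": [0.96, 0.91, 0.93, None, 0.97, 0.89],
})

# Link the two sources on the shared identifier. A left join keeps
# every trial participant, so linkage failures remain visible.
linked = trial.merge(admin, on="pupil_id", how="left")

# Short-term outcome 1: do participants remain in education?
enrolment_by_arm = linked.groupby("arm")["still_enrolled"].mean()

# Short-term outcome 2: unadjusted difference in mean attendance
# between arms, a crude estimate of the intervention's effect.
attendance_by_arm = linked.groupby("arm")["attendance_rate"].mean()
effect = attendance_by_arm["treatment"] - attendance_by_arm["control"]

print(enrolment_by_arm)
print(f"Attendance difference (treatment - control): {effect:.3f}")
```

In practice, linking to a national administrative database involves consent, governance, and record-matching steps far beyond a single merge, and a real reanalysis would adjust for the trial's design; the point of the sketch is only that, once linked, intermediate outcomes can be examined long before attainment data exist.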

Linking data quickly in this way, to see and share interim results, need not be limited to inter-governmental projects. What Works Centres could use a similar approach to track intermediate outcomes more quickly. Given the volume of trials completed and underway across the What Works Centres, reanalysis could quickly build a substantial body of knowledge about what is and is not effective across multiple outcomes, benefiting their target populations now and in the future.
