Data linking for the National Evaluation of the Troubled Families Programme
Categories: Blogs, Children & young people, Health & wellbeing, Housing & communities, Inequality & social inclusion, World of work
Written by Ricky Taylor and Lan-Ho Man, 11 October 2019
ADR UK’s recent roundtable with the Institute for Government identified a number of key areas where progress must be made to ‘maximise the potential of administrative data’, including the importance of learning from existing best practice. In the first of a series of blogs exploring these key themes, Ricky Taylor and Lan-Ho Man from the Ministry of Housing, Communities and Local Government (MHCLG) share their experience from the evaluation of the 'Troubled Families' programme.
The Troubled Families programme is one of the biggest social policy programmes in government. It is funded over five years and will have cost around £920 million when it finishes in 2020. It runs across 149 upper tier local authority areas in England and provides services to 400,000 families. The programme aims to support families in dealing with multiple, high-cost problems, including worklessness, domestic abuse, mental and physical ill health, school truancy, anti-social behaviour and offending. In doing so, it seeks to transform how services are delivered to families, reduce demand for reactive services and lower costs for the taxpayer.
How did we evaluate the programme?
The Troubled Families analysts were given a project brief to include all local authorities and to carry out a robust impact evaluation (with a counterfactual) and a cost benefit analysis, all within a limited budget.
The tough brief for an evaluation of this scale was a catalyst for innovation as traditional data collection methods, such as surveys, would have been prohibitively expensive. We realised we needed a new approach.
We decided to use nationally held administrative data to conduct an impact analysis of the programme. The project involved collecting personal identifiers, including the names, dates of birth and addresses of individuals and families either on the programme or awaiting a place (the latter formed a comparison group for the evaluation), and matching this information to various datasets, including the Police National Computer, the National Pupil Database and the Work and Pensions Longitudinal Study.
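To give a flavour of what this kind of linkage involves, below is a minimal sketch of deterministic matching on cleaned identifiers. The file names and column names are hypothetical, and the real linkage (carried out under strict controls) used more sophisticated methods than a simple exact join.

```python
import pandas as pd

def clean_identifiers(df):
    """Standardise names, dates of birth and postcodes before matching."""
    out = df.copy()
    out["name"] = (out["name"].str.upper()
                              .str.replace(r"[^A-Z ]", "", regex=True)
                              .str.strip())
    out["dob"] = pd.to_datetime(out["dob"], errors="coerce", dayfirst=True)
    out["postcode"] = out["postcode"].str.upper().str.replace(" ", "", regex=False)
    return out

# Hypothetical extracts: programme/comparison individuals and a national dataset
families = clean_identifiers(pd.read_csv("programme_families.csv"))
national = clean_identifiers(pd.read_csv("national_dataset.csv"))

# Exact (deterministic) match on the cleaned identifiers
linked = families.merge(national, on=["name", "dob", "postcode"],
                        how="left", indicator=True)
print(f"Matched {(linked['_merge'] == 'both').mean():.1%} of individuals")
```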
We then used propensity score matching to compare the outcomes of those on the programme with those in the comparison group. Other policy programmes had been evaluated in a similar way, but never at this scale, or in combination with so many government databases and other sources of information.
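As an illustration of the technique, here is a minimal propensity score matching sketch on synthetic data: a logistic regression estimates each family's probability of being on the programme, each treated family is paired with the comparison family whose score is closest, and outcomes are compared across the matched pairs. The covariates and outcome are invented for the example and are not those used in the evaluation.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Synthetic stand-in for the linked dataset (one row per family)
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "prior_offences": rng.poisson(1.0, n),
    "school_absence": rng.uniform(0, 0.3, n),
    "benefit_claims": rng.integers(0, 5, n),
    "on_programme": rng.integers(0, 2, n).astype(bool),
    "child_in_care": rng.integers(0, 2, n),
})

# Step 1: estimate propensity scores, the probability of programme membership
X = df[["prior_offences", "school_absence", "benefit_claims"]].to_numpy()
treated = df["on_programme"].to_numpy()
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

# Step 2: pair each treated family with the comparison family whose
# propensity score is closest (1:1 nearest-neighbour matching)
nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))

# Step 3: compare outcomes across the matched pairs
outcome = df["child_in_care"].to_numpy()
effect = outcome[treated].mean() - outcome[~treated][idx.ravel()].mean()
print(f"Estimated effect on the matched sample: {effect:+.3f}")
```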
How did we get agreement to evaluate the programme in this way?
It took two years to navigate the data protection legislation and identify and agree the legal gateways to share and use the data. Indeed, we were one of the biggest users of legal services in MHCLG.
Initially we had to work closely with the Information Commissioner’s Office to ensure we had considered the rights of data subjects. We considered the ethical, legal and data security issues in full and set these out in data sharing agreements and a full Data Protection Impact Assessment. These documents had to be agreed with our key partners before we could press ahead.
We learned along the way that innovation brings risk which can make people (quite rightly) wary. This meant keeping key stakeholders (such as local authorities) updated, outlining the benefits and providing reassurance on the risks.
What other challenges did the evaluation pose?
We spent quite some time quality assuring our data before carrying out our impact evaluation.
Once we started receiving data, we quickly realised there was a lot of work to do on data quality, and our Office for National Statistics (ONS) colleagues (acting as a trusted third party) worked with local authorities to improve their data. Our Technical Advisory Group was also keen to ensure that the programme and comparison groups had been selected in a similar way, as differences in how the groups were selected would make it harder to attribute impacts to the programme itself (that is, to establish a causal link).
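The sketch below illustrates the kind of completeness check that might be run on each local authority's submission before linkage. The file name, column names and 10% threshold are assumptions for the example, not the actual checks ONS applied.

```python
import pandas as pd

# Hypothetical identifier submission, one row per person, with an LA code
subs = pd.read_csv("la_submissions.csv")  # assumed columns: la_code, name, dob, postcode

# Share of missing identifiers per local authority
report = subs.groupby("la_code").agg(
    records=("name", "size"),
    missing_name=("name", lambda s: s.isna().mean()),
    missing_dob=("dob", lambda s: s.isna().mean()),
    missing_postcode=("postcode", lambda s: s.isna().mean()),
).round(3)

# Flag authorities whose identifiers are too incomplete to link reliably
too_sparse = report[["missing_name", "missing_dob", "missing_postcode"]].max(axis=1) > 0.10
print(report[too_sparse])
```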
Was the project worth the effort?
Eventually our efforts paid off. The project involved one of the biggest data sharing exercises in Whitehall, and we now hold data on 864,000 individuals in 250,000 families on the programme. We have processed several terabytes of data under rigorous security and data processing controls to ensure that the data is protected.
The national impact evaluation is robust and has demonstrated the success of the programme, showing that, after two years:

- the programme reduced the proportion of children going into care by a third; juvenile convictions were down by 15%, juvenile custody by 38% and adult custody by a quarter; and
- the number of working-age Jobseeker’s Allowance claimants appears to have reduced by 11%.
The cost benefit analysis found that, for every £1 invested by central government in the programme, there is an economic and social (or public value) benefit of £2.28 and a fiscal benefit of £1.51.
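As a rough illustration of what those ratios imply at programme scale, the arithmetic below simply multiplies the headline per-£1 figures by the £920 million budget; these scaled totals are a simplification for illustration, not figures reported by the evaluation itself.

```python
# Illustrative arithmetic only, using the headline figures quoted above
budget_m = 920               # central government funding, £ million
public_value_ratio = 2.28    # economic and social benefit per £1
fiscal_ratio = 1.51          # fiscal benefit per £1

print(f"Implied public value benefit: £{budget_m * public_value_ratio:,.0f}m")
print(f"Implied fiscal benefit:       £{budget_m * fiscal_ratio:,.0f}m")
print(f"Implied net fiscal saving:    £{budget_m * (fiscal_ratio - 1):,.0f}m")
```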
These are impressive rates of return on a social policy programme and something to celebrate. We would not have uncovered these findings without putting a proper impact evaluation in place.
The full findings of the evaluation are available to read online.