Administrative Data: Misuse vs. Missed Use

Written by Ed Morrow, 2 January 2020

In the summer of 2019, experts from government departments, public bodies, NGOs and academia came together at a roundtable organised by ADR UK (Administrative Data Research UK) and the Institute for Government (IfG), to discuss how to maximise the value of administrative data for policy-relevant research.

One of the key themes to come out of that event was the need to reframe the debate around the ethics of government departments sharing data for this purpose. So far, a large part of that debate has focused on the risk of data misuse, and how to minimise that risk while still enabling worthwhile research.

Administrative data misuse may range from the unintentional leaking of identifiable personal data – potentially into the hands of malign actors, including criminals and unscrupulous corporations – at one end, to intentional sharing with other government departments to aid politically toxic enforcement activities at the other.

The perils of data misuse have been widely discussed, understandably animating the media and privacy campaigners alike. ADR UK is well aware of these risks and takes best practice measures to minimise them, from ensuring all personal identifiers are removed before data is accessed via secure settings, to vetting research projects to ensure they are in the public interest.

However, securing enthusiastic public and political support for administrative data research requires more than the allaying of concerns about misuse. It also requires the ‘why should I care’ element: a compelling articulation of the full potential of this work to improve lives.

One way this potential can be better communicated is by making the public fully aware of another risk – one I heard concisely expressed, in an audience question at a recent Data for Policy conference, as the risk of data ‘missed use’.

What is missed use?

Missed use is, essentially, what the IfG and numerous other organisations called ‘choosing ignorance’ in a recent open letter to the Secretary of State for Digital, Culture, Media and Sport. It is the failure to share data for research when doing so could give government a better understanding of the challenges society faces, why they exist, and how to solve them. Without that understanding, government effectively operates with one hand tied behind its back, neglecting insights that could lead to smarter policy, more efficient public service provision, and better-spent taxpayer money – in other words, to knowing what works and what doesn’t.

What are the consequences of this missed use? At a micro level, it could mean the failure of police and social services to identify when people are likely to be at risk as a result of abusive relationships, which in the worst instances can lead to deaths. At a macro level, it could mean the continuation of ineffective or counterproductive public service provision, harming everything from educational attainment and employment prospects to healthy life expectancy.

With this in mind, the ADR UK and IfG roundtable articulated a question that we must ask ourselves, politicians, and the public: “Is it ethical not to use data to improve people’s lives when we can?”

By asking this question, we are asking for a judgement call: where does the optimum trade-off lie between minimising the risk of misuse, and tackling the risks of missed use? Where is the sweet spot in terms of public acceptability?

While previous public engagement work has demonstrated general support for, and expectation of, the better use of data to improve public service provision – provided appropriate safeguards are applied – it is not clear that we yet fully and explicitly understand public attitudes to how this particular trade-off should be made. This question might therefore be one for further consideration in future public engagement activity.

Instinctively, it seems that many people would be keen to eliminate the risks of missed data use at the micro level. That there is also an ethical imperative – and public support – to do so at the macro policy level is, perhaps, the case that needs to be made.