Ethics in the age of AI: An academic’s perspective

Professor Marion Oswald was awarded an MBE in 2022 for services to digital innovation, and her professional life has always been focused on law and technology.

From her early career as a solicitor, Marion saw first-hand a period of complete transformation in how we all live and work with technology. Following stints as in-house counsel and European General Counsel for software giants Apple and McAfee respectively, she witnessed the beginning of the public focus on privacy issues we see today. Her next role, in government national security, focused on the use of technology and the acquisition of data.

Marion’s research has explored public sector use of data, particularly for criminal justice, policing and national security, as well as issues around the appropriate sharing of data. She emphasises the importance of thinking carefully about how data is used and questioning the implications of Artificial Intelligence (AI) or machine learning for data analysis. Her work considers AI outputs, its use in decision making, and the legal obligations and ethical concerns involved.

Marion is motivated by the aspiration to make public services more effective and trustworthy. She chairs the Data Ethics Committee for West Midlands Police and explained that bodies such as this aim to be facilitators, not barriers, to the effective, legal and ethical use of data and technology.

Marion views her advisory roles as being just as important and rewarding as her academic roles. She was a Specialist Adviser to the House of Lords Justice and Home Affairs Committee on their inquiry into new technologies and sat on the independent Advisory Board of the Centre for Data Ethics and Innovation, a government expert body focused on the trustworthy use of data and AI.

For Marion, the ADR UK Conference 2023 could be key in helping to navigate the grey area between operational and statistical uses of administrative data, and she is looking forward to delivering the first of four keynote speeches at the conference.


Beth: You’re presenting at the ADR UK Conference 2023 on the theme of ‘Ethics, Law and Social Implications.’ Why do you feel this is an important subject to highlight?

Marion: This is an important conference, especially in the current environment where we are seeing a huge amount of debate around data uses, control of social media companies and online material, and governance and use of AI.

There are so many different perspectives coming forward, which is making it hard for everyone to get a grip on what the real issues are and how we should deal with them. This conference represents an opportunity to connect people to real, long-term expertise about the use of data.

Beth: Is there an aspect of this conference – or any conference for that matter – that you particularly enjoy?

Marion: What I really enjoy is that whoever you're speaking to, you always get something back from that audience. You learn something from the questions or interactions that the conference facilitates, and I look forward to learning much from this conference.


Beyond work, Marion takes regular breaks thanks to her red fox Labrador. She is a proud mum to her musical son, who she says inspires her to return to her own musical interests in the future.

In addition to her ongoing research and advisory work surrounding policing and AI technologies, Marion is currently looking into the use of polygraphs and emotion AI technology in criminal justice, and their unintended consequences.

About the conference

You can hear more from Marion at the ADR UK Conference 2023, which will take place in Birmingham from 14 to 16 November. The conference will bring together people involved in the use of administrative data for public good research, including researchers, data scientists, civil service analysts and those involved in making this data available for research.

Tickets are almost sold out. Please visit the ADR UK Conference 2023 website to learn more and secure your place.