Researchers from the Oxford Human Centred Computing Research Group (of which EWADA is a part) and the Responsible Technology Institute submitted a joint response to the public consultation on reforms to the UK’s data protection regime, run by the Department for Digital, Culture, Media & Sport (DCMS).

The goal of the consultation is to create an ambitious, pro-growth and innovation-friendly data protection regime in the UK that underpins the trustworthy use of data. UK data protection law is currently modelled on EU requirements, particularly the 2016 GDPR and the 2002 ePrivacy Directive. Following the UK’s withdrawal from the European Union, the government sees a wealth of new opportunities in reforming the current legal requirements around the protection of personal data.

Overall, as a research group, we welcome the UK government’s initiative to help researchers handle personal data and to spur innovation within the UK. At the same time, we are concerned about harmful consequences for individuals residing in the UK should UK data protection standards and rights be weakened. Our response set out our views on the core issues of data protection and privacy, responsible research, and balancing support for organisations with support for individuals. The major themes of our response are:

  • Data intermediaries and institutions: There is a lack of clarity regarding data intermediaries, institutions, and the practices put in place to safeguard individuals and support technological growth.
  • AI and responsible innovation: The opportunities for AI innovation in the UK depend on a robust regulatory regime that encourages highly context-specific risk management. This will be best promoted through maintaining existing measures like Data Protection Impact Assessments, Data Protection Officers, record keeping, and prior consultation, amongst others.
  • Erosion of trust in online tracking: Excessive box-ticking in the form of consent banners is not a necessary feature of existing data protection and privacy law, but rather a symptom of non-compliance with it.
  • Removal of the balancing test: The removal of the balancing test for pre-approved legitimate interest purposes will create disproportionate risks for UK citizens, and a false sense of certainty for controllers.

In our response, we also provided recommendations and suggestions on these themes and statements, intended to help build a sustainable future of AI and data protection within the UK that not only promotes innovation but also advocates for and protects individuals.

Read our full response here