Sarah Cheverton interviews Henry Pearce, senior lecturer in Law at the University of Portsmouth and Deputy Editor of Computer Law & Security Review, about the new NHS contact tracing app currently being trialled on the Isle of Wight, following a recent open letter to the government from 177 computer security and privacy experts raising concerns about ‘transparency and mission creep’.
SC: Can you tell me a little bit about yourself and your interest in privacy and data protection?
HP: I am a senior lecturer in Law at Portsmouth Law School, and an active researcher in the field of data protection law and policy. I am also the deputy editor for Computer Law & Security Review (CLSR), an international journal dedicated to technology law and practice. I have a particular interest in how the law deals with the issue of the ‘anonymisation’ of personal data, and how personal data are handled and disclosed by public authorities.
In simple terms, what are your main concerns about the app, as it’s currently outlined by the government?
There are a number of concerns associated with the app, the most significant of which being that it could essentially be used as a tool for intrusive government surveillance initiatives. The operation of the app would entail an unprecedented level of data gathering from the general public, and there are worries that the data collected might be repurposed and turned towards purposes other than fighting the spread of Covid-19, some of which may not be benign.
In the open letter, the academics state ‘We hold that the usual data protection principles should apply: collect the minimum data necessary to achieve the objective of the application’. Why is this an important principle in data protection?
The principle of data minimisation holds that personal data should be adequate, relevant, and limited to the minimum necessary for the purpose for which they have been collected. In other words, you should limit any collection (and use) of personal data to the absolute minimum you need for your intended purpose. The logic behind this principle is that errant uses of personal data can be very harmful for affected individuals, and so minimising the amount of data collected should also minimise the chances of harmful events occurring. So, if we take the NHS tracing app as an example, according to the principle of data minimisation, the app should not collect, store, or use any more data than is strictly necessary for it to perform its intended purpose (i.e. contact tracing/identifying the possible spread of Covid-19).
Can you explain how the current app fails to meet this principle, and what the concern would be if it doesn’t?
As has recently been set out in evidence given to the House of Commons Select Committee, the app is set up in a way that potentially allows for the gathering of more data about its users than is strictly necessary for its effective operation. The app, for instance, allows for the inclusion of a user’s location data, which is not necessary for the purposes of monitoring the proximity of individuals (i.e. whether two individuals with symptoms have been in contact with one another). This gives rise to concerns relating to function creep, and again, worries about possible government surveillance initiatives. This is particularly the case as location data can be very revealing of sensitive aspects of an individual’s character/identity, and thus there is a clear need to prevent these data from falling into the wrong hands and/or being used for improper purposes.
And finally, what would you like to see happen to improve the app that would address the concerns outlined in the open letter?
The app, at present, operates on a centralised basis, whereby all data gathered by the app will be stored on a server operated by the NHS. Alerts will also be sent out to individuals from this centralised point. The notion of a centralised public authority database, housing sensitive data of a potentially huge number of individuals, once again gives rise to considerable concerns regarding possible function creep and government surveillance. My preference would be to see a shift to a ‘decentralised’ model, whereby everything would occur on a user’s device (i.e. all data gathered would be stored on, and alerts sent out by, the user’s device, rather than in/by a centralised government database). This would allow for the app to operate in a more ‘privacy-friendly’ manner, whilst not compromising its efficiency. The Apple-Google system being employed in countries such as Germany, Austria, and Switzerland represents an example of this approach being used in practice.
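The decentralised approach described above can be illustrated with a toy sketch. This is a simplified illustration of the general idea only, not the actual Apple-Google Exposure Notification protocol; the `Device` class and the random-token scheme here are invented for the example. The key point is that devices exchange meaningless random tokens, and exposure matching happens on the user's own device rather than in a central government database:

```python
import secrets

class Device:
    """A toy model of a phone running a decentralised tracing app."""

    def __init__(self):
        self.own_tokens = []    # random tokens this device has broadcast
        self.heard_tokens = []  # tokens heard from nearby devices

    def broadcast(self):
        # Emit a fresh random token over (hypothetical) Bluetooth.
        # It contains nothing about the user's identity or location.
        token = secrets.token_hex(16)
        self.own_tokens.append(token)
        return token

    def hear(self, token):
        # Record a token received from a nearby device.
        self.heard_tokens.append(token)

    def check_exposure(self, published_tokens):
        # Matching happens locally: the server only publishes the tokens
        # of users who reported a positive test, and each device checks
        # its own contact history against that list.
        return any(t in published_tokens for t in self.heard_tokens)

alice, bob = Device(), Device()
bob.hear(alice.broadcast())           # Alice and Bob are in proximity
published = set(alice.own_tokens)     # Alice tests positive and uploads her tokens
print(bob.check_exposure(published))  # → True: Bob learns of his exposure on-device
```

Note that the central server never learns who met whom; it only relays the tokens of confirmed cases, which is the privacy property the decentralised model is designed to provide.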