As with many controversies, discussion of the implications of the coronavirus pandemic for privacy involves a number of myths. It will be interesting to see which survive this unprecedented period. Perhaps the most fundamental is belief in the capacity of science and engineering to manage, control, and subdue the natural world. It had seemed to many that the developed world was largely safe from virulent disease. This false sense of security has been punctured; it is clear that this pandemic is far from over, and there may be others to come, for which we must also prepare. Nonetheless, myths are not displaced quickly, and governments still turn to a technological toolbox for quick solutions.
Surveillance is one of these technologies. Getting ahead of a contagious disease requires high quality data. If a government can identify infected persons quickly and accurately, then it can act to ensure that they get medical assistance but, more important in terms of curtailing the spread of the outbreak, it can isolate them so that they cannot infect others. This requires some system of surveillance and monitoring, with a mechanism for alerting public health authorities.
There are many legal and ethical questions in the choices which states worldwide are beginning to make in this regard. One particularly important question is whether it is better to surveil the virus (by a wide-ranging system of proactive testing, including of individuals who were not yet showing symptoms) or to surveil the population (so that the movements of infected persons can be traced retrospectively, and those with whom they have come into contact can be tested). The former is expensive and difficult to arrange. The latter approach, generally known as ‘contact tracing’, is more common.
However, privacy issues abound with this method. There are also practical problems with a digitised surveillance system, and it might not offer a solution at all. Nonetheless, many governments are using tracing apps as a way of reducing lockdown measures.
Governments worldwide have been utilising a range of different surveillance methods. To take some examples, China deployed thermal drones, facial-recognition software, and an app that somehow calculated whether or not an individual was at risk of spreading infection. In the Indian state of Karnataka, individuals in self-isolation were required to upload a selfie every hour or risk government intervention to put them in quarantine. Visitors to Hong Kong were given wristbands that would alert authorities if they left their dwelling. Although jurisdictions which have deployed these surveillance measures have generally responded well to the challenge of the pandemic, and saved lives in the process, there are concerns that they may have gone too far. Digital technology can be effective in the short term, but its long-term usefulness is not clear, particularly if the way in which it is deployed damages trust and confidence in government.
Another myth is complete discontinuity: 'Everything has changed', 'the old rules don’t apply', 'emergencies justify shortcuts'. Whether or not these prove to be true in the long run we will not know for quite some time. In the short term, however, one thing is clear: existing laws and regulations continue to apply, and one of the most important of them is the General Data Protection Regulation (GDPR).
A further myth is that in the context of an emergency, we must give up our privacy in order to protect our health and well-being, or that balancing between the right to privacy and the right to health is too difficult to achieve, particularly in an emergency. This is not correct – this balancing exercise has already been done in the GDPR, as Recital 4 makes clear:
The processing of personal data should be designed to serve mankind. The right to the protection of personal data is not an absolute right; it must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality.
The European Data Protection Board has issued a detailed statement elaborating on this.
Although another myth is that European data protection law requires that individuals consent to the processing of any and all personal data relating to them, this is not the case. There are six possible legal bases for processing under the GDPR, of which consent is only one. In the context of a global pandemic, public health authorities could rely on a public interest or legal authority basis.
This does not provide them with unlimited capacity to engage in any processing which they deem appropriate. They must also follow the principles of data protection law. There are also six of these: purpose limitation (data must be processed only for the legitimate purpose for which it was originally collected), data minimisation (only data strictly required for that purpose can be requested), accuracy (data must be kept up to date), integrity and confidentiality (appropriate security measures must be applied), storage limitation (data may only be kept for as long as it is needed), and fairness and transparency (processing must be legitimate and the data subject must be properly informed as to what is happening with their data).
There is a further limitation. Under Article 9 of the GDPR, 'data concerning health' is one of the types of 'special category' data and cannot be processed unless one of ten possible conditions applies. For disease surveillance, several could apply but (as the Data Protection Commission (DPC) point out) the most relevant is where 'processing is necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health', which must be 'on the basis of Union or Member State law which provides for suitable and specific measures to safeguard the rights and freedoms of the data subject, in particular professional secrecy'.
European data protection law remains relevant, therefore. In that context, the European Commission is seeking to co-ordinate pan-European approaches to coronavirus apps. Its Recommendation C(2020) 2296 is the first step in 'a process for developing a common approach, referred to as a Toolbox, to use digital means to address the crisis', which includes 'the use of mobile applications, coordinated at Union level, for empowering citizens to take effective and more targeted social distancing measures, and for warning, preventing and contact tracing to help limit the propagation of the COVID-19 disease', all of which 'should be guided by privacy and data protection principles.' The first version of the toolbox is available, and states that national apps should be voluntary; approved by the national health authority; privacy-preserving; and dismantled as soon as no longer needed.
However, as with many other aspects of the responses of national governments and European institutions to the crisis, there has been controversy. Two quite divergent proposals for a contact tracing app emerged at European level – Pan-European Privacy-Preserving Proximity Tracing (PEPP-PT) and Decentralized Privacy-Preserving Proximity Tracing (DP3T). The most significant difference between them is the extent to which they return data to a central repository, which the European Parliament’s resolution on the pandemic had demanded not happen. PEPP-PT centralises data storage, while DP3T makes a considerable effort to minimise the data that it collects and keeps it on the user’s device.
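The decentralised idea can be illustrated with a minimal sketch in Python. This is an illustrative simplification, not the actual DP3T specification: the real protocol derives rotating ephemeral identifiers from daily secret keys and measures Bluetooth proximity, whereas here random tokens and explicit 'contact' events stand in for both. The class and method names are invented for the example.

```python
import secrets

class Phone:
    """Toy model of a handset in a decentralised proximity-tracing scheme."""

    def __init__(self):
        self.own_ids = []        # ephemeral IDs this phone has broadcast
        self.heard_ids = set()   # IDs observed from nearby phones, kept on the device

    def broadcast_id(self):
        # A fresh random ephemeral ID per broadcast window (simplified:
        # DP3T derives these from a daily key rather than drawing them at random).
        eph = secrets.token_hex(8)
        self.own_ids.append(eph)
        return eph

    def record_contact(self, eph_id):
        # Observed IDs are stored locally and never sent to a server.
        self.heard_ids.add(eph_id)

    def check_exposure(self, published_ids):
        # Matching happens on the device: the server only ever learns the
        # identifiers of users who tested positive and chose to upload them.
        return bool(self.heard_ids & set(published_ids))

# Two phones come into proximity.
alice, bob = Phone(), Phone()
bob.record_contact(alice.broadcast_id())

# Alice tests positive and uploads her broadcast IDs; the server
# republishes that list to all users, who match it locally.
server_published = list(alice.own_ids)
bob_exposed = bob.check_exposure(server_published)
```

The point of the design is visible in what the server holds: only the published IDs of infected users, never the contact graph itself, which stays distributed across individual devices. A centralised (PEPP-PT-style) design would instead upload the `heard_ids` sets to a central repository and perform the matching there.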
While PEPP-PT had high-level support at the outset, it has been slow to provide detailed technical information and has lost partners as a result. DP3T has been open source since the outset. New myths about the benefits of one over the other might have developed under more normal circumstances, but the current context is one where debates are truncated and narratives move quickly. The Commission has moved away from PEPP-PT and Germany has ruled out any centralised data storage, although France remains committed to the idea.
In parallel, Apple and Google are collaborating on a largely decentralised tracing system which they hope to release next month, and say that they will disable when the pandemic is over. Their lack of interest in supporting a centralised system was a significant contributor to the choices that are being made across Europe on contact tracing, underlining another myth which is increasingly challenged in the dynamic, de-regulated 'information society' – the power of government to make its own choices or to direct the private sector. Major multinational technology companies have once again shown that they are sufficiently robust to resist the entreaties of the European Union.
In Ireland, the HSE announced that it was developing a contact tracing app relatively early, but progress has been slow. The HSE has not yet published source code or a data protection impact assessment (DPIA). The latter is required under Article 35 of the GDPR '[w]here a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons'. However, there is no obligation to publish a DPIA and they are not assessed by any independent third party, in contrast with, for example, an Environmental Impact Assessment Report which is required under European law for projects likely to have significant effects on the environment.
What information has been made available indicates that the app follows a decentralised model. Further clarity may be provided in due course, particularly as the Minister for Health has acknowledged that “[t]his will only work if the people of Ireland download the app and buy into it”.
Publication of the DPIA might help to build trust in the app. Unless it is made compulsory (which would be a drastic step in a democracy with a strong tradition of freedom of movement and no national identity card), the public must adopt it on a large scale (in the region of 60% of the population) in order for it to be useful. Individuals may be slow to do so if they are concerned about what the app does or what is done with the data which it collects. We will be living with the consequences of the decisions that are taken now for a long time; it is important to get them as right as possible. Public acceptance therefore requires attention and consideration.
In light of the controversies over the Public Services Card (PSC) and data retention, the public may want reassurance as to the government’s long-term plans. The PSC has been criticised by the DPC and by the UN Special Rapporteur on extreme poverty and human rights, and is essentially temporarily suspended during the crisis, but still seems to be an important project for the state. The government is also defending the legality of the retention of telephone metadata in a reference to the Court of Justice of the European Union, although the former Chief Justice, Mr Justice John Murray, has suggested that the state should consider whether it should continue to operate this scheme. Security concerns may also be a barrier to adoption; the Dutch government has just recommended the deletion of the NL-Alert disaster alert app, after it emerged that it had a data leak. A contact tracing app could easily have similar issues.
While the Irish public have been quite compliant with the stringent and unprecedented restrictions that were imposed rapidly and with little debate, fatigue may set in. Some system of monitoring and tracking may be required in order to begin to lift the lockdown. If this is not widely adopted, it will fail, with potentially disastrous consequences. As pressure grows to move to the next phase of response to the pandemic, transparency may be key to ensuring public acceptance of a national system of surveillance, of a little virus watcher in everyone’s pocket. In time, there may be more myths about this; let us hope they are positive ones, founded in the reality of data protection law.
Dr Rónán Kennedy is a Lecturer at NUI Galway School of Law. His research focuses on how law, information and communications technology interact, aiming to see 'through the hype' as a means of uncovering the real legal and societal consequences of advances in information technology. For further reading on this topic generally, see Internet Law (2020) by Michael O'Doherty - available in hard copy or as an e-book.