[This post has been authored by Noyanika Batta, a Senior Associate at Lakshmikumaran & Sridharan Attorneys. She is a 2018 graduate from Gujarat National Law University.]
There exist dichotomous views on the usefulness of surveillance and its relationship with public health. The disease control strategies adopted by states often necessitate extensive surveillance practices that have an overbearing and intrusive effect on the daily lives of their citizens. The debate thus lies in striking the right balance between the need to strengthen public health infrastructure and the protection of individual privacy. With the rapid spread of COVID-19 debilitating economies and causing health systems across the globe to crumble, it became imperative for governments and organizations to take immediate action to protect their people. This in turn produced a boom in surveillance technologies dedicated to monitoring whole populations, with governments trying to chart the virus’ trajectory from broad swathes of personal data. This article examines the disproportionate risks to data privacy caused by the use of invasive and pervasive technologies such as contact tracing across the world.
Contact Tracing and its transition to a digital world
Contact tracing essentially refers to a technique used by public health authorities to map out whom an infected person has been in contact with, in order to minimize disease spread. It involves three steps: (1) identifying people who came in contact with the infected person, (2) locating and notifying those people about their exposure, and (3) regularly following up with them to monitor for infection. Unlike social distancing, which had not been used at such a wide scale since the influenza pandemic of 1918, contact tracing has been a staple of infectious disease control since the 1920s. It has been used to contain the spread of diseases including Ebola in 2014 and the HIV/AIDS epidemic.
Traditionally, contact tracing followed a manual approach. However, with the increasing number of cases in the context of COVID-19, the process became complicated and time-consuming very quickly. To cut down on the manpower required for these processes, an array of online tools (smartphone apps) surfaced across the world. Thus, while the concept itself is not new, it is the sudden rush to embrace digital/electronic contact tracing that has opened a Pandora’s box of privacy and security woes. The biggest privacy concern with the rapid adoption of contact tracing apps is that they could expose personal data to third parties, reveal behavioral patterns, help governmental agencies spy on their citizens, or be used for mass surveillance. It is therefore imperative to prescribe standards limiting the use of the data gathered through these smartphone applications. These standards must provide clarity on the purposes for which the data may be used in future, who can access the data, the period for which the data is to be stored, and the treatment of the data upon expiry of the emergency period justifying its use.
Current Landscape: Analyzing some of the prominent contact tracing apps
At the time of writing, MIT Technology Review’s database of contact tracing smartphone apps records that 50 countries have introduced such apps. About 36 of these are built on Bluetooth proximity tracing technology (rather than more precise location data) and claim to be privacy preserving. In this section, we discuss some of the most prominent contact tracing applications that have been launched worldwide:
India: The Government of India, through the Ministry of Electronics and Information Technology (MeitY), launched Aarogya Setu, a contact tracing mobile application. The application collects demographic data comprising the user’s phone number, name, age, sex, profession and countries visited in the last 28-45 days, which is stored on the app’s central server. Further, where a person is confirmed positive or is found to be at risk, a thirty-day log of their contacts is automatically uploaded to a centralized server. Despite the app being highly controversial for its privacy-violating design, its use was made compulsory for rail and air travel. Concerns have also been raised about private sector participation in the development of the app, which creates ambiguity about its design decisions. With such a lack of legal and policy safeguards in place, it is feared that insights generated using users’ data will remain with the government.
China: People in China have signed up for city-based contact-tracing systems provided primarily through Alipay and WeChat, submitting personal information such as their national identity number or passport number, recent travel and health status. The software then uses this data to assign a colour code (green, yellow or red) signifying the user’s risk of infection, and thereby their freedom of movement. The system is highly intrusive in nature, and there is evidence that these applications feed data back to the authorities, which could be used for mass surveillance. This has been exemplified by the brazen violation of privacy in Hangzhou, where officials are exploring the use of health code data from these apps to rank citizens on a health index based on their hours of sleep, number of steps taken, frequency of smoking and drinking, etc.
Singapore: In Singapore, two technologies have primarily been used to curb the virus, namely ‘Safe Entry’ and the official government application, ‘Trace Together’. ‘Safe Entry’ applies to entry into all public places and requires citizens to scan a code and enter their name, ID, passport number and phone number. ‘Trace Together’, on the other hand, uses Bluetooth signals between mobile phones (without deploying location data) to identify proximity between coronavirus carriers and others. Each device periodically generates a random unique identifier for communication with nearby devices. Despite both apps claiming to be privacy friendly, recent reports confirming police access to their data have raised serious concerns about possible governmental abuse of contact tracing systems.
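To make this mechanism concrete, the following is a minimal, purely illustrative Python sketch of the rotating-identifier idea described above: a device advertises a short-lived random identifier over Bluetooth and keeps a local log of the identifiers it hears nearby. It is not the actual BlueTrace/Trace Together implementation; the class name, fields and 15-minute rotation interval are assumptions chosen for clarity.

```python
import os
import time
from dataclasses import dataclass, field


@dataclass
class ProximityLogger:
    """Hypothetical model of a TraceTogether-style device (all names are illustrative)."""
    rotation_interval: int = 15 * 60          # assumed 15-minute identifier lifetime
    current_id: bytes = field(default_factory=lambda: os.urandom(16))
    last_rotation: float = field(default_factory=time.time)
    encounter_log: list = field(default_factory=list)   # stays on the device

    def broadcast_id(self) -> bytes:
        """Return the random identifier currently advertised over Bluetooth."""
        if time.time() - self.last_rotation > self.rotation_interval:
            self.current_id = os.urandom(16)             # rotate to a fresh random value
            self.last_rotation = time.time()
        return self.current_id

    def record_encounter(self, observed_id: bytes, rssi: int) -> None:
        """Log an identifier heard from a nearby phone along with its signal strength."""
        self.encounter_log.append((time.time(), observed_id, rssi))
```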
USA: The United States did not opt for a federal-level contact tracing framework; individual states have instead introduced multiple contact tracing applications, like ‘Covid Watch’ and ‘CoEpi’. Contact tracing apps have, however, been a huge failure in the US due to the widening trust deficit between the people and the government/tech companies. To bridge the gap, Apple and Google came up with revised models of the apps with built-in privacy features such as Bluetooth-based proximity detection (in place of location tracking), opt-in choices, anonymization of data, decentralized storage (i.e. data stored only on a user’s device) and use limitations. This is good from an individual’s privacy perspective; however, the absence of precise location data places limits on the effectiveness of these apps.
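The key privacy property of the Apple/Google-style decentralized design is that exposure matching happens on the phone rather than on a server. The sketch below is a conceptual illustration of that matching step only, with hypothetical names; the real exposure notification framework derives rolling identifiers from temporary keys and adds further protections not shown here.

```python
from typing import Iterable, Set


def check_exposure_on_device(local_encounter_ids: Iterable[bytes],
                             published_infected_ids: Set[bytes]) -> bool:
    """Decide locally whether this phone was near a confirmed case.

    In the decentralized model, the server only publishes identifiers that
    infected users have voluntarily uploaded; the phone's own contact log
    never leaves the device, and this comparison runs entirely on-device.
    """
    return any(identifier in published_infected_ids
               for identifier in local_encounter_ids)
```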
UK: The National Health Service (NHS) launched its contact-tracing smartphone app for England and Wales in September 2020. The app is based on the decentralised exposure notification technology developed by Apple and Google.
Israel: Israel has also made a noticeable contribution to pandemic surveillance, opting for the rather aggressive move of relying on its domestic intelligence agency to deal with COVID-19, thereby translating it into an issue of national security. Little is known about the technology, except that the Shin Bet security service has used this tool for two decades, sweeping up metadata from anyone who uses telecom services in Israel, to track militants and halt attacks. Once the Health Ministry provides the name, ID number and cellphone number of individuals who have tested positive for COVID-19, Shin Bet identifies all those who came in contact with the patient using its classified database. The programme was, however, banned, forcing the Israeli Health Ministry to launch ‘HaMagen’, a relatively more privacy-friendly decentralized GPS/location-based application.
It is therefore evident that, for most countries, the mechanisms for the processing, use and storage of personal data lack transparency, creating potential for abuse of personal data in the future. Without adequate safeguards in place, there is no assurance that such sensitive data will not end up being used by nation states for electronic surveillance of their citizens or be passed on to third parties.
Checklist of parameters for determining the level of privacy preservation
While it is true that digital contact tracing cannot be used in a vacuum and has proven to be of paramount importance in dealing with the pandemic, we must ensure that key elements of privacy are not abused or disregarded. The key is framing privacy policies that enable data privacy and data sharing to coexist and that also outline the possible risks. With the legal basis for sharing personal information in contact tracing being founded on consent, it is essential that we move towards a more transparent and secure setup, comprising some of the basic features listed below. It may be noted that the guidelines listed in this section are general in nature and should not be treated as an exhaustive set.
- Improved readability of privacy policy terms: While such contact tracing apps provide users access to the terms and conditions of their privacy policies, most of these are very difficult to read and comprehend. For users to be able to effectively police the data gathered by these apps, they must have access to readable privacy policy terms. This should be coupled with a robust and swift complaint mechanism to address their grievances.
- Transparency founded on the principles of active consent: The users should be given a clear opportunity to agree to the specified use of their personal information. All secondary non-obvious uses of the personal information should be laid out transparently to enable users to make an informed choice.
- Decentralized storage of data: The debate over centralized versus decentralized storage of data boils down to an argument of effectiveness versus potential future risks. In a decentralized setup, unique codes created upon a contact event are recorded on the user’s device and not uploaded to a central server; data processing is triggered only when an individual is confirmed to be infected. By creating smaller clusters of data, this model offers greater protection from large-scale malicious activities and governmental abuse. The adoption of the Bluetooth-based decentralized model by most countries is therefore a step in the right direction towards ensuring that users have greater control over their information (a simplified sketch of this model follows this list).
- Data Anonymization: Identifiers must be generated using state-of-the-art cryptographic processes and must be renewed on a regular basis to reduce the risk of physical tracking and linkage attacks.
- Data Minimization: All contact-tracing apps must adhere to the principle of data minimization. The data processed must be reduced to the strict minimum. The application should not collect information that is unrelated to, or not required for, its primary purpose, such as civil status, communication identifiers, equipment directory items, messages, call logs, location data, device identifiers, etc.
- Voluntary opt-in policy: The apps should follow a voluntary opt-in policy, i.e. no one should be compelled to install a contact tracing app or to share messages indicating their status.
- Destruction of data: Strict limits should be imposed on data retention, prescribing the time limit for which personal data may be stored. Data must not be kept longer than is absolutely necessary, taking into account medical relevance as well as administrative requirements.
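To show how several of these parameters can fit together in practice, the following is a minimal, purely illustrative Python sketch, loosely inspired by decentralized proposals such as DP-3T; every name, constant and design choice here is an assumption made for illustration, not a prescribed implementation. Identifiers are derived from a random daily key and rotated every few minutes (anonymization), each encounter record holds only a timestamp and the observed identifier (data minimization), the log lives solely on the device (decentralized storage), and records are purged after an assumed fourteen-day window (destruction of data).

```python
import hashlib
import hmac
import os
import time
from typing import Optional

ROTATION_SECONDS = 15 * 60            # assumed identifier lifetime
RETENTION_SECONDS = 14 * 24 * 3600    # assumed 14-day retention window


class PrivacyPreservingLog:
    """Hypothetical sketch tying together several of the checklist parameters."""

    def __init__(self) -> None:
        self.daily_key = os.urandom(32)   # random key; renewed daily in a real design
        self.encounters = []              # minimal records: (timestamp, observed_id)

    def current_identifier(self, now: Optional[float] = None) -> bytes:
        """Derive a short-lived identifier from the daily key (anonymization)."""
        now = time.time() if now is None else now
        interval = int(now // ROTATION_SECONDS).to_bytes(8, "big")
        return hmac.new(self.daily_key, interval, hashlib.sha256).digest()[:16]

    def record(self, observed_id: bytes) -> None:
        """Store only a timestamp and the identifier heard (data minimization)."""
        self.encounters.append((time.time(), observed_id))

    def purge_expired(self) -> None:
        """Delete records older than the retention window (destruction of data)."""
        cutoff = time.time() - RETENTION_SECONDS
        self.encounters = [(t, i) for (t, i) in self.encounters if t >= cutoff]
```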
Conclusion
The prerequisite for any contact tracing app to be effective in containing the pandemic is that it must be used by a sufficient portion of the population. This is only possible if the app has adequate safeguards in place to inspire the trust of the people. By analysing the privacy controls of some of the popular applications in operation, the author has attempted to highlight the failure of these apps to provide tangible incentives to users, and has suggested a few strict data privacy guarantees to bridge the trust deficit.