This post has been authored by Raghav Saha, a 3rd year student at Gujarat National Law University.
[This post is authored by Oshi Priya, a third-year student at the National Law University of Study and Research in Law, Ranchi.]
Education technology (EdTech) combines software, computer hardware and educational theory to facilitate e-learning. Though still in its early stages of development, it is already a $700 million industry in India and is projected to grow eight to ten times over the next five years. Popular EdTech companies in India include Unacademy, BYJU’S and Toppr.
Post-mortem privacy is defined as the right of a person to preserve and control what constitutes his or her reputation after death, and is inherently linked with the idea of dignity after death. One school of thought questions how there can be any threat to the reputation of a person who no longer exists. Another argues that when a person’s public persona or reputation is harmed after death, it is not the deceased who is defamed but the ante-mortem person. A further question arises: when a person dies, do the surviving interests become the interests of others, do they remain the interests of the deceased alone, or both?
Private law justification of post-mortem privacy
There is an English common law principle, “Actio personalis moritur cum persona”, which means that a personal cause of action dies with the person, reflecting the law’s traditional reluctance to extend personal rights beyond death. However, certain EU states following the civilian tradition have allowed protection of the data of the deceased. Article 40(1) of the French Data Protection Act regulates the processing of data after an individual’s death: individuals can give data controllers general or specific instructions about the retention, erasure and communication of their personal data after their death.
In Fairstar Heavy Transport NV v Adkins, Justice Edwards-Stuart considered whether there could be a right of property in the contents of an email. The case concerned an employer’s request to access the contents of emails, relating to the business affairs of his company, stored on the personal computer of an ex-employee. The question before the court was whether the claimant had any proprietary rights over the contents of the emails. The court held that the contents of an email cannot be the subject of proprietary rights, and that the employer therefore had no enforceable proprietary claim over them. In deciding whether such a right could exist, the court identified five possible ways of construing a proprietary right over the contents of an email: first, that title to the content remains throughout with the creator or his principal; second, that upon an email being sent, title to the content passes to the recipient (by analogy with the passing of title in a letter under the principles of transfer of property); third, that the recipient has a licence to use the content for any legitimate purpose consistent with the circumstances in which it was sent; fourth, that the sender has a licence to retain the content and use it for any legitimate purpose; and fifth, that title to the content is shared between the sender and all recipients in the chain. The court analysed the viability of each of these constructions.
The court held that the implication of the first option would be that the creator of an email could assert title to its content against the rest of the world. This, the court opined, would be strange and would have far-reaching, impractical consequences: if title to the content remained with the creator, he would have to be able to exercise it in all its forms, including by requiring recipients down the chain to delete the content. Such an exercise of title is neither feasible nor practical, rendering the option redundant. The court also rejected the second option, on the ground that where an email is forwarded to multiple recipients, the question of who holds title to its content at any given point of time would be hopelessly confusing. The third and fourth options conflate the existence of a proprietary right over the content of an email with the nature of its use, that is, whether the use is for legitimate or illegitimate purposes; the court held that the nature of use should not be a consideration in exercising a proprietary right of control. The fifth option was likewise rejected on the ground of compelling impracticality.
The advent of the digital will in India: the future of data protection for deceased individuals?
Section 1(4) of the Information Technology Act, 2000, read with its First Schedule, provides that the Act does not apply to a will as defined under clause (h) of Section 2 of the Indian Succession Act, 1925, including any other testamentary disposition. Among foreign jurisdictions, the most discussed legislation on digital wills is the Fiduciary Access to Digital Assets and Digital Accounts Act, enacted by Delaware, which became the first state in the United States to give the executor of a digital will the same authority to take control of a digital asset as over a tangible one. The 2016 Delaware Code revolves around the concepts of the ‘digital asset’ and the ‘fiduciary’, someone who can be trusted with the digital asset. It defines a “digital asset” as data, text, emails, audio, video, images, sounds, social media content, health care records, health insurance records, computer source codes, computer programs and software, user names and passwords, created, generated, sent, communicated, shared, received or stored by electronic means on a digital device. It defines a “fiduciary” as a personal representative appointed by a registrar of wills, or an agent under a durable personal power of attorney, and provides that a fiduciary may exercise control over any and all rights in the digital assets and digital accounts of an account holder, to the extent permitted under state or federal law.
Data Protection Bill
The Data Protection Bill, 2018 provides for the “right to be forgotten” under Section 27: the ability of individuals to limit, de-link, delete or correct the disclosure of personal information on the internet that is misleading, embarrassing, irrelevant or anachronistic. When an individual dies, his sensitive personal data remains online; absent regulation, a data fiduciary may infringe his rights as many times as it wishes, and without remedy, since the Bill does not deal with deceased individuals. Data, by its dynamic nature, is not deleted of its own accord once the person is dead. The provisions that exist for living individuals could, however, be applied to the deceased as well. Under Section 10 of the Personal Data Protection Bill, 2018, a data fiduciary may store data only for a limited period and use it only for the purpose for which it was collected. If the data principal wants to amend or remove any information, he has the right to do so, and the data fiduciary cannot, without legal basis, prevent him. The current data protection regime fails to recognise and fulfil the need to protect these digital rights. It is pertinent to consider whether the concepts of the “digital asset” and the “fiduciary”, as present in the Delaware legislation, can be emulated in India. Protecting data after death also involves questions of digital succession and of intellectual property rights, which are inheritable; these must be taken into account in framing any legislation on post-mortem privacy. The number of internet users in India was estimated at 566 million as of December 2018, an annual growth of 18%. Considering this growth of internet use, it is pertinent to have a proper legal framework for protecting the data of deceased individuals.
In re Estate of Ellsworth, No. 2005-296, 651- DE (Mich. Prob. Ct. May 11, 2005).
Fairstar Heavy Transport NV v Adkins [2012] EWHC 2952 (TCC).
The purpose of this series is to analyze the bare text of the Data Principal Rights espoused in the Bill (Chapter VI), namely the Right to Confirmation and Access, Right to Correction, Right to Data Portability and the Right to be Forgotten, in light of the text used in the European legislations to espouse the same values. Each post will deal with one of the above rights.
Part I of the series can be accessed here.
INTRODUCTION TO POST
Over the course of the ensuing section, I shall contrast the text of the confirmation and access provision of the Personal Data Protection Bill (India) (“PDPB”) (S. 24) with the corresponding provision of the General Data Protection Regulation (European Union) (“GDPR”) (Art. 15).
For the purposes of convenience, I have reproduced the relevant provisions below. (Emphasis supplied)
Personal Data Protection Bill (India)
“24. Right to confirmation and access. —
(1) The data principal shall have the right to obtain from the data fiduciary—
(a) confirmation whether the data fiduciary is processing or has processed personal data of the data principal;
(b) a brief summary of the personal data of the data principal being processed or that has been processed by the data fiduciary;
(c) a brief summary of processing activities undertaken by the data fiduciary with respect to the personal data of the data principal, including any information provided in the notice under section 8 in relation to such processing activities.
(2) The data fiduciary shall provide the information as required under this section to the data principal in a clear and concise manner that is easily comprehensible to a reasonable person.…
General Data Protection Regulation (European Union)
Right of access by the data subject
“1. The data subject shall have the right to obtain from the controller confirmation as to whether or not personal data concerning him or her are being processed, and, where that is the case, access to the personal data and the following information:
(a) the purposes of the processing;
(b) the categories of personal data concerned;
(c) the recipients or categories of recipient to whom the personal data have been or will be disclosed, in particular recipients in third countries or international organisations;
(d) where possible, the envisaged period for which the personal data will be stored, or, if not possible, the criteria used to determine that period;
(e) the existence of the right to request from the controller rectification or erasure of personal data or restriction of processing of personal data concerning the data subject or to object to such processing;
(f) the right to lodge a complaint with a supervisory authority;
(g) where the personal data are not collected from the data subject, any available information as to their source;
(h) the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.
The right entitles “data subjects”/“data principals” (the terms used by the GDPR and the PDPB, respectively, for the natural persons to whom the data relates) to demand certain information pertaining to their personal data from the “controllers”/“data fiduciaries” (the terms used by the GDPR and the PDPB, respectively, for the entities which determine the purpose and means of processing) dealing with that data. The right reduces the information asymmetry between those to whom the personal data pertains and those who process or control it. Refer here for a summary.
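To make the comparison concrete, the disclosures contemplated by S. 24(1) could be bundled as in the sketch below. This is purely illustrative: the function and field names are hypothetical, and neither statute prescribes any particular structure for the response.

```python
# Hypothetical sketch: the keys mirror PDPB S. 24(1)(a)-(c); the Bill
# itself prescribes no particular structure for the disclosure.

def access_request_response(is_processing, data_summary, processing_summary):
    """Bundle the three disclosures a data fiduciary would make under S. 24(1)."""
    return {
        "confirmation": is_processing,                # S. 24(1)(a)
        "personal_data_summary": data_summary,        # S. 24(1)(b)
        "processing_activities": processing_summary,  # S. 24(1)(c)
    }

# A hypothetical response to a data principal's request.
response = access_request_response(
    True,
    ["name", "email address", "browsing history"],
    ["targeted advertising", "service personalisation"],
)
```

The point of the sketch is simply that S. 24 asks for three distinct things, confirmation, a summary of the data, and a summary of the processing, which is broadly the same information asymmetry the GDPR’s Art. 15 list addresses in finer grain.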
At first glance, the Indian draft-legislation’s provision “Right to Confirmation and Access” (S. 24) might seem to be rather abstract and vague in comparison to its European counterpart, but closer inspection reveals that both are quite similar. While the GDPR provides guidelines within a mostly self-contained provision, the PDPB’s S. 24 cross-references S. 8, which contains the list of necessary information disclosure obligations placed on the “data fiduciary”.
Though there exists a considerable degree of textual similarity between the two jurisdictions, certain distinctions in orientation are evident from the language of the provisions.
The Indian Bill, admirably, places explicit emphasis on the accessibility of disclosures. S. 24 (2) mandates that the disclosures be “easily comprehensible”. Wherever there exists a power imbalance, those with access to expertise and other resources are better placed to abuse the system through indulging in complex legalities. Such statutory protections reduce the likelihood of resource-rich (access to expertise & infrastructure) “fiduciaries” utilizing complexity to overwhelm citizens incapable of processing technical information.
Furthermore, the Indian draft-legislation requires a “brief summary” (necessarily disclosing the statutorily prescribed information), as opposed to its European counterpart, which doesn’t place any such requirement. The legislative intent behind the same seems to be consistent with the logic of accessibility (prevent provision of information that cannot be processed meaningfully) mentioned above.
Listing the specific data that needs to be disclosed could enable “fiduciaries” to utilize the provision as an avenue to avoid disclosure of other unlisted, but relevant information. I submit that an additional sub-section requiring disclosure of all relevant information over and above the statutorily mandated disclosures (a general overarching clause, in addition to the prescribed disclosure requirements) would have tilted the balance favourably towards data privacy.
Additionally, the Indian Bill does not appear to place as much significance on profiling (the processing of personal data to analyse or predict a data subject’s behaviour, characteristics, location, etc.; the GDPR’s Art. 4(4) and the PDPB’s S. 2(33) define the term in varying detail, but to similar effect) as its European counterpart. Though the PDPB refers to profiling and allied restrictions elsewhere in the Bill, profiling finds no mention in Chapter VI (Data Principal Rights). Analysing the documents in their entirety, the EU legislation places greater restrictions on profiling than the PDPB. The Indian Bill has instead preferred to allow profiling subject to a Data Protection Impact Assessment (S. 33), overseen by the Data Protection Authority of India (established under Chapter X of the Bill).
Lastly, the European legislation (Art. 15(4)) clarifies that a request for information as a matter of right cannot abrogate the “rights and freedoms” of others. Though S. 27(2) of the PDPB refers to the balancing of rights in the context of the right to be forgotten, S. 24 contains no such weighing of rights. Given that there could be numerous varied instances of legitimately conflicting rights, leaving the judiciary to decide on a case-by-case basis seems prudent.
The purpose of this series is to analyze the bare text of the Data Principal Rights espoused in the Bill (Chapter VI), namely the Right to Confirmation and Access, Right to Correction, Right to Data Portability and the Right to be Forgotten, in light of the text used in the European legislations to espouse the same values. Each post will deal with one of the above-mentioned rights.
INTRODUCTION TO POST
Over the course of the ensuing section, I shall contrast the text of the data portability provision of the Personal Data Protection Bill (India) (“PDPB”) (S. 26) with the corresponding provision of the General Data Protection Regulation (European Union) (“GDPR”) (Art. 20).
For convenience, I have reproduced the relevant provisions below (emphasis supplied); readers may benefit from referring back to the bare text wherever necessary.
Personal Data Protection Bill (India)
“26. Right to Data Portability. —
(1) The data principal shall have the right to—
(a) receive the following personal data related to the data principal in a structured, commonly used and machine-readable format—
(i) which such data principal has provided to the data fiduciary;
(ii) which has been generated in the course of provision of services or use of goods by the data fiduciary; or
(iii) which forms part of any profile on the data principal, or which the data fiduciary has otherwise obtained.
(b) have the personal data referred to in clause (a) transferred to any other data fiduciary in the format referred to in that clause.
(2) Sub-section (1) shall only apply where the processing has been carried out through automated means, and shall not apply where—
(a) processing is necessary for functions of the State under section 13;
(b) processing is in compliance of law as referred to in section 14; or
(c) compliance with the request in sub-section (1) would reveal a trade secret of any data fiduciary or would not be technically feasible. ”
General Data Protection Regulation (European Union)
Right to data portability
“1. The data subject shall have the right to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used and machine-readable format and have the right to transmit those data to another controller without hindrance from the controller to which the personal data have been provided, where:
(a) the processing is based on consent pursuant to point (a) of Article 6(1) or point (a) of Article 9(2) or on a contract pursuant to point (b) of Article 6(1); and
(b) the processing is carried out by automated means.
Prior to analyzing the distinctions between the legislations, let us briefly understand the concept of data portability and its significance. Data portability requires entities that process personal data to provide the collected data in a format that is interoperable across platforms. This prevents entities in control of personal data (e.g. Facebook) from holding it hostage to a particular platform by refusing to provide it in a format that can be used elsewhere (e.g. on other social media websites).
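As a rough illustration of what “a structured, commonly used and machine-readable format” achieves in practice, consider a JSON export (the record and field names below are hypothetical): any receiving platform can reconstruct the data without bespoke parsing, which is the interoperability the right protects.

```python
import json

def export_personal_data(record):
    """Serialise a user's data in a structured, machine-readable format (JSON)."""
    return json.dumps(record, indent=2, sort_keys=True)

def import_personal_data(payload):
    """A different platform can losslessly reconstruct the exported record."""
    return json.loads(payload)

# Hypothetical record combining data 'provided to' and 'generated by' a fiduciary.
record = {
    "name": "A. Principal",
    "email": "principal@example.com",
    "posts": ["first post", "second post"],
}

payload = export_personal_data(record)
restored = import_personal_data(payload)
```

A proprietary binary dump, by contrast, would satisfy a literal “hand over the data” obligation while leaving the data effectively hostage, which is why both provisions insist on the format and not merely on delivery.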
Firstly, I submit that the exception of technical infeasibility (present in both the GDPR and the PDPB, in the provisions quoted above) deserves criticism. As Lawrence Lessig argued in Code 2.0, technological infeasibility should not be allowed to override a value system. To develop the argument, let us (a) delve into the statutes’ meaning and then (b) analyse the problems with technological infeasibility as an exception to the right.
Both legislations frame technological feasibility as an exception to the right. In doing so, they place technology in a position conceptually superior to the right itself, because the absence of technical feasibility renders the right nugatory. Let us now move on to (b), the problems with technological infeasibility as an exception.
Our laws espouse a certain value system: an aggregate of various values and ideals. Once a society decides to embrace a particular value system (whatever it may comprise), technology should not be allowed to hinder or steer its furtherance. Allowing technological infeasibility to render a right redundant could divorce technological development from the embraced value system.
The question boils down to this: should technology be allowed to circumscribe the value system, or should the value system render the technology invalid? I argue, as did Lawrence Lessig, for the latter. Having the value system invalidate inconsistent technologies through law (for example, by removing technical infeasibility as an exception) would force engineers to develop technologies consistent with the value system society has chosen. Such a model orients technological development towards the value system society has chosen for itself and cherishes, rather than towards a parallel one. In other words, engineers should bear the burden of structuring technology according to the ideals chosen by society, rather than the other way round, where society adapts to the values and ideals furthered by the technology developed by engineers (a fraction of society).
Moving on, the EU legislation explicitly clarifies that the right does not exist in abrogation of others’ rights and freedoms. The Indian PDPB, however, offers no clarification on the enforcement of an individual’s data portability right vis-à-vis the rights and freedoms of others. Consequently, courts will have to make judgment calls as and when conflicts arise between a “data principal’s” (the natural person to whom the personal data relates; the GDPR uses the term “data subject” instead) right to data portability and the rights or freedoms of others.
Lastly, a difference can be noticed in how each legislation frames the scope of the right. The PDPB entitles data principals to receive and transfer their personal data only where the processing is “automated” (S. 2(7) defines “automated means” as equipment capable of operating automatically in response to instructions given for the purpose of processing data). Narrowing the right further, the PDPB provides exceptions for the protection of trade secrets, for technical infeasibility, for compliance with another law and for the furtherance of essential State functions. The GDPR likewise limits the right to processing carried out by automated means, but additionally requires that the processing be based on consent (Art. 6(1)(a); “consent” is defined under Art. 4(11)) or on a contract (Art. 6(1)(b)). The GDPR also provides a slightly different set of exceptions, namely technical infeasibility and processing necessary for a task carried out in the public interest or in the exercise of official authority vested in the controller, and it clarifies that the exercise of the right of data portability is without prejudice to the right to be forgotten (Art. 17). It is therefore quite clear that the legislations demarcate the scope of the data portability right differently, though a more rigorous analysis is required to ascertain the actual ambit of the provisions.
Section 40 of the Data Protection Bill mandates that every data fiduciary store a copy of the personal data of all data principals in India, and requires companies to process and store all critical personal data only on servers or in data centres located in India. This requirement is colloquially known as ‘data localisation’. The Committee’s Report justifies data localisation on several grounds, such as easier enforcement, increased compliance and reduced foreign surveillance, among others. This post will briefly set out the reasons provided by the Report, critically evaluate the claims and arguments made by the Committee, and conclude by arguing against a data localisation requirement.
Why did the Committee choose mandatory data localisation?
This section sets out, at the cost of reiteration, the arguments presented in the Report. The Committee begins its case for data localisation on the ground that law enforcement agencies (LEAs) require access to information to detect crimes and to gather evidence for prosecution. A local copy of personal data, according to the Report, would allow for quicker and more efficient enforcement of Indian law. Presently, eight of the ten most accessed websites in India are based in the United States, and none of these companies has an office in India. Acquiring data from any of them is a long and onerous process: availability depends on the existence of a bilateral or multilateral Mutual Legal Assistance Treaty (MLAT) in this specific regard, and requests pass through several agencies, from the Indian courts to the Ministry of External Affairs to the courts of the foreign jurisdiction, before reaching the company. This, according to the Report, is a highly bureaucratic route to the information, if the requests are fulfilled at all.
The Report also argues that localisation would reduce dependency on the undersea fibre-optic cable network, thus reducing vulnerability to attacks. Holding critical information about the nation within the country’s borders, it says, is necessary for the healthy functioning of the economy, among other things. The AI ecosystem that the NITI Aayog wishes to develop would receive a massive boost from the move, since the growth of AI is directly linked to the amount of data available within the jurisdiction, which will be necessary for developing local infrastructure. The creation of a digital infrastructure, according to the Report, requires this move.
It also argues that the chances of foreign surveillance of Indians would reduce. After the Snowden revelations, the lack of safeguards for the data of foreigners became clear, since United States legislation does not protect the data of foreigners stored with companies there.
The Report concedes that the cost of storing data in India may be significant, if not for large service providers then for small and medium-sized ones. It argues, however, that the costs will be worth bearing, first because of the size of the Indian market, and second because the cloud storage options available to smaller companies would increase if all data were stored in India.
Lastly, the Report argues that fears of online censorship and a chilling effect on free speech are entirely misplaced. There are other methods of restricting free speech, such as internet shutdowns; for localisation to enable such restrictions, it would have to operate in the context of a dysfunctional data law coupled with a governmental intention to abuse it. The image of a completely walled internet, similar to China’s, is a caricature of the post-localisation web; the internet in every country has always been shaped by that country’s local context.
What are the problems with mandatory data localisation?
While the arguments relating to enforceability are well taken, mandatory data localisation suffers from several problems, as this section will argue. Better enforcement of internet-related offences is certainly a benefit, but the overall benefits of the move do not justify its introduction in India.
The argument that India’s dependence on fibre-optic cables will be reduced, allowing it to function in times of crisis, does not hold good. It is not disputed that critical information such as people’s medical, financial and biometric data should be kept within the country, and that such information is necessary for the better functioning of the country, especially in times of crisis. However, forcing companies to store local copies of personal data does not serve that purpose at any level: it would not reduce dependency on the cables, since the critical infrastructure used to process and compute the data would remain available only at the company’s headquarters. The information stored in India would be completely futile.
The AI ecosystem is unlikely to benefit from a mandatory data localisation policy; if anything, it could deter newer companies from coming to test their products in India. Data stored in India by a company remains owned by that company; it does not become transferable to another merely because it is stored in India. An AI developed by any company will, in collecting data, depend on that company’s proprietary software and hardware, or on agreements with other companies to transfer information to it, and it will learn only from the data provided to it according to the company’s economic capabilities and interests. A company wishing to enter the Indian market will do so either by gathering data itself or by buying information from companies that hold data on Indians, and this transaction can be completed with or without the company having to store data about Indians in India.
Smaller firms will certainly have many cloud storage services in India to choose from, but this will be an additional cost on them regardless. The way smaller companies find a foothold in markets abroad is through organic growth: they first gain visibility in a market, then develop a user base, and then choose whether to develop products for that market if operating there seems viable. This pragmatic schema lets their services reach the largest number of people without any additional burden on their operations. By analogy, if a car manufacturer had to set up an office in India even if only one of its cars was sold there, it would simply avoid selling to Indians, given the costs incurred relative to the benefits. Localisation is similarly likely to deter companies from wanting to operate in India.
The argument about foreign surveillance is flawed, since it looks at only one side of the Snowden revelations. The revelations certainly showed that data stored in the United States could be accessed at any point, but they equally showed that data stored in foreign jurisdictions was vulnerable. Lastly, fears about the government’s ability to censor content and chill speech are not misplaced. The government may already have several tools in its arsenal to restrict discourse on the internet, but that does not justify handing it more. Enforcement of laws through MLATs is likely to remain difficult, yet localisation is a short-sighted response, especially given the context in which the government has consistently and unreasonably used internet shutdowns, among other means, to curb free speech.
The Internet as originally envisaged was developed to ensure a free flow of information, and this formed the basis of the entire architecture of the network in its initial days. The way these systems have since developed has certainly changed how the internet is negotiated and functions, but the primary infrastructure connecting multiple people remains. This structure kept the internet efficient even at scale and allowed it to grow to the extent it has today. Forcing companies to store data in India is likely to disrupt this model and prevent any viable, organic growth of the network.
Companies in western countries had a head start in internet usage because the internet developed there; consequently, they were also able to develop better hardware and software to secure the data stored on their servers. This infrastructure cannot suddenly be transferred elsewhere: the transfer of such technology may take years, owing not only to legal and economic barriers but also to the logistical barriers of setting up such infrastructure in India. The economic costs themselves will disincentivise companies from investing money in India; the Indian market may be large, but that alone is not enough for anyone to invest in developing such infrastructure. Companies may be especially reticent about moving if the electrical and technological infrastructure is not well developed.
 Thomas Schultz, ‘Carving up the Internet: Jurisdiction, Legal Orders, and the Private/Public International Law Interface’ (2008) 19 Europe. J. Int’l L. 779
 X. Wu, X. Zhu, G. Wu and W. Ding, “Data mining with big data,” in IEEE Transactions on Knowledge and Data Engineering, vol. 26, no. 1, pp. 97-107, Jan. 2014.
 A. L. Buczak and E. Guven, “A Survey of Data Mining and Machine Learning Methods for Cyber Security Intrusion Detection,” in IEEE Communications Surveys & Tutorials, vol. 18, no. 2, pp. 1153-1176, Secondquarter 2016.
 Delivering Digital Infrastructure – Advancing the Internet Economy Report, April 2014, World Economic Forum.
 Helena Ursic; Bart Custers, Legal Barriers and Enablers to Big Data Reuse, 2 Eur. Data Prot. L. Rev. 209 (2016).
 Kritika Bhardwaj, Data localisation must go, it damages the global internet, https://www.hindustantimes.com/analysis/data-localisation-must-go-it-damages-the-global-internet/story-Aah1052ExFq6Ylcb9BQ4jJ.html, Hindustan Times, August 03, 2018
 Reema Shah, “Law Enforcement and Data Privacy – A Forward-Looking Approach” (2015) 125:2 Yale LJ 543.
 Hogan, Mél, and Tamara Shepherd. “Information Ownership and Materiality in an Age of Big Data Surveillance.” Journal of Information Policy 5 (2015): 6-31.
 T. Maurer, I. Skierka, R. Morgus and M. Hohmann, “Technological sovereignty: Missing the point?,” 2015 7th International Conference on Cyber Conflict: Architectures in Cyberspace, Tallinn, 2015, pp. 53-68.
 Gautam Bhatia, Free Speech Watch, https://indconlawphil.wordpress.com/free-speech-watch/, Indian Constitutional Law and Philosophy; Alexander Plaum, ‘The Impact of Forced Data Localisation on Fundamental Rights’ (Access now 4 June 2014) <https://www.accessnow.org/the-impact-of-forced-data-localisation-on-fundamental-rights/> accessed 15 Feb 2018.
 Monroe Price and Stefaan Verhulst, ‘The concept of self-regulation and the internet’ in J. Waltermann & M. Machill (Eds.), Protecting our children on the internet: Towards a new culture of responsibility (Bertelsmann Foundation Publishers 2000) <https://repository.upenn.edu/asc_papers/142/> accessed 15 Feb 2018.
 Erica Fraser, Data Localisation and the Balkanisation of the Internet, (2016) 13:3 SCRIPTed 359 <https://script-ed.org/article/data-localisation-and-the-balkanisation-of-the-internet/> accessed 15 Feb 2018.
The first post in the series can be found here. Keep watching this space for more posts!]
Transparency and accountability in a government and its administration are an indispensable part of a participatory democracy. Information is the oxygen for the survival of a democracy. The Right to Information Act was passed in 2005, replacing the Freedom of Information Act, 2002, so that every citizen has the right to access information held by public authorities. RTI is intrinsic to good governance and a necessity for democratic functioning.
The Data Protection Bill and the accompanying report of the B.N. Srikrishna Committee, released in 2018, propose amendments to the Right to Information Act, including a change to Section 8(1)(j) of the Act. The proposed amendment reads:
“information which relates to personal data which is likely to cause harm to a data principal, where such harm outweighs the public interest in accessing such information having due regard to the common good of promoting transparency and accountability in the functioning of the public authority;
Provided, disclosure of information under this clause shall be notwithstanding anything contained in the Personal Data Protection Act, 2018; Provided further, that the information, which cannot be denied to the Parliament or a State Legislature shall not be denied to any person.”
The reason given in the report for this change is that the current version of Section 8(1)(j) does not indicate what would constitute an unwarranted invasion of privacy, and that the amendment would solve this problem. The report states: “A lot of information sought from a public authority may contain personal data of some kind or another. Further a strict interpretation of purpose limitation may give rise to the inference that any disclosure other than for the purpose for which the personal data was submitted would lead to an unwarranted invasion of privacy.” According to the report, information should not be disclosed in ‘exceptional circumstances,’ where the likelihood of harm from the disclosure outweighs the common good of transparency and accountability in the functioning of public authorities. The report claims that the proposed amendment would strike a more precise balance between the right to information and the right to privacy.
However, it is submitted that the amendment is problematic for a number of reasons, which are listed below –
If these tests are satisfied, then the information has to be disclosed. It is important to understand that the degree of privacy available to public officials in the performance of public duties is lower than that available to a private person in ordinary circumstances. In the specific context of public officials performing public duties, therefore, the right to information must carry greater weight than the right to privacy. The proposed amendments, however, clearly elevate the right to privacy to a higher pedestal than the right to information, which runs counter to the decision of the Supreme Court. The amendments should therefore not be implemented. Apart from failing to adhere to constitutional principles as discussed above, the amendment is very problematic in itself, as discussed below.
This section, if used properly, contains enough safeguards within itself to prevent the disclosure of unnecessary information that could infringe upon the privacy of an individual, and it meets the standards of privacy that have recently developed. However, the section has been misused and weaponised against the very purpose of the RTI Act, which was to curb corruption and create transparency. This is because of the Supreme Court judgment in Girish Ramachandra Deshpande v. CIC, wherein the court ignored this proviso and precedents and held that the assets and details of a public servant constituted personal information. This case has had a chilling effect on several subsequent cases, such as R.K. Jain v. Union of India and Canara Bank v. C.S. Shyam, to name a few. Thus, the RTI Act, which was capable of balancing the right to information against public officials' right to privacy, was distorted by the Girish Ramachandra Deshpande case, diluting the Act and diminishing the purpose of its creation.
Firstly, the phrase “relates to personal data” makes the ambit of information that can fall within this exemption extremely wide and vague. The Draft Bill does not define ‘personal data.’ As a result, anything and everything can be treated as relating to personal data. The consequence is that every time information is sought about a public official, this section will come into play and the information could be withheld from disclosure. Information would be withheld without any violation of privacy simply because the amendment uses as broad a phrase as ‘personal data’ without defining it. This non-disclosure without any infringement of privacy would violate the right to know enshrined in Article 19(1)(a) of the Constitution.
Secondly, the standard has been set at “likely to cause,” which is extremely low. The mere possibility that disclosure of information may cause harm is too low a standard to follow. It will deter PIOs from disclosing any information, because there might always be some remote possibility of causing harm; they would become overcautious and reluctant to disclose anything at all. So low a standard cannot be applied to an Act like the RTI Act, which is fundamental to bringing transparency and accountability to the functioning of the government. Using such a low standard will deny citizens the right to information.
Thirdly, the word “harm” is defined very broadly by the Draft Bill in Section 3(21). It includes ‘bodily injury’, ‘mental injury’, and ‘loss of reputation’, among other things. In practice, information exposing corrupt activities will inevitably bring a loss of reputation to the person concerned. Making this a ground for not exposing a corrupt official makes no logical sense; nor can ‘mental injury’ be such a ground. These grounds have no rational basis. It is equivalent to saying that someone who commits a crime should not be punished because punishment would affect him physically and mentally and lower his reputation in society.
Hence, the proposed amendment should not be accepted, as it would completely water down the RTI Act, 2005 and render it ineffective.
Fourthly, on the one hand, the proposed amendment sets the threshold of “likely to cause harm to a data principal, where such harm outweighs the public interest,” which must be met when determining whether there should be disclosure. On the other hand, it retains the proviso to the old Section 8(1)(j) of the RTI Act, which says that “information, which cannot be denied to the Parliament or a State Legislature shall not be denied to any person.” Hence, while the first part of the proposed amendment sets a threshold different from that of the previous Section 8(1)(j), the “acid test” of the retained proviso is that of Section 8(1)(j) itself. The proposed amendment is therefore internally contradictory, containing two different thresholds: one borrowed from the old Section 8(1)(j) and the other newly introduced.
Further, the PIOs under the RTI Act are not judicially trained, and in practice it is extremely difficult for a ground-level PIO to grasp the complex concept of privacy and the jurisprudence associated with it. The contradictory thresholds set by the proposed amendment further complicate an already complex process of interpretation for the PIOs. This complication will lead to unintended consequences: a PIO may disclose information that was intended to be denied, and deny information that was intended to be disclosed. The differing thresholds will thus defeat the purpose of the amendment itself, and the amendment must be corrected to that extent.
In keeping with this pro-disclosure jurisprudence is the recent case of Lok Prahari v. Union of India, wherein the court ruled in favour of asset disclosure by election candidates, saying that, “If assets of a Legislator or his/her associates increase without bearing any relationship to their known sources of income, the only logical inference that can be drawn is that there is some abuse of the Legislator’s Constitutional Office. Something which should be fundamentally unacceptable in any civilized society and antithetical to a constitutional Government. It is a phenomenon inconsistent with the principle of the Rule of Law and a universally accepted Code of Conduct expected of the holder of a public office in a Constitutional democracy.” (emphasis added)
It is thus seen that the court has consistently upheld the disclosure of assets of not only election candidates but also their associates. Obviously, if citizens have a right to know about the assets of those who want to become public servants, the threshold of their right to information about those who are already public servants cannot be lower. Keeping this in mind, the proposed amendment goes against the established threshold regarding asset disclosure, and it is thus proposed that it must be modified to take firmly established jurisprudence into account.
One of the exceptions listed under Article 19(2) is “defamation”. “Defamation” includes certain defenses like truth, fair comment, privilege etc.
In this light, it is important to note that the definition of “harm” under Section 3(21) of the Draft Bill (which also applies to the proposed amendment to the RTI Act) includes “loss of reputation” and not “defamation”. “Loss of reputation” is much broader than “defamation”, simply because the defenses that apply to defamation do not apply to it. Thus, the exception imposed by the proposed amendment is broader than the exception set out under Article 19(2) and is, to that extent, unconstitutional.
Hence, it is proposed that for the purposes of the RTI Act, under which disclosure of information can be denied only on the grounds in Article 19(2), the exception of “loss of reputation” under “harm” should be changed to “defamation”, in line with Article 19(2).
Therefore, for all the reasons mentioned above, namely that the Bill fails to harmonize the two rights, that the words and phrases used in the amendment are problematic, that the amendment is internally contradictory, that it adopts a standard lower than that set by the Supreme Court in right to information and public disclosure cases, and that its scope wrongfully extends beyond the restrictions mentioned under Article 19(2), it is proposed that the RTI Act should not be amended and that Section 8(1)(j) should remain as it presently is.
However, it is important to mention on a broader note that contrary to the general perception, the Right to Privacy and the Right to Information are complementary and not contradictory to one another and must be presented as being so in the future in keeping with the Constitution and for the good of all the people in the country.
 Thalapallam Ser. Coop. Bank Ltd. v. State of Kerala, (2013) 16 SCC 82.
 Supreme Court of India v. Subhas Chandra Agarwal, (2011) 1 SCC 496.
 Girish Ramachandra Deshpande v. CIC, 2012 (119) AIC 105 (SC).
 R.K. Jain v. Union of India, 2013 (10) SC 430.
 Canara Bank v. C.S. Shyam, 2007 (58) AIC Ker 667.
 State of U.P. v. Raj Narain, AIR 1975 SC 865.
 Union of India v. Association for Democratic Reforms, (2002) 5 SCC 294.
 People’s Union for Civil Liberties v. Union of India, (2003) 4 SCC 9.
 Lok Prahari v. Union of India, AIR 2018 SC 1041.
At the outset, the Personal Data Protection Bill (hereinafter ‘the Bill’) makes clear whom its provisions will affect. Section 2 of the Bill states that it applies to the processing of any personal data by the State, any Indian company, any Indian citizen, or any other person incorporated under Indian law.
It is therefore evident that the Bill has vertical application.
Vertical Application of the Bill
Section 3 of the Bill defines ‘data fiduciary’, which includes both the State and any person or company; ‘State’ is defined as per Article 12 of the Constitution. Under Section 4, the Bill states that data processing must be done in a ‘fair and reasonable’ manner that respects the privacy of the data principal. Naturally, an important element of this is consent, which is elaborated upon in Chapter III of the Bill. Section 12 lists the conditions that must be fulfilled for the consent of the data principal to be valid. However, the Bill exempts the State from the obligation of taking consent from the data principal in certain situations, which are enumerated below.
Section 13 of the Bill states that personal data may be processed if it is necessary for a function of Parliament or any State legislature, or for a function of the State that involves providing a benefit to the data principal, or certifying, licensing or permitting any action of the data principal. Section 19 likewise provides an exception to consent for sensitive personal data on similar grounds. Section 17 allows non-consensual data processing for any ‘public interest’ or for the ‘prevention or detection of any unlawful activity’. Section 42 covers the processing of data for the security of the State: if such processing is necessary and proportionate to the interests achieved, it is exempt from several key aspects of the Bill, such as the consent of the data principal, the rights of data principals, and transparency and accountability, the only caveat being that a law and an established procedure must exist.
While it is obviously favorable for the Bill to make clear when the State is legally entitled to take data from individuals, it does not do enough to define the situations in which the State can exercise this power, which raises concerns about possible misuse of the blanket exception to the data principal's consent. The use of the words ‘necessary’ and ‘strictly necessary’ in the Bill offers little assurance as to when the State can bypass the conditions laid out in Section 12. Firstly, neither word is concretely defined, making it unclear what kind of situation would allow the State to process personal data. Furthermore, no clear distinction is drawn between what constitutes ‘necessary’ and ‘strictly necessary’, even though the standard for processing sensitive personal data must be substantially higher than that for non-sensitive personal data. The Bill defines ‘sensitive personal data’ under Section 3 as data relating to a data principal's passwords, sex life, genetic data, caste or tribe, religious beliefs, etc., which merits a much higher standard of protection; the situation must entirely demand the processing of such data for it to be legitimate. As a general principle, personal data should only be processed with the express consent of the individual. It is therefore imperative that there be at least a broad definition of what constitutes a ‘necessary’ situation, so that there is a clear framework governing when the State can take data. This is lacking in the current Data Protection Bill: the present usage of the word is vague and allows for the arbitrary exercise of authority. Experts have commented that tighter provisions that do not dilute the rights of data principals would be a welcome addition to the Bill.
The test of necessity is not a new concept in data collection. It is an essential requirement with which any proposed measure of data collection must comply, and an essential factor in assessing the lawfulness of data processing. Under Article 52 of the Charter of Fundamental Rights of the European Union, which lists the conditions that limitations on the protection of personal data must satisfy, the necessity of such a limitation is an important factor. Necessity has been defined as ‘the need for a combined, fact based assessment of the effectiveness of the measure for the objective pursued and of whether it is less intrusive compared to other options for achieving the same goal’, while ‘strict necessity’ applies in situations where violations of fundamental rights occur. In this jurisdiction, the test itself is a four-pronged checklist that examines the limitations a proposed measure places on rights vis-à-vis the objective of the measure in question. Such a definition of necessity is absent from the Data Protection Bill, and it should be incorporated in order to promote transparency in how the State collects data.
Data should only be collected by the State for the performance of regulatory functions or functions intrinsically linked to governance. A wide ambit for the non-consensual collection of data would defeat the purpose of the Bill and could result in the misuse of sensitive data by the State. An obvious instance of the State taking data from individuals is the Aadhaar scheme. At present, Aadhaar is tied to many of the benefits that we as citizens are entitled to: health care benefits, SIM cards, IT returns, and so on. PayTM even made the linking of Aadhaar to its databases mandatory for continued usage. The aforementioned sections provide the government with a blanket exception for the collection and processing of personal data, and in the absence of a clear definition of necessity, the possibility of widespread collection of private data is apparent. The simultaneous existence of the seemingly never-ending reach of Aadhaar and the requirement of ‘fair and reasonable’ data processing is a tension that the Bill must resolve, by bringing in amendments that ensure people are aware of how their data can be used by the government.
 Section 2, Personal Data Protection Bill 2018.
 Section 4, Personal Data Protection Bill, 2018.
 Section 13, Personal Data Protection Bill, 2018.
 Section 17, Personal Data Protection Bill, 2018.
 Section 42, Personal Data Protection Bill, 2018.
 Section 3, Personal Data Protection Bill 2018.
 Shaikh Zoaib Saleem, 3 things to know about the new draft law on data privacy, livemint, https://www.livemint.com/Money/DiBBSl9e4ybGBI5Me0bWEI/3-things-to-know-about-the-new-draft-law-on-privacy.html
 European Data Protection Supervisor, Assessing the necessity of measures that limit the fundamental right to the protection of personal data: A Toolkit (available at https://edps.europa.eu/sites/edp/files/publication/17-06-01_necessity_toolkit_final_en_0.pdf)
 Necessity & Proportionality, https://edps.europa.eu/data-protection/our-work/subjects/necessity-proportionality_en
 Praavita, Can the Aadhar Act and a Data Protection Act Coexist?, The Wire, https://thewire.in/law/can-the-aadhaar-act-and-a-data-protection-act-coexist
 Section 4, Personal Data Protection Bill 2018.
With the Supreme Court upholding the constitutional validity of the Aadhaar Act and scheme on 27 September 2018, a significant impact will be felt by the Data Protection Bill. The larger aim of a Bill like the Data Protection Bill is to recognize that an individual's data, and their rights over it, are of the utmost importance. With the Apex Court upholding the validity of Aadhaar, albeit with certain caveats, a thorn is created in the larger realization of the Bill's goal. In principle, the judgment's limitation of Aadhaar's role would secure rights in terms of who uses the available data and the extent of private-party interference. However, the fact that biometric data collection remains a valid process creates doubts about the conflicting aims of data protection and Aadhaar.
The sheer amount of private and confidential data amassed in one single database has given rise to concerns over data security and privacy. Many critics have pointed out that the use of biometric data instead of smart cards is a mechanism of choosing surveillance over the use of e-governance technologies.
1. Consent, Aadhaar and Data Protection
The idea of consent does not present itself when a data subject is mandatorily required to register for the Aadhaar programme. The Supreme Court held that Aadhaar is essential for filing Income Tax Returns (ITR) and for obtaining a new PAN card. The recent judgment thus also makes the linking of Aadhaar to PAN mandatory, which again takes away any choice in giving out personal data. So while in theory the programme remains voluntary, in practice it simply is not, as most services are linked to the PAN card, including, crucially, opening a bank account.
Especially with reference to the provision of subsidies and benefits, Aadhaar has become ‘the’ identification metric. Failure of Aadhaar authentication has resulted in the loss of the subsidy or benefit concerned, and the government has refused to accept other forms of identification as an alternative. Therefore, the idea of consent embodied in Section 12 of the Draft Bill is violated. Even if at the central level Aadhaar is made non-mandatory for certain services, many State-level schemes remain linked solely to Aadhaar, most painfully, at times, in denying education to students.
2. The Aadhaar Infrastructure and Purpose Limitation
Section 5 of the Data Protection Bill is the ‘purpose limitation’ clause. Section 5(1) states that personal data shall be processed only for purposes that are ‘clear, specific and lawful’. A very obvious counter to this is presented by Aadhaar. The nexus the Government draws upon to justify Aadhaar is its linkage to subsidy and welfare benefit schemes. While Aadhaar has become mandatory for these, there is no limit on the range of purposes for which making Aadhaar mandatory remains legitimate. The very creation of an Aadhaar number associated with an individual involves the individual giving up certain rights over their biometric data and physical markings. Even if Aadhaar were made for the singular purpose of accruing social welfare benefits, the fact that every new scheme may demand it makes purpose determination difficult, if not impossible. The scope available to the Government for extracting information under the guise of Aadhaar is notably expanded.
The Aadhaar Act will have to be amended in order to ensure the autonomy of the UIDAI.
The Aadhaar project engages in a balancing exercise between the individual's right to privacy and the state's interest in intruding upon that privacy, but ultimately comes out heavily in favor of the latter. While the idea of a data protection Act appears to be based upon ensuring a fair and meaningful exercise of the right to privacy, this cannot be achieved unless the unjustifiable privacy incursions of Aadhaar are adequately dealt with. The Bill includes several exceptions to the requirement of consent for the processing of data, some of which pertain, inter alia, to the provision of welfare benefits and not merely to state security (Section 42) or the prosecution of offenses. This would bolster the functioning of Aadhaar to such an extent as to abrogate a (vulnerable) data subject's expectation of privacy.
Sections 13 and 19 of the draft Bill are particularly relevant in this regard. While Section 13 allows for the processing of personal data even without consent for the exercise of “any function, for the delivery of services or benefits or issuance of certificates”, Section 19(b) in a similar vein allows for the processing of sensitive personal data (which includes biometric data) where it is “strictly necessary” for any function of the State authorised by law for the provision of any service or benefit to the data principal. The use of such broad and sweeping terms is reminiscent of the broad and sweeping scope of Aadhaar itself. Similarly, Section 17 allows processing for “reasonable purposes”, which, as per the accompanying illustrative list, include uses such as credit scoring and debt recovery, data for which could easily be drawn from the Aadhaar database, which even after the judgment intrudes into multiple areas of everyday life. This merely strengthens a Data Protection Authority (DPA) that is already vested with excessive powers. By providing this increased scope for interference with data and these exceptions to the personal right to privacy, the scope for arbitrary action increases. Even in the presence of remedies, there will inevitably be a number of data privacy casualties as a product of this nearly unlimited power.
The key question to be answered in this regard is whether Aadhaar is, in practice, necessary to carry out the functions of the State, and this remains extremely contentious (particularly in light of the purpose limitation laid out in Section 5 of the draft Bill). This is all the more troubling given that, under Section 32, notification of a data breach is required only where the breach is likely to cause ‘harm’ to the data principal.
The draft Bill also states that personal and sensitive personal data can be processed if in accordance with an explicitly mandated Indian law, and this clearly justifies the Aadhaar in its entirety now that the court has validated its existence. Alarmingly, Section 45 does not discuss the requirement of consent when it comes to the large-scale use of data for research or archival purposes (seen to be a ‘national treasure’), which clearly gives further credence to the idea of a project premised upon mandatory collection of personal data.
These exceptions provide greater scope for surveillance, an issue the Bill remained silent on with regards to the Aadhaar.
The draft Bill appears to have strengthened the status of the UIDAI, particularly in matters of dispute settlement, by placing the burden on the data fiduciary, i.e. the UIDAI, to approach the courts. While the Committee report recognizes the need to ensure the autonomy of the UIDAI, adjudicatory power has been proposed to be granted to the UIDAI (in addition to that of other Adjudicating Officers), and at the same time the exclusivity of the UIDAI's right to file complaints has been maintained. This only strengthens the legitimacy of privacy incursions by the UIDAI, allowing it effective discretion over claims of data breaches.
[Ed Note: The following series of posts contains comments on the Srikrishna Committee Report and the Draft Data Protection Bill, 2018, made and compiled by students from NALSAR University of Law: Ankush Rai, Ashwin Murthy, Arvind Pennathur, Namratha Murugesan, Priyamvadha Shivaji, Shweta Rao, Sriram Kashyap, Vishal Rakhecha and Tanvi Apte. The comments have been uploaded on the Ministry of Electronics and Information Technology (MeitY) website.
The present post deals with comments on four issues that arise in relation to the Report and Draft Bill: a) vagueness, b) government interference, c) the data protection authority, and d) surveillance.