
Tech Law Forum @ NALSAR

A student-run group at NALSAR University of Law


Category: Privacy

Continued Use of Section 66A of the Information Technology Act 2000

Posted on June 12, 2019 by Tanvi Apte

The “Existence” of a Non-Existent Law and the Broader Issues it Raises

The Information Technology Act 2000 (hereinafter referred to as the “IT Act”), India’s nodal law on regulation of information technology, was significantly amended in 2008 in order to plug certain loopholes in the original Act as well as accommodate further technological development within its legal framework. Among other things, this 2008 amendment to the Act introduced Section 66A, which essentially made sharing of “grossly offensive”, “insulting” or “menacing” information (Read: criticism of political parties) through electronic media a criminal offence.

Read more

Conundrum of Right to Be Forgotten: An Analysis of The Slippery Slope: To Forgive, Forget or Re-Write History

Posted on May 5, 2019 by Tech Law Forum @ NALSAR

[Ed Note : In a slightly longer read, Pranay Bhattacharya, a second-year student of Maharashtra National Law University (MNLU) Aurangabad, talks about the origins and development of the “Right to be Forgotten,” using this as a background to critically analyze the right as present in India’s Draft Personal Data Protection Bill 2018.]

“Blessed are the forgetful, for they get the better even of their blunders.”

Read more

Do not ‘Offend, Shock, or Disturb’: Destroying the Raison d’être of Free Speech

Posted on May 3, 2019 by Tech Law Forum @ NALSAR

[Ed Note : In a post that has previously been published here, Hardik Subedi of NALSAR University of Law offers a scathing critique of Nepal’s New Information Technology Bill. Do read to find out more!]

“They claim that they have brought in democracy overthrowing monarchy,

Read more

The Dark Web : To Regulate Or Not To Regulate, That Is The Question.

Posted on December 29, 2018 by Shweta Rao

[Ed Note : In an interesting read, Shweta Rao of NALSAR University of Law brings us up to speed on the debate regarding regulation of the mysterious “dark web” and provides us with a possible way to proceed as far as this hidden part of the web is concerned.]

Human Traffickers, Whistleblowers, Pedophiles, Journalists and Lonely-Hearts Chat-room participants all find a home on the Dark Web, the underbelly of the World Wide Web that is inaccessible to the ordinary netizen.  The Dark Web is a small fraction of the Deep Web, a term it is often confused with, but the distinction between the two is important.

Read more

Dr. Usha Ramanathan’s Talk on the UIDAI Litigation

Posted on December 24, 2018 by Tech Law Forum @ NALSAR

[Ed Note : The following post is based on Dr. Ramanathan’s enlightening talk at NALSAR University of Law, Hyderabad. It has been authored by Karthik Subramaniam and Yashasvi Raj, first year students of the university, who, in a slightly longer but informative read, aptly put forth Dr. Ramanathan’s views on the Aadhar issue and its judicial journey.

Dr. Usha Ramanathan, an internationally recognized legal expert, is currently a research fellow at the Centre for the Study of Developing Societies and a professor at the Indian Law Institute. Since 2009, she has consistently brought forth the loopholes in the Aadhar project, exposing its shoddy functioning.]

Read more

Bare Text Comparison of the Personal Data Protection Bill 2018 with the General Data Protection Regulation : Part II – Right to Confirmation and Access

Posted on December 1, 2018 by Prateek Surisetti

INTRODUCTION TO SERIES

The Personal Data Protection Bill has garnered a fair degree of attention in the last few weeks. For the uninitiated, a brief description of the Bill and its significance can be found here.

The purpose of this series is to analyze the bare text of the Data Principal Rights espoused in the Bill (Chapter VI), namely the Right to Confirmation and Access, Right to Correction, Right to Data Portability and the Right to be Forgotten, in light of the text used in the European legislations to espouse the same values. Each post will deal with each of the above rights.

Part I of the series can be accessed here.

INTRODUCTION TO POST

Over the course of the ensuing section, I shall contrast the text of the Confirmation and Access provisions of the Personal Data Protection Bill (PDPB) (India) (S. 24) with the corresponding provisions of the General Data Protection Regulation (GDPR) (European Union) (Art. 15).

For the purposes of convenience, I have reproduced the relevant provisions below. (Emphasis supplied)

Personal Data Protection Bill (India)

“24. Right to confirmation and access. —

(1) The data principal shall have the right to obtain from the data fiduciary—

(a) confirmation whether the data fiduciary is processing or has processed personal data of the data principal;
(b) a brief summary of the personal data of the data principal being processed or that has been processed by the data fiduciary;
(c) a brief summary of processing activities undertaken by the data fiduciary with respect to the personal data of the data principal, including any information provided in the notice under section 8 in relation to such processing activities.

(2) The data fiduciary shall provide the information as required under this section to the data principal in a clear and concise manner that is easily comprehensible to a reasonable person. …”

General Data Protection Regulation (European Union)

“Article 15

Right of access by the data subject

  1. The data subject shall have the right to obtain from the controller confirmation as to whether or not personal data concerning him or her are being processed, and, where that is the case, access to the personal data and the following information:

(a)  the purposes of the processing;

(b)  the categories of personal data concerned;

(c)  the recipients or categories of recipient to whom the personal data have been or will be disclosed, in particular recipients in third countries or international organisations;

(d)  where possible, the envisaged period for which the personal data will be stored, or, if not possible, the criteria used to determine that period;

(e)  the existence of the right to request from the controller rectification or erasure of personal data or restriction of processing of personal data concerning the data subject or to object to such processing;

(f)  the right to lodge a complaint with a supervisory authority;

(g)  where the personal data are not collected from the data subject, any available information as to their source;

(h)  the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

  2. Where personal data are transferred to a third country or to an international organisation, the data subject shall have the right to be informed of the appropriate safeguards pursuant to Article 46 relating to the transfer.
  3. The controller shall provide a copy of the personal data undergoing processing. For any further copies requested by the data subject, the controller may charge a reasonable fee based on administrative costs. Where the data subject makes the request by electronic means, and unless otherwise requested by the data subject, the information shall be provided in a commonly used electronic form.
  4. The right to obtain a copy referred to in paragraph 3 shall not adversely affect the rights and freedoms of others.”

ANALYSIS

The right provides “data subjects”/“data principals” (the terms used by the GDPR and the PDPB respectively for the natural persons to whom the data relates) with the authority to demand certain information pertaining to their personal data from the “controllers”/“data fiduciaries” (the terms used by the GDPR and the PDPB respectively for the entities that determine the purpose and means of processing) dealing with that data. The right ensures that there is less information asymmetry between those to whom the personal data pertains and those who process or control it. Refer here for a summary.
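To make the structure of S. 24 concrete, here is a minimal, purely illustrative sketch of what a data fiduciary’s response to an access request might look like. The class and field names are my own invention, not drawn from the Bill; the sketch only mirrors the three heads of information in S. 24(1).

```python
from dataclasses import dataclass

# Hypothetical sketch of an access-request response under PDPB S. 24.
# All names (AccessResponse, DataFiduciary, records) are illustrative.

@dataclass
class AccessResponse:
    is_processing: bool          # S. 24(1)(a): confirmation of processing
    data_summary: list[str]      # S. 24(1)(b): brief summary of personal data
    activity_summary: list[str]  # S. 24(1)(c): summary of processing activities

class DataFiduciary:
    def __init__(self):
        # principal_id -> (personal data items, processing activities)
        self.records: dict[str, tuple[list[str], list[str]]] = {}

    def handle_access_request(self, principal_id: str) -> AccessResponse:
        data, activities = self.records.get(principal_id, ([], []))
        return AccessResponse(
            is_processing=bool(data),
            data_summary=data,
            activity_summary=activities,
        )

fiduciary = DataFiduciary()
fiduciary.records["principal-1"] = (
    ["name", "email address"],
    ["account creation", "newsletter delivery"],
)
resp = fiduciary.handle_access_request("principal-1")
print(resp.is_processing)   # True
print(resp.data_summary)    # ['name', 'email address']
```

S. 24(2)’s “easily comprehensible” requirement would, of course, govern how such a response is presented to the principal, not just its contents.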

At first glance, the Indian draft-legislation’s provision “Right to Confirmation and Access” (S. 24) might seem rather abstract and vague in comparison to its European counterpart, but closer inspection reveals that the two are quite similar. While the GDPR provides guidelines within a mostly self-contained provision, the PDPB’s S. 24 cross-references S. 8, which contains the list of necessary information disclosure obligations placed on the “data fiduciary”.

Though there exists a considerable degree of textual similarity between the two jurisdictions, certain distinctions in orientation are quite evident from the language of the provisions.

The Indian Bill, admirably, places explicit emphasis on the accessibility of disclosures: S. 24(2) mandates that disclosures be “easily comprehensible”. Wherever a power imbalance exists, those with access to expertise and other resources are better placed to abuse the system by indulging in complex legalities. Such statutory protections reduce the likelihood of resource-rich “fiduciaries” (with access to expertise and infrastructure) using complexity to overwhelm citizens incapable of processing technical information.

Furthermore, the Indian draft-legislation requires a “brief summary” (necessarily disclosing the statutorily prescribed information), as opposed to its European counterpart, which doesn’t place any such requirement. The legislative intent behind the same seems to be consistent with the logic of accessibility (prevent provision of information that cannot be processed meaningfully) mentioned above.

Listing the specific data that needs to be disclosed could enable “fiduciaries” to utilize the provision as an avenue to avoid disclosure of other unlisted, but relevant information. I submit that an additional sub-section requiring disclosure of all relevant information over and above the statutorily mandated disclosures (a general overarching clause, in addition to the prescribed disclosure requirements) would have tilted the balance favourably towards data privacy.

Additionally, the Indian Bill doesn’t seem to place as much significance on profiling (the processing of personal data to analyze or predict a data subject’s behavior, characteristics, location, etc.; the GDPR’s Art. 4(4) and the PDPB’s S. 2(33) define the term in varying detail, but the definitions are essentially of similar import) as its European counterpart. Though the PDPB refers to profiling and allied restrictions across the Bill, profiling finds no mention in Chapter VI (Data Principal Rights). Even on an analysis of the documents in their entirety, the EU legislation tends to place greater restrictions on profiling than the PDPB. The Indian Bill has, instead, preferred allowing profiling subject to an assessment (S. 33: “Data Protection Impact Assessment”) overseen by the Data Protection Authority of India (established under Chapter X of the Bill).

Lastly, the European legislation (Art. 15(4)) clarifies that the request for information as a matter of right cannot abrogate others’ “rights and freedoms”. Though S. 27(2) of the PDPB refers to a balancing of rights in the context of the “Right to Be Forgotten”, S. 24 doesn’t refer to any such weighing of rights. Given that there could be numerous and varied instances of legitimately conflicting rights, leaving the judiciary to decide on a case-by-case basis seems prudent.

 

Image taken from here.

Read more

Bare Text Comparison of the Personal Data Protection Bill 2018 with the General Data Protection Regulation : Part I – Right to Data Portability

Posted on December 1, 2018 by Prateek Surisetti

INTRODUCTION TO SERIES

The Personal Data Protection Bill has garnered a fair degree of attention in the last few weeks. For the uninitiated, a brief description of the Bill and its significance can be found here.

The purpose of this series is to analyze the bare text of the Data Principal Rights espoused in the Bill (Chapter VI), namely the Right to Confirmation and Access, Right to Correction, Right to Data Portability and the Right to be Forgotten, in light of the text used in the European legislations to espouse the same values. Each post will deal with each of the above-mentioned rights.

INTRODUCTION TO POST

Over the course of the ensuing section, I shall contrast the text of the data portability provision of the Personal Data Protection Bill (PDPB) (India) (S. 26) with the corresponding provision of the General Data Protection Regulation (GDPR) (European Union) (Art. 20).

For convenience, I have reproduced the relevant provisions below (emphasis supplied); readers would benefit from referring back to the bare text whenever necessary.

Personal Data Protection Bill (India)

“26. Right to Data Portability. —

(1) The data principal shall have the right to—

(a) receive the following personal data related to the data principal in a structured, commonly used and machine-readable format—

(i) which such data principal has provided to the data fiduciary;
(ii) which has been generated in the course of provision of services or use of goods by the data fiduciary; or
(iii) which forms part of any profile on the data principal, or which the data fiduciary has otherwise obtained.

(b) have the personal data referred to in clause (a) transferred to any other data fiduciary in the format referred to in that clause.

(2) Sub-section (1) shall only apply where the processing has been carried out through automated means, and shall not apply where—

(a) processing is necessary for functions of the State under section 13;
(b) processing is in compliance of law as referred to in section 14; or
(c) compliance with the request in sub-section (1) would reveal a trade secret of any data fiduciary or would not be technically feasible.”

General Data Protection Regulation (European Union)

“Article 20

Right to data portability

  1. The data subject shall have the right to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used and machine-readable format and have the right to transmit those data to another controller without hindrance from the controller to which the personal data have been provided, where:

(a)  the processing is based on consent pursuant to point (a) of Article 6(1) or point (a) of Article 9(2) or on a contract pursuant to point (b) of Article 6(1); and

(b)  the processing is carried out by automated means.

  2. In exercising his or her right to data portability pursuant to paragraph 1, the data subject shall have the right to have the personal data transmitted directly from one controller to another, where technically feasible.
  3. The exercise of the right referred to in paragraph 1 of this Article shall be without prejudice to Article 17. That right shall not apply to processing necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller.
  4. The right referred to in paragraph 1 shall not adversely affect the rights and freedoms of others.”

ANALYSIS

Prior to analyzing the distinctions between the legislations, let us briefly understand the concept of data portability and its significance. Data portability requires entities that process data to provide the collected personal data in a format that is interoperable across platforms. This prevents entities in control of personal data (e.g. Facebook) from holding one’s personal data hostage to a particular platform by declining to provide it in a format that can be used elsewhere (e.g. on other social media websites).
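As an illustration of what a “structured, commonly used and machine-readable format” can mean in practice, the sketch below (all field names invented) exports one hypothetical user’s data as JSON and as flat CSV, two formats that a receiving platform can readily re-import:

```python
import csv
import io
import json

# Illustrative only: a "portable" export of one user's data in two
# structured, commonly used, machine-readable formats. Field names
# are hypothetical and not drawn from any statute or real platform.

profile = {
    "name": "A. Principal",
    "email": "a.principal@example.com",
    "posts": ["hello world", "second post"],
}

# JSON preserves nesting and is widely supported across platforms.
json_export = json.dumps(profile, indent=2)

# CSV flattens the data into key/value rows readable by spreadsheets.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["field", "value"])
for key, value in profile.items():
    writer.writerow([key, json.dumps(value)])
csv_export = buf.getvalue()

# Any platform that understands JSON can round-trip the data losslessly,
# which is the practical point of the portability requirement.
assert json.loads(json_export) == profile
```

The legal debate, of course, is not about the mechanics of export but about when an entity may refuse it, which is where the exceptions discussed below come in.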

Firstly, I submit that the exception of technical infeasibility (present in both the GDPR and the PDPB) deserves criticism (refer double-underlined portion above). As Lawrence Lessig argued in Code 2.0, technological infeasibility shouldn’t be allowed to override a value system. To understand the argument further, let us (a) delve into the statutes’ meaning and then (b) analyze the issues with technological infeasibility as an exception to the right.

Both legislations conceptualize technological feasibility as an exception to the right. In doing so, they place technology in a position superior to the right itself, because the absence of technical feasibility renders the right nugatory. Now, let us move on to (b), analyzing the issues with technological infeasibility as an exception.

There exists a certain value system that our laws espouse. A value system, as the name suggests, is an aggregate of various values and ideals. Once a society decides to embrace a particular set of values and ideals, i.e. a particular value system (whatever it may constitute), technology shouldn’t be allowed to hinder or steer the furtherance of that value system. Allowing technological infeasibility to render a right redundant could lead to technological development being divorced from the embraced value system.

The question boils down to whether technology should be allowed to circumscribe the value system, or whether the value system should render the technology invalid. I argue, as did Lawrence Lessig, for the latter. Having the value system invalidate inconsistent technologies through law (e.g. by removing technical infeasibility as an exception) would force engineers to develop technologies that are consistent with the value system society has chosen. Such a model orients technological development in the direction of the value system that society has chosen for itself and cherishes, rather than a parallel one. In other words, engineers should bear the burden of structuring technology according to the ideals chosen by society, rather than the other way round, where society adapts to the values and ideals furthered by the technology developed by engineers (a fraction of society).

Moving on, the EU legislation explicitly clarifies that the right doesn’t exist in abrogation of others’ rights and freedoms (refer underlined portion above). The Indian PDPB, however, doesn’t provide any clarification regarding the enforcement of an individual’s data portability right vis-à-vis others’ rights and freedoms. Consequently, courts will have to make judgment calls as and when issues arise involving a conflict between a “data principal’s” (the natural person to whom the personal data relates; the GDPR uses the term “data subject” instead) right to data portability and others’ rights or freedoms.

Lastly, a difference can be noticed in the manner in which the scope of the right has been framed in either legislation (refer portion in bold above). The PDPB entitles data principals to receive and transfer their personal data solely where the processing is “automated” (the PDPB’s S. 2(7) defines “automated means” as “equipment capable of operation automatically in response to instructions given for the purpose of processing data”). Further narrowing the right’s scope, the PDPB provides exceptions for the protection of trade secrets, technical infeasibility, compliance with another law and the furtherance of essential State functions. The GDPR, on the other hand, confines the right to processing that is both carried out by automated means and based either on consent (Art. 6(1)(a); “consent” is defined under Art. 4(11)) or on a contract (Art. 6(1)(b)). The GDPR also provides a slightly different set of exceptions, namely technical infeasibility and processing necessary for a task carried out in the public interest or in the exercise of official authority vested in the controller. Additionally, the GDPR provides that the right to data portability cannot prejudice the right to be forgotten (Art. 17). It is therefore quite clear that the legislations differ substantially in demarcating the scope of the data portability right, though an analysis of greater rigour is required to ascertain the actual ambit of the provisions.

 

Image taken from here.

Read more

A Perfect Eden

Posted on November 22, 2018 by Tech Law Forum @ NALSAR

[Ed Note : The following post has been authored by Anupriya Nair, a second-year student of NALSAR University of Law. In an interesting and chilling read, Anupriya talks about the potential emergence of China-inspired social credit systems in India, which essentially monitor our actions to tell us how trustworthy we are. What exactly does this entail? Read on to find out more!]

Unlocking Novel Frontiers of Digital Control: The Potential Emergence of Social Credit Systems in India

Development of technology has begun to tread the fine line between liberation and oppression of society. In other words, the ever-evolving digital sphere has led us to face the paradox of having means to achieve new levels of inclusivity (liberation) while running an exponentially large risk of highly intrusive surveillance (oppression).

This dilemma was addressed in Charlie Brooker’s dystopian series Black Mirror. In the episode “Nosedive”,[1] Brooker depicted a society in which every member possessed a personal score ranging anywhere from 0 to 5. These personal scores were determined by ratings from people who viewed the member’s profile and rated their posts. Further, a change in this score could result in significant socioeconomic consequences.

Given the importance of the score to the quality of an individual’s life in society, every human interaction was transformed into an exercise in disingenuous camaraderie, for fear that a stray remark would result in a poor rating. The result was a world where everybody strived to be trustworthy and respectful towards one another: a perfect Eden.

This perfect Eden could be a reality for China by 2020. The Communist Party, with the aim of building a socialist utopia under its able guidance, has been developing a social credit system in which it intends to inculcate a culture of “trustworthiness” and “sincerity” into its society.

This system of social credit would involve the government monitoring every digitally traceable action of an individual, making it a powerful force that collects copious amounts of sensitive information on nearly every interaction made by an individual. The system would consequently assign each individual a numerical score that acts as a direct indicator of one’s “trustworthiness”.
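To make the mechanics concrete, here is a deliberately toy sketch of how such a rule-based score might aggregate monitored actions into a single number. The actions, weights, base score and 0 to 1000 scale are entirely invented for illustration and do not reflect the criteria of any real system:

```python
# Toy illustration only: aggregating monitored actions into one
# "trustworthiness" number. Weights and action names are invented.

WEIGHTS = {
    "paid_bill_on_time": +10,
    "charitable_donation": +15,
    "traffic_violation": -20,
    "spread_rumour_online": -50,
}

def trust_score(actions, base=600, floor=0, ceiling=1000):
    """Sum weighted actions onto a base score, clamped to a fixed scale."""
    score = base + sum(WEIGHTS.get(a, 0) for a in actions)
    return max(floor, min(ceiling, score))

print(trust_score(["paid_bill_on_time", "traffic_violation"]))  # 590
```

Even this trivial sketch shows why such systems are contested: every value judgment lives in the hand-picked weights, not in anything “objective”.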

One of the most prominent state-approved pilot projects currently in place is run by Zhima Credit (Sesame Credit), the subsidiary financial wing of the world’s biggest online shopping platform, Alibaba. Users of the Alibaba mobile app may voluntarily request to be provided with a social credit score based on not only their credit history, but their behaviour as well.

The need for such a social credit system arises out of the lack of a traditional, functioning credit system, which is generally built on the mortgage and credit card bill payment patterns of individuals. In China, however, consumers primarily use cash, and the country’s central banking regulator (the People’s Bank of China) doesn’t maintain adequate financial records of consumers either. Since adhering to the traditional mode of credit scoring is not a viable option, Chinese citizens have opted for other means of determining their credit risk, and the Zhima system thus has a large number of citizens volunteering to avail themselves of the social credit facility provided by Alibaba. A poor Zhima score cannot get a citizen blacklisted, the government having concluded that it would not be permissible to allow a private corporation control over such sensitive areas.

China, in addition to the eight firms authorized to conduct such alternative credit score programmes, has a local-government-approved social credit score regime in place as well. Although the government contends that the regime has been designed to be “objective” in nature, it ultimately mirrors the government’s own understanding of what constitutes “good” and “bad” behaviour. Further, the scores in this regime operate on a 1000-point scale and can affect the socioeconomic benefits available to a person, with implications ranging from an individual’s opportunity to apply for a government job to sending their children to an elite private school.[2] The scores are therefore an omnipotent, omnipresent and omniscient force to be reckoned with.

As stated in a high-level policy document released in September, the overriding principle that this social credit regime aims to follow at its core is: “If trust is broken in one place, restrictions are imposed everywhere.”[3]

Some elements of the social credit system appear to be making its way to India with the Income Tax department reportedly chalking out a new policy where “honest” and consistent taxpayers will be rewarded. As per the proposal by the Central Board of Direct Taxes, honest taxpayers are to receive priority treatment in accessing public services at places such as airports or railway stations. According to the Press Trust of India, “honest” taxpayers could be issued special identification numbers or be flagged as a special part of the maiden taxpayer facilitation proposal in their permanent account number (PAN).

Evidently, apart from creating a metric to determine one’s credit score, the primary vision behind implementing a social credit system is to strive towards a utopian future for society. The question is, at what cost are we willing to adopt this process of Eden-ification? Just as Aadhar has previously been labelled a mass surveillance tool, a social credit system would involve the collection and storage of highly sensitive personal information which could become a target for hackers, as previously demonstrated by various flaws and reported hacks within the Aadhar database itself.

Apart from the surveillance and privacy concerns, there is also the possibility that such a system would instil a sense of disingenuousness in its users. The best course of action inevitably involves understanding the “objective” system and using it to one’s advantage. This results in a numbers game of sorts, where everyone is after a higher score instead of genuinely striving to become a better person of one’s own volition.

Finally, the standards set in a social credit system cannot be “objective” given that the quality being standardised is trustworthiness. There is no objective panel from society or democratic process being utilised to set the standards of “trustworthiness” or “socially acceptable behaviour” in society. This is obviously a wrongful imposition of power. Further, those involved in the actual creation of these “objective” standards have an unfair advantage in earning a higher score due to their proximity to the programme itself.

In conclusion, it is not wrong to strive to build a perfect Eden for ourselves. The issue lies with the highly problematic and abuse-prone means by which we intend to reach our goal of doing so.

References – 

[1] Joe Wright, (Director). (2016, October 21). Nosedive [Television series episode], In Laurie Borg (Producer), Black Mirror. Netflix.

[2] Alice Vincent, Black Mirror is coming true in China, where your “rating” affects your home, transport and social circle, The Telegraph, Dec. 15, 2017, https://www.telegraph.co.uk/on-demand/2017/12/15/black-mirror-coming-true-china-rating-affects-home-transport/ (last visited Nov 21, 2018)

[3] China’s plan to organize its society relies on ‘big data’ to rate everyone – The Washington Post, https://www.washingtonpost.com/world/asia_pacific/chinas-plan-to-organize-its-whole-society-around-big-data-a-rating-for-everyone/2016/10/20/1cd0dd9c-9516-11e6-ae9d-0030ac1899cd_story.html?utm_term=.76106c42e93e (last visited Nov 21, 2018)

Read more

Comments on the Srikrishna Committee Report and the Draft Data Protection Bill 2018 – V

Posted on October 16, 2018October 16, 2018 by Tech Law Forum @ NALSAR

[Ed Note : The following post, the fifth in a series of posts containing comments on the Report and Draft Bill, 2018 published on the MeitY website, has been authored and compiled by students of NALSAR University of Law. This post contains comments on the data localisation framework put forth by the Committee.
The first post in the series can be found here.]

The Data Protection Bill, under Section 41, mandates that every data fiduciary store the personal data of all data principals in India. It also requires companies to process and store all critical personal data only in servers or data centres located in India. This requirement is colloquially known as ‘data localisation’. The Report justifies data localisation on several grounds, such as easier enforcement, increased compliance and reduced foreign surveillance, among others. The following paper will briefly discuss the reasons provided by the Report, then critically evaluate the claims and arguments made by the Committee, and conclude by arguing against a requirement for data localisation.

Why did the Committee choose mandatory data localisation? 

This section will provide, at the cost of reiteration, the arguments presented by the Report. The Committee begins its argument for data localisation on the ground that law enforcement agencies (LEAs) require access to information for the detection of crimes as well as for gathering evidence for prosecution. The presence of a local copy of the personal data, according to the Report, would allow for quicker and more efficient enforcement of laws in India. Presently, eight out of ten of the most accessed websites in India are based in the United States, yet none of these companies have offices in India. Acquiring data from any of these companies is a long and onerous process: the availability of the information depends on the existence of a bilateral or multilateral mutual legal assistance treaty in this specific regard, and requests are passed through several agencies, from the court to the Ministry of External Affairs to the courts in the foreign jurisdiction, before reaching the company. This, according to the Report, is a highly bureaucratic way to access the information, if the requests are fulfilled at all.

The Report also argues that data localisation would reduce dependency on the fibre-optic undersea cable network, thus reducing vulnerability to attacks. Holding critical information related to the nation within the borders of the country is necessary for, among other things, the healthy functioning of the country’s economy. The AI ecosystem that the NITI Aayog wishes to develop would receive a massive boost through this move: the growth of AI is directly linked to the amount of data available within the jurisdiction, which will be necessary for the development of local infrastructure. According to the Report, the creation of a digital infrastructure requires this move.

It also argues that the chances of foreign surveillance of Indians would reduce. After the Snowden revelations, the lack of any safeguards for the data of foreigners became clear, since United States legislation does not protect the data of foreigners stored with companies there.

The Report acknowledges that the cost of storing data in India may be high, if not for large service providers then for small and medium-sized ones. It argues, however, that the costs will, firstly, be worth incurring because of the size of the Indian market and, secondly, that the cloud storage options available to smaller companies would increase if all data were stored in India.

Lastly, the Report argues that fears of online censorship and a chilling effect on free speech are entirely misplaced. There are, it notes, other methods of restricting free speech, such as internet shutdowns; for such restrictions to be possible, localisation would have to be placed in the context of a dysfunctional data law coupled with a governmental intention to use it. The image of a completely walled internet similar to China's is, in its view, a caricature of the post-localisation web; the internet in every country has always been shaped by that country's local context.

What are the problems with mandatory data localisation? 

While the arguments relating to enforceability are well-taken, this section will argue that mandatory data localisation suffers from several problems. Better enforcement of internet-related offences is certainly a benefit; the overall benefits of the move, however, do not justify its introduction in India.

The argument that India's dependence on fibre-optic cables would be reduced, allowing it to function in times of crisis, does not hold good. It is not disputed that critical information such as people's medical data, financial data and biometrics should not be shared outside the country; such information is necessary for the better functioning of the country, especially in times of crisis. However, forcing companies to store local copies of the personal data of individuals does not serve this purpose at any level. It would not reduce dependency on the cables, since the critical infrastructure used to process and compute the data would remain available only at the company's headquarters.[1] The information stored in India would be of no practical use.

The AI ecosystem is unlikely to be affected by a mandatory data localisation policy; if anything, the policy could prevent newer companies from coming in and testing their products in India. The data stored in India by companies will always be owned by them: data stored by one company is not transferable to another merely by virtue of the fact that it is stored in India. The AI developed by any company will, while collecting data, depend on that company's proprietary software and hardware, or on agreements with other companies to transfer information to it. The AI will only learn on the basis of the data provided to it, according to the economic capabilities and interests of the company.[2] If a company wishes to enter the Indian market, it will do so either by gathering data itself or by buying information from companies that already hold data on Indians.[3] This transaction can be completed with or without the company having to store data about Indians in India.

Smaller firms will certainly have many cloud storage services in India to choose from. However, this will be an additional cost on them regardless. The way smaller companies find a foothold in markets abroad is through organic growth: they initially gain visibility in a market and then start developing a user base. The company then chooses whether to develop products for that market, if operating there seems viable.[4] This is the pragmatic schema companies follow to ensure that their services reach the largest number of people without any additional burden on their operations.[5] By analogy, if a car manufacturer had to set up an office in India even if only one of its cars was sold there, it would simply avoid selling to Indians, given the costs incurred relative to the benefits.[6] Mandatory localisation is likely to deter companies from wanting to operate in India.

The argument about foreign surveillance is flawed because it looks at only one side of the Snowden revelations. The revelations certainly showed that data stored in the United States could be accessed at any point; however, according to the same revelations, data stored in foreign jurisdictions was equally vulnerable.[7] Lastly, fears about the government's ability to censor content and chill speech are not misplaced.[8] There may already be several tools in the governmental arsenal to restrict discourse on the internet,[9] but this does not justify handing the government yet more tools to restrict speech. Enforcement of laws through MLATs is admittedly tougher; localisation is nevertheless a short-sighted response, especially given the context in which the government has consistently and unreasonably used internet shutdowns, among other means, to curb free speech.[10]

The internet as originally envisaged was developed to ensure a free flow of information.[11] This formed the basis of the entire architecture of the network in its early days.[12] Certainly, the manner in which these systems have developed has changed how the internet is negotiated and how it functions.[13] However, the primary infrastructure connecting multiple people remains. This structure was designed to stay efficient even when scaled, and it is what allowed the internet to grow to the extent it has today. Forcing companies to store data in India is likely to disrupt this model and prevent any viable organic growth of the network.

Companies in Western countries had a head start in the use of the internet because it developed there. Consequently, they were also able to develop better hardware and software to secure the data stored on their servers. This infrastructure simply cannot be transferred elsewhere overnight.[14] The transfer of such technology may take years, and the barriers are not only legal and economic: there are logistical barriers to setting up such infrastructure in India. The economic costs alone will disincentivise companies from investing in India; the Indian market may be large, but that is not enough for anyone to invest in building such infrastructure here. Companies may be especially reticent about moving if the electrical and technological infrastructure is not well developed.

[1] Thomas Schultz, ‘Carving up the Internet: Jurisdiction, Legal Orders, and the Private/Public International Law Interface’ (2008) 19 Europe. J. Int’l L. 779

[2] X. Wu, X. Zhu, G. Wu and W. Ding, “Data mining with big data,” in IEEE Transactions on Knowledge and Data Engineering, vol. 26, no. 1, pp. 97-107, Jan. 2014.

[3] A. L. Buczak and E. Guven, “A Survey of Data Mining and Machine Learning Methods for Cyber Security Intrusion Detection,” in IEEE Communications Surveys & Tutorials, vol. 18, no. 2, pp. 1153-1176, Second Quarter 2016.

[4] Delivering Digital Infrastructure – Advancing the Internet Economy Report, April 2014, World Economic Forum.

[5] Helena Ursic; Bart Custers, Legal Barriers and Enablers to Big Data Reuse, 2 Eur. Data Prot. L. Rev. 209 (2016).

[6] Kritika Bhardwaj, ‘Data localisation must go, it damages the global internet’, Hindustan Times, August 03, 2018 <https://www.hindustantimes.com/analysis/data-localisation-must-go-it-damages-the-global-internet/story-Aah1052ExFq6Ylcb9BQ4jJ.html>.

[7] Reema Shah, “Law Enforcement and Data Privacy – A Forward-Looking Approach” (2015) 125:2 Yale LJ 543.

[8] Hogan, Mél, and Tamara Shepherd. “Information Ownership and Materiality in an Age of Big Data Surveillance.” Journal of Information Policy 5 (2015): 6-31.

[9] T. Maurer, I. Skierka, R. Morgus and M. Hohmann, “Technological sovereignty: Missing the point?,” 2015 7th International Conference on Cyber Conflict: Architectures in Cyberspace, Tallinn, 2015, pp. 53-68.

[10] Gautam Bhatia, Free Speech Watch, https://indconlawphil.wordpress.com/free-speech-watch/, Indian Constitutional Law and Philosophy; Alexander Plaum, ‘The Impact of Forced Data Localisation on Fundamental Rights’ (Access now 4 June 2014) <https://www.accessnow.org/the-impact-of-forced-data-localisation-on-fundamental-rights/> accessed 15 Feb 2018.

[11] Monroe Price and Stefaan Verhulst, ‘The concept of self-regulation and the internet’ in J. Waltermann & M. Machill (Eds.), Protecting our children on the internet: Towards a new culture of responsibility (Bertelsmann Foundation Publishers 2000) <https://repository.upenn.edu/asc_papers/142/> accessed 15 Feb 2018.

[12] Fraser, E. (2016). Data localisation and the balkanisation of the internet. SCRIPTed: Journal of Law, Technology and Society 13(3), 359-373.

[13] Cyber-Physical Systems <https://www.nist.gov/el/cyber-physical-systems> accessed 15 Feb 2018.

[14] Erica Fraser, Data Localisation and the Balkanisation of the Internet, (2016) 13:3 SCRIPTed 359 <https://script-ed.org/article/data-localisation-and-the-balkanisation-of-the-internet/> accessed 15 Feb 2018.


Comments on the Srikrishna Committee Report and the Draft Data Protection Bill 2018 – IV

Posted on October 16, 2018December 1, 2018 by Tech Law Forum @ NALSAR

[Ed Note : The following post, the fourth post in the series of posts containing comments to the Report and Draft Bill, 2018  published on the MeitY website, has been authored and compiled by students of NALSAR University of Law. This post contains comments on the amendment to Section 8(1)(j) of the RTI Act, 2005 that has been proposed by the Committee. 

The first post in the series can be found here. Keep watching this space for more posts!]

Transparency and accountability in a government and its administration are an indispensable part of a participatory democracy; information is the oxygen for the survival of a democracy. The Right to Information Act was passed in 2005, replacing the Freedom of Information Act, 2002, so that every citizen has the right to access information controlled by public authorities. RTI is intrinsic to good governance and a necessity for democratic functioning.

The Data Protection Bill and the Data Protection Report prepared by the B.N. Srikrishna Committee in 2018 have proposed amendments to the Right to Information Act, specifically a change to Section 8(1)(j) of the Act. The proposed amendment reads –

“information which relates to personal data which is likely to cause harm to a data principal, where such harm outweighs the public interest in accessing such information having due regard to the common good of promoting transparency and accountability in the functioning of the public authority;

Provided, disclosure of information under this clause shall be notwithstanding anything contained in the Personal Data Protection Act, 2018; Provided further, that the information, which cannot be denied to the Parliament or a State Legislature shall not be denied to any person.”

The reason behind the change, as mentioned in the Report, is that the current version of Section 8(1)(j) does not indicate what would constitute an unwarranted invasion of privacy, and that the amendment would solve this problem. The Report states that – “A lot of information sought from a public authority may contain personal data of some kind or another. Further a strict interpretation of purpose limitation may give rise to the inference that any disclosure other than for the purpose for which the personal data was submitted would lead to an unwarranted invasion of privacy.” According to the Report, information should not be disclosed in ‘exceptional circumstances,’ where the likelihood of harm from the disclosure outweighs the common good of transparency and accountability in the functioning of public authorities. The Report states that the proposed amendment would strike a more precise balance between the Right to Information and the Right to Privacy.

However, it is submitted that the amendment is problematic for a number of reasons, which are listed below –

  • The Data Protection Report mentions that the Right to Information and the Right to Privacy need to be harmonised, and that a balancing act reconciling both rights should be performed. However, the proposed amendment fails to achieve this goal, as it compromises the Right to Information. The Report cites the case of Thalapallam Ser. Coop. Bank Ltd. v. State of Kerala[1], in which the Court said both rights have to be balanced in terms of public interest. When balancing the right to information against the right to privacy of public officials performing a public duty, as per the parameters of public interest, the right to information takes precedence. To achieve this balance, the Supreme Court laid down three tests in Supreme Court of India v. Subhas Chandra Agarwal, which are –
  1. whether the disclosure of the personal information is with the aim of providing knowledge of the proper performance of the duties and tasks assigned to the public servant in any specific case;
  2. whether the information is deemed to comprise the individual’s private details, unrelated to his position in the organization, and
  3. whether the disclosure will furnish any information required to establish accountability or transparency in the use of public resources.

If these tests are satisfied, the information has to be disclosed. It is important to understand that the degree of privacy available to public officials performing a public duty is lower than that available to a private person in general circumstances.[2] Therefore, in the specific context of public officials performing public duties, the right to information must prevail over the right to privacy. The proposed amendments, however, clearly elevate the right to privacy to a higher pedestal than the right to information, which runs contrary to the decision of the Supreme Court. The amendments should therefore not be implemented. Apart from their failure to adhere to constitutional principles as discussed above, the amendments are problematic in themselves, as discussed below.

  • Section 8(1)(j) of the Right to Information Act, 2005 states that personal information which has no relation to any public interest or activity, or which would unnecessarily infringe upon the privacy of a person, need not be disclosed. The section also sets an acid test to determine whether information has to be disclosed: if the information can be disclosed to the Parliament or a State Legislature, it has to be disclosed to the public at large as well.

This section, by its own terms, contains enough safeguards to prevent the disclosure of unnecessary information that could infringe upon the privacy of an individual and, if used properly, meets the standards of privacy that have recently developed. However, the section has been misused and weaponised against the very idea of having the RTI Act, which was to curb corruption and create transparency. This is evident from the Supreme Court judgment in Girish Ramachandra Deshpande v. CIC[3], wherein the Court ignored the proviso and precedents and held that the assets and details of a public servant constituted personal information. This case has had a chilling effect on several subsequent cases, such as R.K. Jain v. Union of India[4] and Canara Bank v. C.S. Shyam[5], to name a few. Thus the RTI Act, which was capable of balancing public officials' right to privacy against the right to information, was distorted by the Girish Ramachandra Deshpande case, diluting the Act and diminishing the purpose of its creation.

  • The amendment to Section 8(1)(j) brought in by the Draft Data Protection Bill, 2018 further dilutes the RTI Act and will entirely defeat the purpose of having the Act. This is so for the following reasons –

Firstly, the phrase “relates to personal data” makes the ambit of information that can fall within the provision extremely wide and vague. The Draft Bill does not define ‘personal data’ for this purpose, and as a result anything and everything can be said to relate to personal data. The consequence is that every time information is sought about a public official, this section will come into play and the information could be withheld from disclosure. Information would be withheld without any violation of privacy, simply because the amendment uses as broad a phrase as ‘personal data’ without defining it. Such non-disclosure, absent any infringement of privacy, would violate the right to know enshrined in Article 19(1)(a) of the Constitution.[6]

Secondly, the standard has been set at “likely to cause”, which is extremely low. The mere possibility that disclosure of information may cause harm is too low a threshold: it will deter PIOs from disclosing any information because of even a remote possibility of causing harm, and they will become over-cautious and reluctant to disclose anything. So low a standard cannot govern an Act like the RTI, which is fundamental to bringing transparency and accountability to the functioning of government; applying it would deny citizens the right to information.

Thirdly, the word “harm” is defined very broadly by the Draft Bill in Section 3(21): it includes ‘bodily injury’, ‘mental injury’ and ‘loss of reputation’, among other things. In practice, information exposing corrupt activities will almost certainly bring a loss of reputation to the person concerned. Making this a ground for not exposing a corrupt official makes no logical sense; nor can ‘mental injury’ be such a ground. These grounds have no rational basis. It is equivalent to saying that someone who commits a crime should not be punished because punishment would affect him physically and mentally and lower his reputation in society.

Hence, the proposed amendment should not be accepted, as it would completely water down the RTI Act, 2005 and render it ineffective.

  • On the one hand, the proposed amendment sets the threshold of “likely to cause harm to a data principal, where such harm outweighs the public interest”, which must be applied when determining whether there should be disclosure. On the other, it retains the proviso to the old Section 8(1)(j) of the RTI Act, which says that “information, which cannot be denied to the Parliament or a State Legislature shall not be denied to any person.” Hence, while the first part of the proposed amendment sets a different threshold from the previous Section 8(1)(j), the “acid test” of the retained proviso is that of Section 8(1)(j) itself. The proposed amendment is therefore internally contradictory, containing two different thresholds – one borrowed from the old Section 8(1)(j) and the other newly introduced.

Further, the PIOs under the RTI Act are not judicially trained, and it is practically very difficult for a ground-level PIO to grasp the complex concept of privacy and the jurisprudence associated with it. The contradictory thresholds set by the proposed amendment further complicate an already complex process of interpretation for PIOs. This will lead to unintended consequences: a PIO may disclose information that was intended to be denied, and deny information that was intended to be disclosed. The differing thresholds thus defeat the purpose of the amendment itself, and the amendment must be corrected to that extent.

  • There exists a consistent pro-disclosure jurisprudence regarding the disclosure of information of election candidates. In Union of India v. Association for Democratic Reforms & Anr.[7] the Court held that citizens have a right to know about the assets of those who stand for election (i.e., those who seek to become public servants). This was followed by PUCL v. Union of India[8], wherein the Court struck down Section 33B of the Representation of the People Act, 1951, which imposed certain restrictions on the disclosure of information by election candidates, declaring it beyond the legislative competence of Parliament in view of the directions issued in the ADR case.

In keeping with this pro-disclosure jurisprudence is the recent case of Lok Prahari v. Union of India[9], wherein the Court ruled in favour of asset disclosure by election candidates, saying: “If assets of a Legislator or his/her associates increase without bearing any relationship to their known sources of income, the only logical inference that can be drawn is that there is some abuse of the Legislator’s Constitutional Office. Something which should be fundamentally unacceptable in any civilized society and antithetical to a constitutional Government. It is a phenomenon inconsistent with the principle of the Rule of Law and a universally accepted Code of Conduct expected of the holder of a public office in a Constitutional democracy.” (emphasis added)

It is thus seen that the Court has consistently upheld the disclosure of the assets not only of election candidates but also of their associates. Obviously, if citizens have a right to know about the assets of those who want to become public servants, the threshold of their right to information about those who are already public servants cannot be lower. Keeping this in mind, the proposed amendment goes against the established threshold regarding asset disclosure, and it is proposed that it be modified to take firmly established jurisprudence into account.

 

  • The Right to Privacy and the Right to Information both trace their origins to Article 19(1)(a) of the Constitution of India, the exceptions to which are present in Article 19(2). The same has also been elucidated on page 45 of the Law Commission’s Report on the Public Interest Disclosure Bill 2001.

One of the exceptions listed under Article 19(2) is “defamation”, which admits of certain defences such as truth, fair comment and privilege.

In this light, it is important to note that the definition of “harm” in Section 3(21) of the Draft Bill (which is also to be used in the proposed amendment to the RTI Act) includes “loss of reputation” and not “defamation”. “Loss of reputation” is much broader than “defamation”, simply because the defences that apply to defamation do not apply to it. The exception imposed by the proposed amendment is thus broader than the exception set out under Article 19(2) and is, to that extent, unconstitutional.

Hence, it is proposed that for the purposes of the RTI Act, under which disclosure of information can be denied only on grounds consistent with Article 19(2), the exception of “loss of reputation” under “harm” should be changed to “defamation”, in line with Article 19(2).

Therefore, for all the reasons mentioned above – the Bill's failure to harmonise the two rights, the problems with the words and phrases used in the amendment, the internal contradictions within it, its setting of a lower standard than that laid down by the Supreme Court in right-to-information and public-disclosure cases, and its scope wrongfully extending beyond the restrictions mentioned under Article 19(2) – it is proposed that the RTI Act should not be amended and that Section 8(1)(j) should remain as it presently stands.

However, it is important to mention on a broader note that contrary to the general perception, the Right to Privacy and the Right to Information are complementary and not contradictory to one another and must be presented as being so in the future in keeping with the Constitution and for the good of all the people in the country.

The next post can be found here.

[1] Thalapallam Ser. Coop. Bank Ltd. v. State of Kerala, (2013) 16 SCC 82.

[2] Supreme Court of India v. Subhas Chandra Agarwal, (2011) 1 SCC 496.

[3] Girish Ramachandra Deshpande v. CIC, 2012 (119) AIC 105 (SC).

[4] R.K. Jain v. Union of India, 2013 (10) SC 430.

[5] Canara Bank v. C.S. Shyam, 2007 (58) AIC Ker 667.

[6] State of U.P. v. Raj Narain, 1975 AIR 865.

[7] Union of India v. Association for Democratic Reforms & Anr., 2002 (5) SCC 294.

[8] People's Union for Civil Liberties & Anr. v. Union of India & Anr., 2003 (4) SCC 9.

[9] Lok Prahari v. Union of India, AIR 2018 SC 1041.

