Tech Law Forum @ NALSAR

A student-run group at NALSAR University of Law

Menu
  • Home
  • Newsletter Archives
  • Blog Series
  • Editors’ Picks
  • Write for us!
  • About Us

Category: Personal Data Protection Bill

Facial Recognition and Data Protection: A Comparative Analysis of laws in India and the EU (Part I)

Posted on April 3, 2021 by Tech Law Forum NALSAR

[This two-part post has been authored by Riddhi Bang and Prerna Sengupta, second year students at NALSAR University of Law, Hyderabad. Part II can be found here]

With the wave of machine learning and technological development, one system that has arrived is Facial Recognition Technology (FRT). From invention to accessibility, this technology has grown rapidly in the past few years. Facial recognition comes under the aegis of biometric data, which includes the distinctive physical characteristics or personal traits of a person that can be used to verify the individual. FRT primarily works through pattern recognition: it detects and extracts patterns from data and matches them against patterns stored in a database by creating a biometric ‘template’. This technology is being increasingly deployed, especially by law enforcement agencies, and thus raises major privacy concerns. It also attracts controversy due to potential data leaks and various inaccuracies. In fact, in 2020, a UK Court of Appeal ruled that facial recognition technology employed by law enforcement agencies, such as the police, violated human rights because there was “too broad a discretion” given to police officers in implementing the technology. It is argued that despite the multifarious purposes this technology purports to serve, its use must be regulated.
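To make the template-matching step concrete, here is a minimal, hypothetical Python sketch (not drawn from any actual FRT product; the vectors, names and threshold are invented for illustration): a probe face is reduced to a numeric feature vector, compared against stored templates by cosine similarity, and a match is declared only above a threshold.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_template(probe, database, threshold=0.9):
    """Return the identity whose stored template best matches the probe
    vector, or None if no similarity clears the threshold."""
    best_id, best_score = None, threshold
    for identity, template in database.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Toy "templates": a real system would use high-dimensional
# embeddings produced by a face-recognition model.
db = {"alice": [0.9, 0.1, 0.3], "bob": [0.1, 0.8, 0.5]}
print(match_template([0.88, 0.12, 0.31], db))  # closest to alice
```

The threshold is where the accuracy concerns mentioned above live: set it too low and the system produces false matches; set it too high and it fails to recognize genuine ones.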

Read more

Facial Recognition and Data Protection: A Comparative Analysis of laws in India and the EU (Part II)

Posted on April 2, 2021 by Tech Law Forum NALSAR

[This two-part post has been authored by Riddhi Bang and Prerna Sengupta, second year students at NALSAR University of Law, Hyderabad. Part I can be found here]

Procuring Data from Private Entities

Read more

Data Protection in EdTech Start-ups: An Analysis

Posted on January 8, 2021 by Tech Law Forum NALSAR

[This post is authored by Oshi Priya, a third-year student at the National Law University of Study and Research in Law, Ranchi.]

Education technology (EdTech) is the means to facilitate e-learning through the combination of software and computer hardware along with educational theory. Though still in its early stages of development, it is a $700 million industry in India today and is headed for eight to ten times that size in the next five years. Popular EdTech companies in India include Unacademy, BYJU’S and Toppr.

Read more

How Facial Recognition Systems Threaten the Right to Privacy

Posted on June 27, 2020 by Tech Law Forum @ NALSAR

[This post has been authored by Prajakta Pradhan, a first-year student at Dr. Ram Manohar Lohiya National Law University (RMLNLU), Lucknow.]

Facial recognition involves the use of face-mapping techniques to identify an individual’s facial features and compare them with available databanks. The facial recognition market is expected to grow from $4 billion in 2017 to $7.7 billion in 2022. The reason for this stellar growth is the varied application of facial recognition technology in both the private and public sectors, with the governments of many countries using facial recognition for law enforcement and surveillance.

Read more

Standardizing the Data Economy

Posted on October 17, 2019 by Tech Law Forum @ NALSAR

This piece has been authored by Namratha Murugeshan, a final year student at NALSAR University of Law and member of the Tech Law Forum.

In 2006, Clive Humby, a British mathematician, said with incredible foresight that “data is the new oil”. Fast forward to 2019, and we see how data has singularly been responsible for big-tech companies approaching and surpassing the trillion-dollar net worth mark. The ‘big 4’ tech companies, Google, Apple, Facebook and Amazon, have incredibly large reserves of data, both in terms of data collection (owing to the sheer number of users each company retains) and in terms of access to the data that is collected through this usage. With an increasing number of applications and avenues for data to be used, the need to standardize the data economy manifests itself strongly, with more countries recognizing the need to have specific laws concerning data.

What is standardization?

Standards may be defined as technical rules and regulations that ensure the smooth working of an economy. They are required to increase compatibility and interoperability, as they set up the framework within which agents must work. With every new technology that is invented, the question arises as to how it fits with existing technologies. This question is addressed by standardization. By determining the requirements to be met for safety, quality, interoperability etc., standards establish the molds in which newer technologies must fit. Standardization is one of the key reasons for the success of industrialization. Standardization associations have helped economies function by assuring consumers that the products being purchased meet a certain level of quality. The ISO (International Organization for Standardization), BIS (Bureau of Indian Standards), SCC (Standards Council of Canada) and BSI (British Standards Institution) are examples of highly visible organisations that stamp their seal of approval on products that meet the publicly set level of requirements as per their regulations. There are further standard-setting associations that specifically look into regulating the safety and usability of certain products, such as food, electronics, automobiles etc. These standards are deliberated upon in detail and are based on discussions with sectoral players, users, the government and other interested parties. Given that they are generally arrived at by consensus, the parties involved are in a position to benefit by working within the system.

Standards for the data economy

Currently, the data economy functions without much regulation. Apart from laws on data protection and a few other regulations concerning storage, data itself remains an under-regulated commodity. While multiple jurisdictions are recognizing the need to have laws concerning data usage, collection and storage, it is safe to say that the legal world still needs to catch up.

In this scenario, standardization provides a useful solution, as it seeks to ensure compliance by emphasizing mutual benefit, as opposed to laws, which penalize non-adherence. A market player in the data economy is bound to benefit from standardization, as they have readily accessible information regarding the compliance standards for the technology they are creating. Standardizing methods for the collection, use, storage and sharing of data makes the market more open because of the increased availability of information, which benefits players by removing entry barriers. Additionally, a standard-mark pertaining to data collection and usage gives consumers the assurance that the data being shared will be used in a safe and quality-tested manner, thereby increasing their trust. Demand and supply tend to match as there is information symmetry, in the form of known standards, between the supplier and consumer of data.

As per rational choice theory, an agent in the economy who has access to adequate information (such as an understanding of costs and benefits, and the existence of alternatives) and who acts on the basis of self-interest would pick the available choice that maximizes their gains. Given this understanding, an agent in the data economy would see higher benefits from increased standardization, as it would create avenues for access and usage in a market that is currently heading towards an oligopoly.

How can the data economy be standardized?

The internet has revolutionized the manner in which we share data and has phenomenally increased the amount of data available. Anyone with internet access can deploy any sort of data onto it, be it an app, a website, visual media etc. With internet access coming to be seen as an almost essential commodity, the number of users and of devices connected to the Internet will continue to grow. Big Data remained a buzzword for a good part of this decade (the 2010s), and with Big Data getting even bigger, transparency is often compromised as a result. Users are generally unaware of how the data collected from them is stored or used, or who has access to it. Although the terms and conditions concerning certain data and its collection sometimes specify these things, they are overlooked more often than not, with the result that users remain in the dark.

There are 3 main areas where standardization would help the data economy –

  1. Data Collection
  2. Data Access
  3. Data Analysis

 

  1. Data Collection – Standardizing the process of data collection has supply- and demand-side benefits. On the supply side, the collection of data across various platforms such as social media, personal-use devices, networking devices etc. would be streamlined based on the purpose for which the data is being harvested. Simpler language in terms and conditions and broad specifications of data collection would help the user make an informed choice about whether they want to allow data collection. Seeking permissions from the user would thereby become a matter of categorizing data collection and making those categories known to the user. On the demand side, streamlined data collection would help those collecting the data accumulate high-quality data suited to specific uses. It would also make for effective compliance with purpose limitation, as is required by a significant number of data protection laws across the globe. Purpose limitation is a two-element principle: data must be collected from a user for “explicit, specified and legitimate” purposes only, and data should be processed and used only in a manner compatible with the purpose for which it was collected. Standardization helps purpose limitation because once data providers are aware of how their data is going to be used, they can make a legitimate claim to check its usage by data collectors and seek stricter compliance requirements.

 

  2. Data Access – Standardizing data access would go a long way in breaking down the oligopoly of the four big tech companies over data by creating mechanisms for access to it. As of now, there is no simple method for data sharing across databases and amongst industry players. With the monetization of data rising with increasing fervor, access and exchange will be crucial to ensure that the data economy does not stagnate or develop exceedingly high barriers to entry. Further, by setting standards for access to data, stakeholders will be able to participate in discussions regarding the architecture of data access.

 

  3. Data Analytics – This is the domain that remains in the exclusive control of big tech companies. While an increasing number of entities are adopting data analytics, big tech companies have access to enormous amounts of data that has given them a head start. Deep Blue, Alexa and Siri are examples of the outcomes of data analytics by IBM, Amazon and Apple respectively. Data analytics is the categorization and processing of collected data; it involves putting the data resource to use towards the goal of creating newer technologies that cater to people's needs. Data analytics requires investment that is often significantly beyond the reach of the general population. However, it is extremely important to ensure that the data economy survives. The consistent search for the next big thing in data analytics has so far given us Big Data, Artificial Intelligence and Machine Learning (a subset of AI), indicating that investments in data collection and processing pay off. Further, data analytics has a larger implication for how we tend to work and what aspects of our life we let technology take over. The search for smarter technologies and algorithms will ensure that the data economy thrives and will consequently have an impact on the market economy. Standardization of this infrastructure would ensure fairer norms for access to and usage of collected data.
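The purpose-limitation principle discussed under data collection above can be illustrated with a small, hypothetical Python sketch (the class and method names are invented for illustration, not drawn from any statute or product): each datum is tagged at collection with its declared purposes, and any processing for an undeclared purpose is refused.

```python
class PurposeLimitedStore:
    """Toy store that tags each datum with its declared purposes and
    refuses processing for any undeclared purpose."""

    def __init__(self):
        self._records = {}  # key -> (value, set of declared purposes)

    def collect(self, key, value, purposes):
        """Record a datum along with the purposes declared to the user."""
        self._records[key] = (value, set(purposes))

    def process(self, key, purpose):
        """Release the datum only for a purpose it was collected for."""
        value, declared = self._records[key]
        if purpose not in declared:
            raise PermissionError(
                f"{key!r} was not collected for purpose {purpose!r}")
        return value

store = PurposeLimitedStore()
store.collect("email", "user@example.com", purposes={"account_recovery"})
store.process("email", "account_recovery")   # allowed
# store.process("email", "marketing")        # would raise PermissionError
```

Standardizing the categories of purposes (rather than each platform inventing its own) is what would let users and regulators audit such checks across services.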

With the increasing application of processed information to solve our everyday problems, the data economy is currently booming; however, large parts of it are controlled by a limited number of players. Standardization in this field would move us towards increased competition instead of a data oligopoly, ultimately leading to the faster and healthier growth of the data economy.

Read more

Metadata by TLF: Issue 3

Posted on August 14, 2019 by Tech Law Forum @ NALSAR

Welcome to our fortnightly newsletter, where our editors put together handpicked stories from the world of tech law! You can find other issues here.

Uber likely to start bus service in India

Uber, the San Francisco-based cab-aggregator giant, is working to kick-start an AC bus service in India. With the introduction of an AC bus service, Uber is trying to inch closer to its goals of reducing individual car ownership, expanding transportation access and helping governments plan transportation. Pradeep Parameswaran, Uber's India and South Asia head, said, “we are in the process of building the product and refining that. Some pilots are live in parts of Latin America and the Middle East. So they are the archetype of markets that would look like India”.

Uber Bus will allow commuters to use the Uber app to reserve a seat on an air-conditioned bus. Uber will scan for other passengers travelling in the same direction as the rider, so the bus reaches its destination with fewer stops. Through its bus service, Uber is focusing on educational campuses and business centers. Ola, Uber’s direct competitor, had earlier launched a similar bus service in a limited number of cities in 2015, but it was discontinued in 2018. At present, Gurgaon-based Shuttl provides an app-based bus service to offices. Uber's bus service in India is expected to become a reality in mid-2020.

Further Reading:

  1. Moupiya Dutta, Uber will be starting a bus service in India by 2020, TechGenyz (8 August 2019).
  2. Shreya Ganguly, Uber mulls launching bus service in India, Medianama (9 August 2019).
  3. Tenzim Norzom, Ride-hailing major Uber to soon launch bus service in India, Yourstory (7 August 2019).
  4. Hans News Service, Uber to start bus service in India, The Hans India (8 August 2019).
  5. Priyanka Sahay, India may see Uber buses plying on roads in a year, Moneycontrol (8 August 2019).

WhatsApp Hack Can Alter Messages and Spread Misinformation

At the annual Black Hat security conference on 7 August 2019, the Israeli research company Check Point revealed that WhatsApp could be hacked, posing serious potential security risks to users. According to Roman Zaikin and Oded Vanunu, they were able to change the identity of a sender, alter the text of someone’s reply in a group, and even send a private message to another member of a group as a public message, such that the reply is visible to all the participants of the group. They were able to exploit the weaknesses of the application after they reverse-engineered its source code in 2018 and decrypted its traffic. Check Point has since stated that it found three ways to manipulate and alter conversations, all of which exploit the app's quoting feature. The researchers did warn WhatsApp in 2018 that the tool could be used by ‘threat actors’ to create and spread misinformation and fake news. Facebook has responded stating that the risk is not serious, and that altering the application would mean having to store data about the sender, leading to less privacy for its users.

Further Reading:

  1. Davey Winder, WhatsApp Hack Attack Can Change Your Messages, Forbes (7 August 2019).
  2. ET Bureau, WhatsApp hack attack can change your messages, says Israeli security firm, The Economic Times (7 August 2019).
  3. Shreya Ganguly, Messages and identity on WhatsApp can be manipulated if hacked: Check Point Research, Medianama (9 August 2019).
  4. Mike Moore, Hackers can alter WhatsApp chats to show fake information, Tech Radar (9 August 2019).

Facebook’s new entity Calibra raises attention of privacy commissioners

Several privacy commissioners across the world have raised concerns over the privacy policy of Facebook’s new Libra digital currency. Concerns have been raised by the US, the UK, the EU, Australia, Canada, Albania and Burkina Faso.

Calibra is the new subsidiary of Facebook, and its cryptocurrency is called Libra. Calibra hopes to build financial services on top of the Libra blockchain. The privacy concerns raised go beyond questions of financial security and privacy because of the expansive collection of data that Facebook accumulates and has access to. Calibra issued a statement that user information will be shared only in certain circumstances, but there is no definite understanding of what such circumstances are.

Apart from privacy concerns, the joint statement issued by the countries raises several concerns about whether Facebook should be given the right to get involved in the banking sector at all. If it does, it should seek a new banking charter and be regulated by all applicable banking laws. These were a few of the concerns raised by the privacy commissioners.

Further Reading:

  1. Soumyarendra Barik, Privacy commissioners from across the world raise concerns over Facebook Libra’s privacy risk, Medianama (6 August 2019).
  2. Nick Statt, Facebook’s Calibra is a secret weapon for monetizing its new cryptocurrency, The Verge (18 June 2019).
  3. Reuters, Facebook’s cryptocurrency project raises privacy concerns, asked to halt programme, tech2 (19 June 2019).
  4. Jon Fingas, US, UK regulators ask Facebook how Libra will protect personal data, Engadget (8 May 2019).
  5. Harper Neidig, Global privacy regulators raise concerns over Libra, The Hill (8 May 2019).

EU General Data Protection Regulation exploited to reveal personal data

University of Oxford researcher James Pavur successfully exposed a design flaw in the GDPR: a bogus demand for data using the regulation's “right to access” saw about one in four companies reveal significant information about the person regarding whom the request was made. Data provided by the companies included credit card information, travel details, account passwords and the target’s social security number, which the researcher used as evidence of design flaws in the GDPR. Pavur also found that large tech companies did well when it came to evaluating the requests, whereas mid-sized businesses didn’t perform as well despite being aware of the regulation's coming into force.

Further Reading:

  1. Leo Kelion, Black Hat: GDPR privacy law exploited to reveal personal data, BBC (8 August 2019).
  2. Sead Fadilpasic, GDPR requests exploited to leak personal data, IT ProPortal (9 August 2019).
  3. John E Dunn, GDPR privacy can be defeated using right of access requests, Naked Security by SOPHOS (12 August 2019).
  4. Understanding the GDPR’s Right of Access, Siteimprove (14 June 2019).

Apple to suspend human review of Siri requests

Human reviewers will no longer be used to study conversations recorded by Siri, according to a recent announcement by Apple. The move gives users a greater degree of privacy over their communications, and analysis of recordings will be suspended while the “grading” system deployed by the company is reviewed. The system refers to the manner in which contractors grade the accuracy of the digital assistant’s voice recognition, the primary task being to determine the phrase that triggered the assistant, i.e. whether the user had actually said “Hey, Siri” or something else.

Further Reading:

  1. Hannah Denham and Jay Greene, Did you say, ‘Hey, Siri’? Apple and Amazon curtail human review of voice recordings., Washington Post (2 August 2019).
  2. Jason Cross, So Apple’s going to stop listening in on your Siri requests. Now what?, Macworld (2 August 2019).
  3. Rob Marvin, Apple to Halt Human Review of Siri Recordings, PC Mag (2 August 2019).
  4. Kate O’Flaherty, Apple Siri Eavesdropping Puts Millions Of Users At Risk, Forbes (28 July 2019).

Read more

Conundrum of Right to Be Forgotten: An Analysis of The Slippery Slope: To Forgive, Forget or Re-Write History

Posted on May 5, 2019 by Tech Law Forum @ NALSAR

[Ed Note: In a slightly longer read, Pranay Bhattacharya, a second-year student of Maharashtra National Law University (MNLU) Aurangabad, talks about the origins and development of the “Right to be Forgotten”, using this as a background to critically analyze the right as present in India’s Draft Personal Data Protection Bill 2018.]

“Blessed are the forgetful, for they get the better even of their blunders.”

Friedrich Nietzsche

Read more

Bare Text Comparison of the Personal Data Protection Bill 2018 with the General Data Protection Rules : Part II – Right to Confirmation and Access

Posted on December 1, 2018 by Prateek Surisetti

INTRODUCTION TO SERIES

The Personal Data Protection Bill has garnered a fair degree of attention in the last few weeks. For the uninitiated, a brief description of the Bill and its significance can be found here.

The purpose of this series is to analyze the bare text of the Data Principal Rights espoused in the Bill (Chapter VI), namely the Right to Confirmation and Access, the Right to Correction, the Right to Data Portability and the Right to be Forgotten, in light of the text used in the European legislation to espouse the same values. Each post deals with one of the above rights.

Part I of the series can be accessed here.

INTRODUCTION TO POST

Over the course of the ensuing section, I shall contrast the text of the confirmation and access provisions of the Personal Data Protection Bill (PDPB) (India) (S. 24) with the corresponding provisions of the General Data Protection Regulation (GDPR) (European Union) (Art. 15).

For the purposes of convenience, I have reproduced the relevant provisions below. (Emphasis supplied)

Personal Data Protection Bill (India)

“24. Right to confirmation and access. —

(1) The data principal shall have the right to obtain from the data fiduciary—

(a) confirmation whether the data fiduciary is processing or has processed personal data of the data principal;
(b) a brief summary of the personal data of the data principal being processed or that has been processed by the data fiduciary;
(c) a brief summary of processing activities undertaken by the data fiduciary with respect to the personal data of the data principal, including any information provided in the notice under section 8 in relation to such processing activities.

(2) The data fiduciary shall provide the information as required under this section to the data principal in a clear and concise manner that is easily comprehensible to a reasonable person.…

General Data Protection Regulation (European Union)

“Article 15

Right of access by the data subject

  1. The data subject shall have the right to obtain from the controller confirmation as to whether or not personal data concerning him or her are being processed, and, where that is the case, access to the personal data and the following information:

(a)  the purposes of the processing;

(b)  the categories of personal data concerned;

(c)  the recipients or categories of recipient to whom the personal data have been or will be disclosed, in particular recipients in third countries or international organisations;

(d)  where possible, the envisaged period for which the personal data will be stored, or, if not possible, the criteria used to determine that period;

(e)  the existence of the right to request from the controller rectification or erasure of personal data or restriction of processing of personal data concerning the data subject or to object to such processing;

(f)  the right to lodge a complaint with a supervisory authority;

(g)  where the personal data are not collected from the data subject, any available information as to their source;

(h)  the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

  2. Where personal data are transferred to a third country or to an international organisation, the data subject shall have the right to be informed of the appropriate safeguards pursuant to Article 46 relating to the transfer.
  3. The controller shall provide a copy of the personal data undergoing processing. For any further copies requested by the data subject, the controller may charge a reasonable fee based on administrative costs. Where the data subject makes the request by electronic means, and unless otherwise requested by the data subject, the information shall be provided in a commonly used electronic form.
  4. The right to obtain a copy referred to in paragraph 3 shall not adversely affect the rights and freedoms of others.

ANALYSIS

The right provides “data subjects”/“data principals” (the terms used by the GDPR and the PDPB respectively for the natural persons to whom the data relates) with the authority to demand certain information pertaining to their personal data from the “controllers”/“data fiduciaries” (the terms used by the GDPR and the PDPB respectively for the entities that determine the purpose and means of processing of data) dealing with that data. The right ensures that there is less information asymmetry between those to whom the personal data pertains and those who are processing or controlling said data. Refer here for a summary.

At first glance, the Indian draft legislation’s “Right to Confirmation and Access” (S. 24) might seem rather abstract and vague in comparison to its European counterpart, but closer inspection reveals that both are quite similar. While the GDPR provides guidelines within a mostly self-contained provision, the PDPB’s S. 24 cross-references S. 8, which contains the list of necessary information-disclosure obligations placed on the “data fiduciary”.
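What an S. 24-style response must contain (confirmation, a brief summary of the personal data, and a brief summary of processing activities) can be pictured with a small, hypothetical Python sketch; the function, field names and log structure below are invented for illustration and do not come from the Bill itself.

```python
def access_response(fiduciary_log, principal_id):
    """Hypothetical sketch of an S. 24-style response: confirmation of
    processing, a brief summary of the personal data processed, and a
    brief summary of the processing activities undertaken."""
    entries = [e for e in fiduciary_log if e["principal"] == principal_id]
    return {
        "confirmation": bool(entries),                                   # S. 24(1)(a)
        "data_summary": sorted({e["category"] for e in entries}),        # S. 24(1)(b)
        "processing_summary": sorted({e["activity"] for e in entries}),  # S. 24(1)(c)
    }

log = [
    {"principal": "u1", "category": "email", "activity": "marketing"},
    {"principal": "u1", "category": "location", "activity": "analytics"},
    {"principal": "u2", "category": "email", "activity": "marketing"},
]
access_response(log, "u1")
# {'confirmation': True, 'data_summary': ['email', 'location'],
#  'processing_summary': ['analytics', 'marketing']}
```

Note that the sketch returns category-level summaries rather than raw records, which is the "brief summary" orientation of S. 24, as against the GDPR's copy-of-the-data approach under Art. 15(3).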

Though there exists a considerable degree of textual similarity between the two jurisdictions, certain distinctions in orientation are quite evident from the language of the provisions.

The Indian Bill, admirably, places explicit emphasis on the accessibility of disclosures. S. 24 (2) mandates that the disclosures be “easily comprehensible”. Wherever there exists a power imbalance, those with access to expertise and other resources are better placed to abuse the system through indulging in complex legalities. Such statutory protections reduce the likelihood of resource-rich (access to expertise & infrastructure) “fiduciaries” utilizing complexity to overwhelm citizens incapable of processing technical information.

Furthermore, the Indian draft-legislation requires a “brief summary” (necessarily disclosing the statutorily prescribed information), as opposed to its European counterpart, which doesn’t place any such requirement. The legislative intent behind the same seems to be consistent with the logic of accessibility (prevent provision of information that cannot be processed meaningfully) mentioned above.

Listing the specific data that needs to be disclosed could enable “fiduciaries” to utilize the provision as an avenue to avoid disclosure of other unlisted, but relevant information. I submit that an additional sub-section requiring disclosure of all relevant information over and above the statutorily mandated disclosures (a general overarching clause, in addition to the prescribed disclosure requirements) would have tilted the balance favourably towards data privacy.

Additionally, the Indian Bill doesn’t seem to place as much significance on profiling (the processing of personal data for analyzing or predicting a data subject’s behavior, characteristics, location, etc.; the GDPR’s Art. 4(4) and the PDPB’s S. 2(33) define the term in varying detail, but the definitions are essentially of similar import) as its European counterpart. Though the PDPB refers to profiling and allied restrictions across the Bill, profiling finds no mention in Chapter VI (Data Principal Rights). Even upon analyzing the entirety of the documents, the EU legislation tends to place greater restrictions on profiling than the PDPB. The Indian Bill has, instead, preferred allowing profiling subject to an assessment (S. 33: “Data Protection Impact Assessment”) overseen by the Data Protection Authority of India (established under Chapter X of the Bill).

Lastly, the European legislation (Art. 15(4)) clarifies that the request for information as a matter of right cannot abrogate others’ “rights and freedoms”. Though S. 27(2) of the PDPB refers to the balancing of rights in the context of the “Right to Be Forgotten”, S. 24 doesn’t refer to any form of weighing of rights. Given that there could be numerous varied instances of legitimately conflicting rights, allowing the judiciary to decide on a case-by-case basis seems prudent.

 


Read more

Bare Text Comparison of the Personal Data Protection Bill 2018 with the General Data Protection Rules : Part I – Right to Data Portability

Posted on December 1, 2018 by Prateek Surisetti

INTRODUCTION TO SERIES

The Personal Data Protection Bill has garnered a fair degree of attention in the last few weeks. For the uninitiated, a brief description of the Bill and its significance can be found here.

The purpose of this series is to analyze the bare text of the Data Principal Rights espoused in the Bill (Chapter VI), namely the Right to Confirmation and Access, the Right to Correction, the Right to Data Portability and the Right to be Forgotten, in light of the text used in the European legislation to espouse the same values. Each post deals with one of the above-mentioned rights.

INTRODUCTION TO POST

Over the course of the ensuing section, I shall contrast the text of the data portability provision of the Personal Data Protection Bill (PDPB) (India) (S. 26) with the corresponding provision of the General Data Protection Regulation (GDPR) (European Union) (Art. 20).

For convenience, I have reproduced the relevant provisions below (emphasis supplied); readers would benefit from referring to the bare text whenever necessary.

Personal Data Protection Bill (India)

“26. Right to Data Portability. —

(1) The data principal shall have the right to—

(a) receive the following personal data related to the data principal in a structured, commonly used and machine-readable format—

(i) which such data principal has provided to the data fiduciary;
(ii) which has been generated in the course of provision of services or use of goods by the data fiduciary; or
(iii) which forms part of any profile on the data principal, or which the data fiduciary has otherwise obtained.

(b) have the personal data referred to in clause (a) transferred to any other data fiduciary in the format referred to in that clause.

(2) Sub-section (1) shall only apply where the processing has been carried out through automated means, and shall not apply where—

(a) processing is necessary for functions of the State under section 13;
(b) processing is in compliance of law as referred to in section 14; or
(c) compliance with the request in sub-section (1) would reveal a trade secret of any data fiduciary or would not be technically feasible. ”

General Data Protection Regulation (European Union)

“Article 20

Right to data portability

  1. The data subject shall have the right to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used and machine-readable format and have the right to transmit those data to another controller without hindrance from the controller to which the personal data have been provided, where:

(a) the processing is based on consent pursuant to point (a) of Article 6(1) or point (a) of Article 9(2) or on a contract pursuant to point (b) of Article 6(1); and

(b) the processing is carried out by automated means.

  2. In exercising his or her right to data portability pursuant to paragraph 1, the data subject shall have the right to have the personal data transmitted directly from one controller to another, where technically feasible.
  3. The exercise of the right referred to in paragraph 1 of this Article shall be without prejudice to Article 17. That right shall not apply to processing necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller.
  4. The right referred to in paragraph 1 shall not adversely affect the rights and freedoms of others.”

ANALYSIS

Prior to analyzing the distinctions between the legislations, let us briefly understand the concept of data portability and its significance. Data portability requires entities that process data to provide the personal data they have collected in a format that is interoperable across platforms. This prevents entities in control of personal data (e.g. Facebook) from holding one's personal data hostage to a particular platform by refusing to provide it in a format that can be used elsewhere (e.g. on other social media websites).
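To make the "structured, commonly used and machine-readable format" requirement concrete, the following is a minimal, hypothetical sketch (the field names and structure are my own assumptions, not prescribed by either statute) of a fiduciary exporting a data principal's record as JSON, grouped along the three categories in S. 26(1)(a) of the PDPB:

```python
import json

def export_personal_data(principal_record: dict) -> str:
    """Serialize a data principal's record into portable JSON.

    Hypothetical illustration only: the three keys loosely track
    S. 26(1)(a)(i)-(iii) PDPB (data provided by the principal, data
    generated in the course of service, and profile data).
    """
    portable = {
        "provided_data": principal_record.get("provided", {}),
        "generated_data": principal_record.get("generated", {}),
        "profile_data": principal_record.get("profile", {}),
    }
    # JSON is one "structured, commonly used and machine-readable"
    # format; XML or CSV would serve equally well.
    return json.dumps(portable, indent=2, sort_keys=True)

record = {
    "provided": {"name": "A. Principal", "email": "a@example.com"},
    "generated": {"last_login": "2018-11-30"},
    "profile": {"interests": ["privacy", "technology"]},
}
exported = export_personal_data(record)
# Interoperability is the point: any other fiduciary/controller can
# re-parse the same export without loss.
reimported = json.loads(exported)
```

The choice of JSON here is purely illustrative; what matters legally is that the receiving fiduciary can parse the export without depending on the exporting platform.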

Firstly, I submit that the exception of technical infeasibility (present in both the GDPR & PDPB) deserves criticism (refer double-underlined portion above). As Lawrence Lessig argued in Code 2.0, technological infeasibility shouldn’t be allowed to override a value system. In order to understand the argument further, let us (a) delve into the statute’s meaning and then (b) analyze the issues with technological infeasibility as an exception to the right.

Both legislations conceptualize technological feasibility as an exception to the right. In doing so, they place technology in a position superior to the right itself, because the absence of technical feasibility would render the right nugatory. Now, let us move on to (b) analyzing the issues with technological infeasibility as an exception.

There exists a certain value system that our laws espouse. A value system, as the name suggests, is an aggregate of various values and ideals. Once a society decides to embrace a particular set of values and ideals, i.e. a particular value system (whatever it may constitute), technology shouldn't be allowed to hinder or steer the furtherance of that value system. Allowing technological infeasibility to render a right redundant could lead to technological development becoming divorced from the embraced value system.

The question boils down to whether technology should be allowed to circumscribe the value system, or whether the value system should render the technology invalid. I argue, as did Lawrence Lessig, for the latter. Having the value system invalidate inconsistent technologies through law (e.g. by removing technical infeasibility as an exception) would force engineers to develop technologies that are consistent with the value system society has chosen. Such a model orients technological development in the direction of the value system that society has chosen for itself and cherishes, as opposed to a parallel value system. In other words, engineers should bear the burden of structuring technology according to the ideals chosen by society, rather than the other way round, where society adapts to the values and ideals furthered by the technology developed by engineers (a fraction of society).

Moving on, the EU legislation explicitly clarifies that the right does not exist in abrogation of others' rights and freedoms (refer underlined portion above). The Indian PDPB, however, provides no such clarification on how an individual's right to data portability is to be enforced vis-à-vis others' rights and freedoms. Consequently, courts would have to make judgment calls as and when conflicts arise between a "data principal's" (the natural person to whom the personal data relates; the GDPR uses the term "data subject" instead) right to data portability and the rights or freedoms of others.

Lastly, a difference can be noticed in the manner in which the scope of the right has been framed in either legislation (refer portion in bold above). The PDPB entitles data principals to receive and transfer their personal data only where the processing is "automated" (S. 2(7) of the PDPB defines "automated means" as equipment capable of operating automatically in response to instructions given for the purpose of processing data). Further narrowing the right's scope, the PDPB provides exceptions for the protection of trade secrets, technical infeasibility, compliance with another law, and the furtherance of essential State functions. The GDPR, on the other hand, grants the right only where the processing is both carried out by automated means and based on consent (Art. 6(1)(a); "consent" is defined under Art. 4(11)) or on a contract (Art. 6(1)(b)). The GDPR also provides a different set of exceptions, namely technical infeasibility, the performance of a task carried out in the public interest, and the exercise of official authority vested in the controller. Additionally, the GDPR provides that the exercise of the right to data portability shall be without prejudice to the right to erasure (Art. 17). Therefore, it is quite clear that the legislations differ substantially in demarcating the scope of the data portability right, though an analysis of greater rigour is required to ascertain the actual ambit of the provisions.

 

Image taken from here.
