
Tech Law Forum @ NALSAR

A student-run group at NALSAR University of Law


Category: Right to Privacy

Data Protection in EdTech Start-ups: An Analysis

Posted on January 8, 2021 (updated January 19, 2024) by Tech Law Forum NALSAR

[This post is authored by Oshi Priya, a third-year student at the National University of Study and Research in Law (NUSRL), Ranchi.]

Education technology (EdTech) facilitates e-learning through the combination of software and computer hardware with educational theory. Though still in its early stages of development, it is a $700 million industry in India today and is projected to grow 8-10 times over the next five years. Popular EdTech companies in India include Unacademy, BYJU’S and Toppr.

Read more

Metadata by TLF: Issue 15

Posted on July 20, 2020 (updated December 20, 2020) by Tech Law Forum @ NALSAR

Welcome to our fortnightly newsletter, where our reporters Kruttika Lokesh and Dhananjay Dhonchak put together handpicked stories from the world of tech law! You can find other issues here.

PIL filed seeking identities of content moderation officers

Former RSS ideologue K N Govindacharya filed a public interest litigation in the High Court of Delhi seeking to compel Google, Twitter and Facebook to disclose the identities of their designated content moderation officers under the Information Technology Rules. In response, Google submitted that the officers worked with government authorities to remove illegal content. Govindacharya claimed that without disclosure of the officers’ identities, no mechanisms to enforce their obligations could be adequately instituted. Google countered that revealing the identities of the officers would jeopardize their capacity to work efficiently with the government, as they would be exposed to public scrutiny and criticism.

Read more

The Conundrum of Compelled Decryption Vis-À-Vis Self-Incrimination

Posted on July 20, 2020 (updated November 1, 2020) by Tech Law Forum @ NALSAR

[This post has been authored by Shivang Tandon, a fourth-year student at the Faculty of Law, Banaras Hindu University.]

The ‘self-incrimination’ doctrine is an indispensable part of the criminal law jurisprudence of a civilized nation. Article 20(3) of the Indian Constitution and the Fifth Amendment of the Constitution of the United States provide protection against self-incrimination.

Read more

How Facial Recognition Systems Threaten the Right to Privacy

Posted on June 27, 2020 (updated November 1, 2020) by Tech Law Forum @ NALSAR

[This post has been authored by Prajakta Pradhan, a first-year student at Dr. Ram Manohar Lohiya National Law University (RMLNLU), Lucknow.]

Facial recognition involves the use of face mapping techniques to identify an individual’s facial features and compare them with available databanks. The facial recognition market is expected to grow from $4 billion in 2017 to $7.7 billion in 2022. The reason for this stellar growth is the varied application of facial recognition technology in both the private and public sectors, with the governments of many countries using facial recognition for law enforcement and surveillance.

Read more

Metadata by TLF: Issue 9

Posted on May 9, 2020 (updated December 20, 2020) by Tech Law Forum @ NALSAR

Welcome to our fortnightly newsletter, where our reporters Kruttika Lokesh and Dhananjay Dhonchak put together handpicked stories from the world of tech law! You can find other issues here.

Zoom sued by shareholder for ‘overstating’ security claims

Read more

Standardizing the Data Economy

Posted on October 17, 2019 (updated December 13, 2019) by Tech Law Forum @ NALSAR

This piece has been authored by Namratha Murugeshan, a final year student at NALSAR University of Law and member of the Tech Law Forum.

In 2006, Clive Humby, a British mathematician, said with incredible foresight that “data is the new oil”. Fast forward to 2019, and we see how data has singularly been responsible for big-tech companies approaching and surpassing the trillion-dollar net worth mark. The ‘big 4’ tech companies (Google, Apple, Facebook and Amazon) have incredibly large reserves of data, both in terms of data collection (owing to the sheer number of users each company retains) and in terms of access to the data collected through this usage. With an increasing number of applications and avenues for data to be used, the need to standardize the data economy manifests itself strongly, with more countries recognizing the need for specific laws concerning data.

What is standardization?

Standards may be defined as technical rules and regulations that ensure the smooth working of an economy. They are required to increase compatibility and interoperability, as they set up the framework within which agents must work. With every new technology that is invented, the question arises as to how it fits with existing technologies. This question is addressed by standardization. By determining the requirements to be met for safety, quality, interoperability etc., standards establish the molds into which newer technologies must fit. Standardization is one of the key reasons for the success of industrialization. Associations of standardization have helped economies function by assuring consumers that the products being purchased meet a certain level of quality. The ISO (International Organization for Standardization), BIS (Bureau of Indian Standards), SCC (Standards Council of Canada) and BSI (British Standards Institution) are examples of highly visible organisations that stamp their seal of approval on products that meet the publicly set level of requirements under their regulations. There are further standard-setting associations that specifically look into the regulation of safety and usability of certain products, such as food safety, electronics, automobiles etc. These standards are deliberated upon in detail and are based on discussions with sectoral players, users, the government and other interested parties. Given that they are generally arrived at by consensus, the parties involved are in a position to benefit by working within the system.

Standards for the data economy

Currently, the data economy functions without much regulation. Apart from laws on data protection and a few other regulations concerning storage, data itself remains an under-regulated commodity. While multiple jurisdictions are recognizing the need for laws concerning data usage, collection and storage, it is safe to say that the legal world still needs to catch up.

In this scenario, standardization provides a useful solution, as it seeks to ensure compliance by emphasizing mutual benefit, as opposed to laws which penalize non-adherence. A market player in the data economy is bound to benefit from standardization, as they have readily accessible information regarding the compliance standards for the technology they are creating. By standardizing methods for the collection, use, storage and sharing of data, the market becomes more open because of the increased availability of information, which benefits the players by removing entry barriers. Additionally, a standard-mark pertaining to data collection and usage gives consumers the assurance that the data being shared will be used in a safe and quality-tested manner, thereby increasing their trust. Demand and supply tend to match as there is information symmetry, in the form of known standards, between the supplier and consumer of data.

As per rational choice theory, an agent in the economy who has access to adequate information (such as an understanding of costs and benefits, and the existence of alternatives) and who acts on the basis of self-interest will pick the available choice that maximizes their gains. Given this understanding, an agent in the data economy would see higher benefits from increased standardization, as it would create avenues for access and usage in a market that is currently heading towards an oligopoly.

How can the data economy be standardized?

The internet has revolutionized the manner in which we share data and has phenomenally increased the amount of data available on it. Anyone who has access to the internet can deploy any sort of data onto it, be it an app, a website, visual media etc. With internet access coming to be seen as an almost essential commodity, the number of users and of devices connected to the Internet will continue to grow. Big Data remained a buzzword for a good part of this decade (the 2010s), and with Big Data getting even bigger, transparency is often compromised as a result. Users are generally unaware of how the data collected from them is stored or used, or who has access to it. Although terms and conditions sometimes specify these things, they are overlooked more often than not, with the result that users remain in the dark.

There are 3 main areas where standardization would help the data economy –

  1. Data Collection
  2. Data Access
  3. Data Analytics

 

  1. Data Collection – Standardizing the process of data collection has both supply-side and demand-side benefits. On the supply side, the collection of data across various platforms (social media, personal-use devices, networking devices etc.) would be streamlined based on the purpose for which the data is being harvested. Simpler language in terms and conditions, and broad specifications of data collection, would help the user make an informed choice about whether to allow data collection; permissions could be sought from the user by categorizing data collection and making those categories known. On the demand side, streamlined data collection would help those collecting data accumulate high-quality data suited to specific uses. This would also make for effective compliance with purpose limitation, as required by a significant number of data protection laws across the globe. Purpose limitation is a two-element principle: data must be collected from a user for “explicit, specified and legitimate” purposes only, and data should be processed and used only in a manner compatible with the purpose for which it was collected. Standardization helps here because once data providers are aware of how their data is going to be used, they can make a legitimate claim to check its usage by data collectors and seek stricter compliance requirements.

 

  2. Data Access – Standardizing data access would go a long way in breaking down the oligopoly of the four big tech companies over data by creating mechanisms for access to it. As of now, there is no simple method for data sharing across databases and amongst industry players. With the monetization of data rising with increasing fervor, access and exchange will be crucial to ensure that the data economy does not stagnate or develop exceedingly high barriers to entry. Further, by setting standards for access to data, stakeholders will be able to participate in discussions regarding the architecture of data access.

 

  3. Data Analytics – This is the domain that remains in the exclusive control of big tech companies. While an increasing number of entities are adopting data analytics, big tech companies have access to enormous amounts of data that has given them a head start. Deep Blue, Alexa and Siri are examples of the outcomes of data analytics by IBM, Amazon and Apple respectively. Data analytics is the categorization and processing of collected data, and involves putting the data resource to use towards the goal of creating newer technologies that cater to the needs of people. Data analytics requires investment that is often significantly beyond the reach of the general population. However, it is extremely important to ensure that the data economy survives. The consistent search for the next big thing in data analytics has given us Big Data, Artificial Intelligence and Machine Learning (a subset of AI) so far, indicating that investments in data collection and processing pay off. Further, data analytics has larger implications for how we work and for what aspects of our lives we let technology take over. The search for smarter technologies and algorithms will ensure that the data economy thrives, and will consequently have an impact on the market economy. Standardization of this infrastructure would ensure fairer norms for access to and usage of collected data.

With the increasing application of processed information to solve our everyday problems, the data economy is currently booming; however, large parts of this economy are controlled by a limited number of players. Standardization in this field would ensure that we move towards increased competition instead of a data oligopoly, ultimately leading to faster and healthier growth of the data economy.

Read more

Metadata by TLF: Issue 6

Posted on October 10, 2019 (updated December 20, 2020) by Tech Law Forum @ NALSAR

Welcome to our fortnightly newsletter, where our Editors put together handpicked stories from the world of tech law! You can find other issues here.

Delhi HC orders social media platforms to take down sexual harassment allegations against artist

The Delhi High Court ordered Facebook, Google and Instagram to remove search results, posts and any content containing allegations of sexual harassment against artist Subodh Gupta. These include the blocking/removal of social media posts, articles and Google Search result links. The allegations were made about a year ago by an unnamed co-worker of Gupta on the anonymous Instagram account ‘Herdsceneand’. They were also posted on Facebook and circulated by news reporting agencies. An aggrieved Subodh Gupta then filed a civil defamation suit, stating these allegations to be false and malicious. Noting the seriousness of the allegations, the Court passed an ex-parte order asking the Instagram account holder, Instagram, Facebook and Google to take down the content. The Court has now directed Facebook to produce the identity of the person behind the account ‘Herdsceneand’ in a sealed cover.

Further Reading:

  1. Trisha Jalan, Right to be Forgotten: Delhi HC orders Google, Facebook to remove sexual harassment allegations against Subodh Gupta from search results, Medianama (1 October 2019).
  2. Akshita Saxen, Delhi HC Orders Facebook, Google To Take Down Posts Alleging Sexual Harassment by Artist Subodh Gupta [Read Order], LiveLaw.in (30 September 2019).
  3. Aditi Singh, Delhi HC now directs Facebook to reveal identity of person behind anonymous sexual harassment allegations against Subodh Gupta,  Bar & Bench (10 October 2019).
  4. The Wire Staff, Subodh Gupta Files Rs. 5-Crore Defamation Suit Against Anonymous Instagram Account, The Wire (1 October 2019)
  5. Dhananjay Mahapatra, ‘MeToo’ can’t become a ‘sullying you too’ campaign: Delhi HC, Times of India (17 May 2019).
  6. Devika Agarwal, What Does ‘Right to be Forgotten’ Mean in the Context of the #MeToo Campaign, Firstpost (19 June 2019).

Petition filed in Kerala High Court seeking a ban on ‘Telegram’

A student from the National Law School of India University, Bengaluru filed a petition in the Kerala High Court seeking a ban on the mobile application Telegram. The reason cited for the petition is that the app has no checks and balances in place: there is no government regulation, no local office, and the lack of encryption keys ensures that the sender of a message cannot be traced. Only in June this year, Telegram refused to hand over chat details of an ISIS module to the National Investigation Agency. Compared to apps such as WhatsApp, Telegram offers a greater degree of secrecy. One of the features Telegram boasts is its ‘secret chat’ mode, which notifies users if someone has taken a screenshot, prevents users from forwarding messages, etc. Further, there are fewer limits on the number of people who can join a channel, which makes moderating the dissemination of information even more difficult. It is for this reason that Telegram is dubbed the ‘app of choice’ for many terrorists. It is also claimed that the app is used for transmitting vulgar and obscene content, including child pornography. Several countries, such as Russia and Indonesia, have banned the app due to safety concerns.

Further Reading:

  1. Soumya Tiwari, Petition in Kerala High Court seeks ban on Telegram, cites terrorism and child porn, Medianama (7 October 2019).
  2. Brenna Smith, Why India Should Worry About the Telegram App, Human Rights Centre (17 February 2019).
  3. Benjamin M., Why Are So Many Countries Banning Telegram?, Dogtown Media (11 May 2019).
  4. Vlad Savov, Russia’s Telegram ban is a big convoluted mess, The Verge (17 April 2018).
  5. Megha Mandavia, Kerala High Court seeks Centre’s views on plea to ban Telegram app, The Economic Times (4 October 2019). 
  6. Livelaw News Network, ‘Telegram Promotes Child Pornography, Terrorism’: Plea In Kerala HC Seeks Ban On Messaging App, Livelaw.in (2 October 2019).

ECJ rules that Facebook can be ordered to take down content globally

In a significant ruling, the European Court of Justice held that Facebook can be ordered to take down posts globally, and not just in the country that makes the request. The decision, which cannot be appealed further, extends the reach of the EU’s internet-related laws beyond its own borders. The ruling stemmed from a case involving defamatory comments posted on the platform about an Austrian politician, who demanded that Facebook erase the original comments worldwide and not just from the Austrian version of the platform. The decision raises questions about the jurisdiction of EU laws, especially at a time when countries outside the bloc are passing their own laws regulating the matter.

Further Reading:

  1. Adam Satariano, Facebook Can Be Forced to Delete Content Worldwide, E.U.’s Top Court Rules, The New York Times (3 October 2019).
  2. Chris Fox, Facebook can be ordered to remove posts worldwide, BBC News (3 October 2019).
  3. Makena Kelly, Facebook can be forced to remove content internationally, top EU court rules, The Verge (3 October 2019).
  4. Facebook must delete defamatory content worldwide if asked, DW (3 October 2019).

USA and Japan sign Digital Trade Agreement

The Digital Trade Agreement was signed by the USA and Japan on October 7, 2019. The Agreement articulates both nations’ stance against data localization and cements cross-border data flows. It allows open access to government data through Article 20, while Articles 12 and 13 ensure that there are no restrictions on the flow of electronic data across borders. Further, Article 7 ensures that there are no customs duties on digital products that are electronically transmitted. Neither country can force parties to share source code when software is sold or distributed. The first formal articulation of the free flow of digital information was the Data Free Flow with Trust (DFFT), a key feature of the Osaka Declaration on Digital Economy. The agreement furthers the Trump administration’s efforts to cement America’s standing as tech-friendly, at a time when most other countries are introducing reforms to curb the practices of internet giants like Google and Facebook and to protect the rights of consumers. American rules, such as Section 230 of the Communications Decency Act, shield companies from lawsuits related to content moderation. America presently appears to hope that its permissive and liberal laws will become the framework for international laws.

Further Reading:

  1. Aditi Agarwal, USA, Japan sign Digital Trade Agreement, stand against data localisation, Medianama (9 October 2019).
  2. U.S.-Japan Digital Trade Agreement Text, Office of the United States Trade Representative (7 October 2019).
  3. Paul Wiseman, US signs limited deal with Japan on ag, digital trade, Washington Post (8 October 2019).
  4. FACT SHEET: U.S.-Japan Digital Trade Agreement, Office of the United States Trade Representative (7 October 2019).
  5. David McCabe and Ana Swanson, U.S. Using Trade Deals to Shield Tech Giants From Foreign Regulators, The New York Times (7 October 2019).

Read more

Compelled to Speak: The Right to Remain Silent (Part II)

Posted on September 13, 2019 by Tech Law Forum @ NALSAR

This is the second part of a two-part post by Benjamin Vanlalvena, a final year law student at NALSAR University of Law. In this post, he critiques a recent judgement by the Supreme Court which allowed Magistrates to direct an accused to give voice samples during investigation, without his consent. Part 1 can be found here.

Judicial discipline and the doctrine of imminent necessity

In the previous part, I dealt with certain privacy concerns that may arise with respect to voice sampling and how various jurisdictions have approached the same. In this part, I will critique the manner in which the Supreme Court in Ritesh Sinha has arrogated legislative power to itself: by terming the absence of legislative authorization for voice sampling of accused persons a procedural anomaly, and by extending its power to fill such assumed voids by invoking not only the principle of ejusdem generis but also the “principle of imminent necessity”.

This is strange, since reference is made to Ram Babu Misra, where the Court had earlier examined Section 73 of the Indian Evidence Act, 1872 and whether it afforded the Magistrate the power to direct the accused to give a specimen of her writing during the course of investigation. In the absence of such a provision, such powers were denied. Subsequently, Section 311A (inserted vide the Code of Criminal Procedure (Amendment) Act, 2005) afforded the Magistrate the power to direct any person to submit specimen signatures or handwriting. In this regard, the Supreme Court in Sukh Ram held that the powers provided by the Amendment were prospective and not retrospective in nature, and that such a direction was therefore impermissible where it was not provided for.

In the present case, the Supreme Court notes that “procedure is the handmaid, not the mistress, of justice and cannot be permitted to thwart the fact-finding course in litigation”. This is prima facie problematic, given that the maxim is relevant in civil matters, where dilemmas are resolved by bypassing procedure in the interest of justice. In criminal matters, the State holds an instrument of enquiry against the accused, with the balance of power weighing heavily against the individual. The jurisprudential trend of privileging crime-control interests, and of opposing oppression or coercion only in cases which would affect the reliability of the evidence, has thus continued. It is relevant here to look at the right against self-incrimination, explored by Abhinav Sekhri in his article ‘The right against self-incrimination in India: the compelling case of Kathi Kalu Oghad’, as one that originally arose as a protection against the State by providing procedural safeguards and substantive remedies.

In this case, the Court refers to Puttaswamy to hold that the right to privacy must “bow down to compelling public interest”. However, in Puttaswamy, Justice Chandrachud had cited A K Roy vs Union of India, whereby the Constitution Bench of the Supreme Court recognised that “…[p]rocedural safeguards are the handmaids of equal justice and …, [that] the power of the government is colossal as compared with the power of an individual…” (emphasis mine), that preventive detention finds its basis in law, and that it is thus permissible under the Constitution.

Indeed, Maneka’s reference to R.C. Cooper in understanding permissible restrictions of personal liberty is of assistance, noting that abrogation of the rights of individuals must fulfil the tests of reasonableness. Irrespective of whether the demand of an individual’s voice sample is a permissible violation vide the individual’s right to privacy guaranteed under the Constitution, the order itself must find a basis in law. Alas, the same cannot be said for the present matter.

As this is a policy decision entrusted to the State, it is curious to see how Courts have time and again found justification for intruding into the halls of the Legislature. The same was also recognised in the Puttaswamy judgment, where deference to the wisdom of law-enacting or law-enforcing bodies was sought. Silence postulates a realm of privacy, wrote Justice Chandrachud. While this is not an absolute right, it is for the Courts to protect the individual from the State’s powers and to adjudge whether laws and actions pursue legitimate aims of the State; it is not for the Courts to confer power and become an arm of the State itself. The part of the Kharak Singh judgment which was upheld had recognised the importance of the existence of a “law” in terming something either constitutional or unconstitutional, and thus termed the relevant regulation unconstitutional.

Presently, it is the Court which has taken on such a burden to create the law encroaching on the accused’s rights. This is even after alluding to the Legislature’s possible choice to be “oblivious and despite express reminders chooses not to include voice sample”, and only provide for a few tests (though in Selvi, the Court recognised the impropriety and impracticality to look into Legislative intent given the lack of “access to all the materials which would have been considered by the Parliament”).

Curiously, in affording the Judicial Magistrate the power to order voice sampling for “the purpose of investigation into a crime”, there is ambiguity as to the stage at which this power can be invoked, the manner in which it can be invoked, and who can invoke it. Ordinarily, medical examinations under Sections 53/53A/54 of the CrPC have been read to be done at the instance of “the investigating officer or even the arrested person himself…[or] at the direction of the jurisdictional court.” We may also look at Section 53 of the CrPC, under which a medical examination can occur only when there is sufficient material on record to justify it, and is impermissible otherwise.

Finally, the Court has not only failed to illustrate the existence of an imminent necessity to make such an alteration or confer such a power; it has also failed to explain in what contexts Courts can invoke such a maxim, and has not developed it in any detail. One might note that the principle of necessity is one generally afforded to individuals in cases of private defence or emergency, excusing them from acts that would ordinarily make them liable for certain crimes. Curiously, there is no mention of an affidavit from the police administration, and no studies have been cited. Mere legislative delay as a justification for imminent necessity in light of certain advancements does not seem sound.

In light of the same, given Navtej, NALSA and Puttaswamy, and the failure of the Legislature to amend at least the Special Marriage Act to recognize the rights of LGBTQI individuals to marry and be with the individual of their choice, should the same not also have been provided for? Can the same be taken as a justification to abrogate digital privacy rights in a world of evolving technologies, by mandating backdoors? At what stage does the Legislature’s refusal also amount to the Legislature’s laxity? Does this apply only to social developments, or to technological developments as well? If the Legislature was in fact aware of voice exemplars (as has been observed) and chose not to incorporate them into the relevant sections and clauses, can that be read as legislative delay or as refusal? Whether this aspect of the judgment, invoking “imminent necessity”, will be invoked to justify some other transformation is yet to be seen.

Conclusion

The Court had a path available to it through Selvi, and indeed Justice Desai had charted the same, invoking precedents which permitted such a reading. However, the Court in this reference judgment seems to have (unnecessarily) gone the extra mile by invoking this principle of imminent necessity. Whereas the former is a matter of difference in opinion, the latter is a clear bypass of the Legislature’s powers at the Court’s own pleasure. We may take heed of Justice H.R. Khanna’s dissent in the ADM Jabalpur case: when the means don’t matter, when procedure is no longer insisted upon, the ends can only lead us to arbitrariness, a place devoid of personal liberty.

I conclude by noting Lord Camden’s dictum in Entick vs Carrington (which we would now find through our Article 21 protection: “No person shall be deprived of his life or personal liberty except according to procedure established by law” (emphasis mine), also read into the right against self-incrimination through Selvi):

If it is law, it will be found in our books. If it is not to be found there, it is not law.

 

Click here for Part I.

Read more

Compelled to Speak: The Right to Remain Silent (Part I)

Posted on September 13, 2019 by Tech Law Forum @ NALSAR

This is the first part of a two-part post by Benjamin Vanlalvena, a final year law student at NALSAR University of Law. In this post, he critiques a recent judgement by the Supreme Court which allowed Magistrates to direct an accused to give voice samples during investigation, without his consent. Part II can be found here.

Nearly threescore years ago, in Kathi Kalu Oghad, an eleven-judge bench of the Supreme Court of India decided the question of the extent of constitutional protection against self-incrimination (vide Article 20(3)). The Supreme Court therein deviated from the notion of self-incrimination as inclusive of “every positive volitional act which furnishes evidence”, laid down in M.P. Sharma, and recognised a distinction between “to be a witness” and “to furnish evidence”. The present judgment arose from a difference of opinion in the division bench of the Supreme Court in Ritesh Sinha regarding the permissibility of ordering an accused to provide a voice sample. In this part, I will discuss voice sampling and its interaction with privacy, and look at how different jurisdictions have treated voice spectrography: whether it would be violative of the individual’s right to privacy and their right against self-incrimination. Finally, I will make a short point on technological developments and their interaction with criminal law. In the next part, I will deal with the Court’s failure to simply rely upon Selvi to expand the definition, and with how it instead created the doctrine of “imminent necessity” (a principle generally present in criminal law for private defence!) to justify its intervention into the halls of the Legislature in light of “contemporaneous realities/existing realities on the ground”.

Facts

The Investigating Authority seized a mobile phone from Dhoom Singh, allegedly an associate of the accused-appellant Ritesh Sinha, and wanted to verify whether a recorded conversation was between the two individuals; it therefore needed the appellant’s voice sample. Accordingly, summons was issued and the appellant was ordered to give his voice sample. He challenged this before the Delhi High Court, which negatived his challenge. Aggrieved by the same, he filed an appeal before the Supreme Court; as a result of a split verdict, the matter was referred to a larger bench. The opinions of Justice Desai and Justice Aftab Alam in the division bench have been sufficiently explored earlier by Gautam Bhatia and Abhinav Sekhri. Therein, both Justices were of one mind on voice sampling not being violative of the right against self-incrimination, with differences on the permissibility of voice sampling in the absence of an explicit provision permitting it.

Voice Sampling and Privacy

In this reference judgment, Chief Justice Ranjan Gogoi traces the history of the right against self-incrimination, referencing (then) Chief Justice B.P. Sinha’s observation that documents which do not by themselves incriminate, but are “only materials for comparison in order to lend assurance to the Court that its inference based on other pieces of evidence is reliable”, would not be violative of Article 20(3).

Recognising the limitations of sections 53 and 53A of the Code of Criminal Procedure, 1973, reference is made to the 87th Law Commission Report, which suggested an amendment to the Identification of Prisoners Act, 1920 to specifically empower a Judicial Magistrate to compel an accused person to give a voice print. No such amendment has been made.

In Selvi, ‘personal liberty’ in the context of self-incrimination was understood as requiring the avoidance of involuntariness, the right being summed up in three points: (1) preventing custodial violence and other third-degree methods, so as to protect the dignity and bodily integrity of the person being examined and to serve as “a check on police behaviour during the course of investigation”; (2) placing the onus of proof on the prosecution; and (3) ensuring the reliability of evidence, since involuntary statements could mislead “the judge and the prosecutor… resulting in a miscarriage of justice …[with] false statements …likely to cause delays and obstructions in the investigation efforts”. The third point is consistent with the majority view in Kathi Kalu Oghad, which found “specimen handwriting or signature or finger impressions by themselves…[to not be testimony since they are] wholly innocuous because they are unchangeable…[that they] are only materials for comparison in order to lend assurance to the Court that its inference based on other pieces of evidence is reliable.” While there was a hesitation in Selvi to read everything under the sun into “such other tests”, it was recognised that, through an invocation of ejusdem generis, the phrase could be extended to other physical examinations, but not to examinations involving testimonial acts. In this regard, Gautam Bhatia’s analysis of Selvi digs deep into this issue. As an aside, beyond the question of the content of the statement itself, it would be of assistance to also look at the nature of police systems: even in a post-Miranda setting in the US, the reality and nature of voluntariness is suspect.

Various courts have consistently taken the position that exemplars are not, by themselves, statements. Since handwriting, signatures, and the like exist within, or emanate from, the individual, compelling their production does not amount to making the individual reveal something that could not otherwise be observed; the evidence remains unaltered irrespective of the compulsion to give it.

In Levack, the Supreme Court of Appeal of South Africa held, firstly, that sound (and consequently voice exemplars) could be considered a ‘distinguishing feature’ under Section 37(1)(c) of the Criminal Procedure Act of 1977; and secondly, that voice exemplars, being ‘autoptic evidence’ derived from the accused’s own bodily features, could be distinguished as not being testimonial or communicative in nature.

This echoes the view taken by the Supreme Court of the United States (SCOTUS) in Dionisio, which recognised that voice samples (exemplars) taken for the purposes of identification do not violate an individual’s rights under the Fourth Amendment or the right against self-incrimination under the Fifth Amendment, since they are mere physical characteristics, obtained as identifiers and not for their testimonial or communicative content (see also Gilbert and Wade). Further, relying on Katz, the Court held that Fourth Amendment protections are not available “for what ‘a person knowingly exposes to the public…’”. Therefore, “[n]o person can have a reasonable expectation that others will not know the sound of his voice, any more than he can reasonably expect that his face will be a mystery to the world.”

In Jalloh v. Germany, the Strasbourg Court observed that the right against self-incrimination guaranteed under Article 6(1) does not extend to material obtained from the accused through the use of compulsory powers which has an “existence independent of the will of the suspect such as, inter alia, documents acquired pursuant to a warrant, breath, blood, urine, hair or voice samples and bodily tissue for the purpose of DNA testing”. (emphasis mine)

The Pacing Problem

The failure of legal systems to account for technological changes which may assist in the collection of evidence, or in other crime-control uses, is termed the ‘pacing problem’, and it comprises two dimensions: first, existing legal frameworks are premised on a static rather than a dynamic view of society and technology; second, legal institutions have slowed in their capacity to adjust to changing technologies.

Abhinav Sekhri has noted the Legislature’s failure to provide for handwriting samples for two decades, even after the Supreme Court and the Law Commission flagged the issue. Admittedly, the benefits of voice sampling for identification are evident, and the technique has been used before. However, this judgment fails to clarify under which provision such power is conferred. If it were to exist under the Identification of Prisoners Act, there may be some semblance of relief through section 7, which mandates the destruction or handing over of such measurements and photographs to individuals in certain cases.

The DNA Bill, as introduced in the Lok Sabha, allows for removal of collected DNA on certain conditions (vide Section 31(2)-(3)). Even then, removal occurs only on a police report, an order of the court, or a written request (the method varying with the nature of the incident). Contrary to other jurisdictions, and even to section 7 of the Identification of Prisoners Act, the status quo is thus one of retention, not automatic removal.

In trying to keep up with technological advancements, the Court has thus failed to recognise the importance of procedure in criminal matters and has instead produced procedural uncertainty. It is all the more curious that Selvi, which would have been sufficient justification, was not invoked even once in this case.

 

Click here for Part II.

Read more

Metadata by TLF: Issue 4

Posted on September 10, 2019December 20, 2020 by Tech Law Forum @ NALSAR

Welcome to our fortnightly newsletter, where our Editors put together handpicked stories from the world of tech law! You can find other issues here.

Facebook approaches SC in ‘Social Media-Aadhaar linking case’

In 2018, Anthony Clement Rubin and Janani Krishnamurthy filed PILs before the Madras High Court, seeking a writ of Mandamus to “declare the linking of Aadhaar of any one of the Government authorized identity proof as mandatory for the purpose of authentication while obtaining any email or user account.” The main concern of the petitioners was traceability of social media users, which would be facilitated by linking their social media accounts with a government identity proof; this in turn could help combat cybercrime. The case was heard by a division bench of the Madras HC, and the scope was expanded to include curbing of cybercrime with the help of online intermediaries. In June 2019, the Internet Freedom Foundation became an intervener in the case to provide expertise in the areas of technology, policy, law and privacy. Notably, Madras HC dismissed the prayer asking for linkage of social media and Aadhaar, stating that it violated the SC judgement on Aadhaar which held that Aadhaar is to be used only for social welfare schemes. 

Facebook later filed a petition seeking transfer of the case to the Supreme Court. Currently, the hearing before the SC has been deferred to 13 September 2019 and the proceedings at the Madras HC will continue. Multiple news sources reported that the TN government, represented by Attorney General of India K.K. Venugopal, argued before the SC for linking social media accounts and Aadhaar. However, Medianama has reported that the same is not being considered at the moment and that the Madras HC has categorically denied it.

Further Reading:

  1. Aditi Agrawal, SC on Facebook transfer petition: Madras HC hearing to go on, next hearing on September 13, Medianama (21 August 2019).
  2. Nikhil Pahwa, Against Facebook-Aadhaar Linking, Medianama (23 August 2019).
  3. Aditi Agrawal, Madras HC: Internet Freedom Foundation to act as an intervener in Whatsapp traceability case, Medianama (28 June 2019).
  4. Aditi Agrawal, Kamakoti’s proposals will erode user privacy, says IIT Bombay expert in IFF submission, Medianama (27 August 2019).
  5. Prabhati Nayak Mishra, TN Government Bats for Aadhaar-Social Media Linking; SC Issues Notice in Facebook Transfer Petition, LiveLaw (20 August 2019).
  6. Asheeta Regidi, Aadhaar-social media account linking could result in creation of a surveillance state, deprive fundamental right to privacy, Firstpost (21 August 2019).

Bangladesh bans Mobile Phones in Rohingya camps

Adding to the chaos and despair of the Rohingyas, the Bangladeshi government has banned the use of mobile phones in the refugee camps and barred mobile phone companies from providing service in the region, giving the companies a week to comply with the new rules. The reason cited for the ban was that refugees were misusing their phones for criminal activities. The situation in the region has worsened over the past two years, with UN officials describing the extreme violation of human rights as reaching the point of genocide. The ban on mobile phones would further worsen the refugees’ situation by deepening their detachment from the rest of the world, making their lives in the camps even more arduous.

Further Reading:

  1. Nishta Vishwakarma, Bangladesh bans mobile phones services in Rohingya camps, Medianama (4 September 2019).
  2. Karen McVeigh, Bangladesh imposes mobile phone blackout in Rohingya refugee camp, The Guardian (5 September 2019).
  3. News agencies, Bangladesh bans mobile phone access in Rohingya camps, Aljazeera (3 September 2019).
  4. Ivy Kaplan, How Smartphones and Social Media have Revolutionised Refugee Migration, The Globe Post (19 October 2018).
  5. Abdul Aziz, What is behind the rising chaos in Rohingya camps, Dhaka Tribune (24 March 2019).

YouTube to pay $170 million penalty for collecting the data of children without their consent

Alphabet Inc.’s Google and YouTube will pay a $170 million penalty to the Federal Trade Commission, settling allegations that YouTube collected the personal information of children by tracking their cookies and earned millions through targeted advertisements without parental consent. FTC Chairman Joe Simons condemned the company for touting its popularity with children to potential advertisers while blatantly violating the Children’s Online Privacy Protection Act; YouTube had claimed to advertisers that it did not need to comply with any child privacy laws since it had no users under the age of 13. Additionally, the settlement mandates that YouTube create policies to identify content aimed at children and notify creators and channel owners of their obligation to obtain parental consent. YouTube has also announced that it will soon launch YouTube Kids, which will carry no targeted advertising and only child-friendly content. Several prominent Democrats at the FTC have criticized the settlement: despite being the largest fine in a child privacy case so far, the penalty is seen as a pittance in contrast to Google’s overall revenue.

Further Reading:

  1. Avie Schneider, Google, YouTube To Pay $170 Million Penalty Over Collecting Kids’ Personal Info, NPR (4 September 2019).
  2. Diane Bartz, Google’s YouTube To Pay $170 Million Penalty for Collecting Data on Kids, Reuters (4 September 2019).
  3. Natasha Singer and Kate Conger, Google Is Fined $170 Million for Violating Children’s Privacy on YouTube, New York Times (4 September 2019).
  4. Peter Kafka, The US Government Isn’t Ready to Regulate The Internet. Today’s Google Fine Shows Why, Vox (4 September 2019).

Facebook Data Leak of Over 419 Million Users

Recently, researcher Sanyam Jain located unsecured online servers containing phone numbers of over 419 million Facebook users, including users from the US, UK and Vietnam. In some cases, the user’s real name, gender and country could also be identified. The database was completely unsecured and could be accessed by anybody. The leak increases the risk of SIM-swapping or spam-call attacks against the users whose data has been exposed. It occurred despite Facebook’s statement in April that it would be more dedicated to the privacy of its users and would restrict access to data to prevent data scraping. Facebook has attempted to downplay the effects of the leak by claiming that it actually covers only 210 million users, owing to multiple duplicates in the leaked data; however, Zack Whittaker, Security Editor at TechCrunch, has highlighted that there is little evidence of such duplication. The data appears to be old, since the company has since changed its policy so that users can no longer search for phone numbers. Facebook has claimed that there appears to be no actual evidence of a serious breach of user privacy.

Further Reading:

  1. Zack Whittaker, A huge database of Facebook users’ phone numbers found online, TechCrunch (5 September 2019).
  2. Davey Winder, Unsecured Facebook Server Leaks Data Of 419 Million Users, Forbes (5 September 2019).
  3. Napier Lopez, Facebook leak contained phone numbers for 419 million users, The Next Web (5 September 2019).
  4. Kris Holt, Facebook’s latest leak includes data on millions of users, Engadget (5 September 2019).

Mozilla Firefox 69 is here to protect your data

Addressing growing data protection concerns, Mozilla Firefox will now block third-party tracking cookies and cryptominers through its Enhanced Tracking Protection feature. To avail of this feature, users will have to update to Firefox 69, which enforces stronger security and privacy options by default. Enhanced Tracking Protection will now remain turned on by default as part of the standard setting, though users will have the option to turn it off for particular websites. Mozilla claims that this update will not only restrict companies from building user profiles by tracking browsing behaviour, but will also improve the performance, user interface, and battery life of systems running Windows 10/macOS.

Further Reading:

  1. Jessica Davies, What Firefox’s anti-tracking update signals about wider pivot to privacy trend, Digiday (5 September 2019).
  2. Jim Salter, Firefox is stepping up its blocking game, ArsTechnica (9 June 2019).
  3. Ankush Das, Great News! Firefox 69 Blocks Third Party Cookies, Autoplay Videos & Cryptominers by Default, It’s Foss (5 September 2019).
  4. Sean Hollister, Firefox’s latest version blocks third-party trackers by default for everyone, The Verge (3 September 2019).
  5. Shreya Ganguly, Firefox will now block third-party tracking cookies and cryptomining by default for all users, Medianama (4 September 2019).

Delhi Airport T3 terminal to use ‘Facial Recognition’ technology on a trial basis

Delhi airport will run a three-month trial of a facial recognition system at its T3 terminal, called the Biometric Enabled Seamless Travel experience (BEST). With this technology, a passenger’s entry is automatically registered at various points such as check-in and security. The Portuguese company Vision-Box has provided the technical and software support for the system. If the trial run is successful, the system will be officially incorporated; and though participation is voluntary during the trial, the pertinent question of whether it will remain voluntary after official incorporation is still to be answered.

Further Reading:

  1. Soumyarendra Barik, Facial Recognition tech to debut at Delhi airport’s T3 terminal; on ‘trial basis’ for next three months, Medianama (6 September 2019).
  2. PTI, Delhi airport to start trial run of facial recognition system at T3 from Friday, livemint (5 September 2019).
  3. Times Travel Editor, Delhi International Airport installs facial recognition system for a 3 month trial, times travel (6 September 2019).
  4. Renée Lynn Midrack, What is Facial Recognition, lifewire (10 July 2019).
  5. Geoffrey A. Fowler, Don’t smile for surveillance: Why airport face scans are a privacy trap, The Washington Post (10 June 2019).

UK Court approves use of facial recognition systems by South Wales Police

In one of the first cases of its kind, a British court ruled that police use of live facial recognition systems is lawful and does not violate privacy or human rights. The case was brought by Cardiff resident Ed Bridges, who alleged that the system had violated his right to privacy by recording him at least twice without permission, and who sought a declaration that its use was violative of human rights, including the right to privacy. The court arrived at its decision after finding that “sufficient legal controls” were in place to prevent improper use of the technology, including the deletion of data unless it concerned a person identified from the watch list.

Further Reading:

  1. Adam Satariano, Police Use of Facial Recognition Is Accepted by British Court, New York Times (4 September 2019).
  2. Owen Bowcott, Police use of facial recognition is legal, Cardiff high court rules, The Guardian (4 September 2019).
  3. Lizzie Dearden, Police used facial recognition technology lawfully, High Court rules in landmark challenge, The Independent (4 September 2019).
  4. Donna Lu, UK court backs police use of face recognition, but fight isn’t over, New Scientist (4 September 2019).

Read more
© 2025 Tech Law Forum @ NALSAR | Powered by Minimalist Blog WordPress Theme