[This post has been authored by Prajakta Pradhan, a 1st year student at Dr. Ram Manohar Lohiya National Law University (RMLNLU), Lucknow.]
Facial recognition involves the use of face-mapping techniques to identify an individual’s facial features and compare them against available databanks. The facial recognition market is expected to grow from $4 billion in 2017 to $7.7 billion in 2022. The reason for this stellar growth is the varied application of facial recognition technology in both the private and public sectors, with the governments of many countries using facial recognition for law enforcement and surveillance.
On 11 December 2019, the Indian Parliament passed the Citizenship (Amendment) Act, 2019. Since then, there have been vehement protests all across India. The Government and police have used facial recognition technology amidst the protests to identify and arrest protestors. Facial recognition systems are also actively functioning at major Indian airports. Human rights activists in India oppose the use of this technology on the ground that it violates the right to privacy guaranteed by Article 21 of the Indian Constitution. Privacy has many dimensions – including the privacy of the person, privacy of communications, territorial privacy, and privacy of personal data. This post will therefore examine whether facial recognition technology – as used by the Government – in fact violates the constitutional right to privacy.
Examining the Legal Challenges to Facial Recognition
In Justice KS Puttaswamy v. Union of India, the Hon’ble Supreme Court of India held that the right to privacy is part of the fundamental right to life and personal liberty guaranteed under Article 21 of the Indian Constitution. The judgment also expressly recognises that consent is required before personal data can be collected.
India does not have a data protection law that defines the scope of this technology. Yet the Indian Government has blatantly misused its power to collect private data through multiple state surveillance schemes, in violation of both domestic and international law. Article 17 of the International Covenant on Civil and Political Rights and Article 12 of the Universal Declaration of Human Rights espouse the right to privacy against arbitrary interference.
The implementation of the Automated Facial Recognition System (A.F.R.S.) aggravates these privacy concerns, since the technologies involved have not been subjected to consistent testing. Facial recognition software can only work if there exists a database of images and videos against which the algorithm can identify and detect faces. The existence of such a database raises concerns about the source and authenticity of the images, and about whether consent was obtained for their use. Several reports clearly show that facial recognition tools built on vast collections of pictures scraped from the internet without consent are already in use. The absence of consent, therefore, is a clear violation of the law.
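By way of illustration only, the minimal sketch below (using the open-source face_recognition Python library; the file names and the “databank” are hypothetical placeholders) shows how matching against a stored database of images typically works: every face on file is reduced to a numerical encoding, and a newly captured face is flagged as a “match” when its encoding is sufficiently close to one already in the databank. The legally relevant point is that the system is only as defensible as the databank it compares against.

```python
# Illustrative sketch only: how matching against a databank of face images
# typically works. Uses the open-source `face_recognition` library; the file
# paths and the "databank" below are hypothetical placeholders.
import face_recognition

# A hypothetical databank of previously collected images (source and consent unknown).
databank_files = {
    "person_a": "databank/person_a.jpg",
    "person_b": "databank/person_b.jpg",
}

# Reduce every stored face to a 128-dimensional numerical encoding.
known_encodings, known_names = [], []
for name, path in databank_files.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:  # skip images in which no face was detected
        known_encodings.append(encodings[0])
        known_names.append(name)

# Encode a face captured, for example, from CCTV footage.
probe = face_recognition.load_image_file("captured/probe_frame.jpg")
probe_encodings = face_recognition.face_encodings(probe)

if probe_encodings:
    # A "match" is simply an encoding distance below a tolerance threshold.
    matches = face_recognition.compare_faces(
        known_encodings, probe_encodings[0], tolerance=0.6
    )
    identified = [name for name, matched in zip(known_names, matches) if matched]
    print("Possible matches:", identified or "none")
```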
How A.F.R. Systems Infringe the Right to Privacy
In Puttaswamy, the Supreme Court of India unequivocally stated that the right to privacy extends to public spaces, and held that citizens cannot be subjected to the unjustified and unlawful collection of their personal information. The Court also held that any act that infringes the right to privacy, such as the collection of personal information for law enforcement purposes, is permissible only if it passes the proportionality test. It further clarified the standards that must be met to justify State interference with citizens’ right to privacy: a legitimate aim, proportionality between the object and the means adopted to achieve it, and procedural safeguards against the abuse of State interference. India currently has no law governing A.F.R. Systems or personal data protection that could act as such a legal safeguard.
This understanding was fortified in Vinit Kumar v. Central Bureau of Investigation, where the Bombay High Court held that even where an extraordinary national security concern is raised, any infringement of the right must be justified under the proportionality test laid down in the 2017 Puttaswamy judgment. The case for public safety must be demonstrated, not merely claimed.
Personal Data Protection Bill, 2019 and its Impact
The Indian Government introduced the Personal Data Protection Bill (PDP), 2019, in the Lower House of Parliament (Lok Sabha), which would make A.F.R. Systems legal. However, the proposed Bill is itself marred by privacy issues. It carves out three major exceptions for the Government: first, it grants the right to collect personal data without consent in matters relating to the security of the State and public order; second, an individual’s personal data may be collected to prevent or detect any offence; and lastly, the Government may collect and use personal data for ‘reasonable’ purposes (including the data of whistle-blowers), which strikes at the very protection afforded to them. The Bill even lacked any provision for withdrawing consent; it is therefore considered a complete failure.
Furthermore, it deliberately left out the question of placing restraints on government surveillance, on the presumption that separate legislation would take care of it. Predictably, that legislation never materialised.
Need for Efficient Laws in India to Protect Biometric Data
With absolutely no safeguards in place, the Indian Government has established a regime that legally allows it to surveil its citizens, and even allows private companies to get in on the act. The Government has already permitted airports to use their passengers’ faces in lieu of ordinary boarding passes.
Many places around the world, including San Francisco and Oakland in California, have banned facial recognition on the ground that the technology violates the right to privacy. There is no doubt that similar privacy issues can arise in India if the Government does not take proper measures. The fact that India lacks any data protection law only exacerbates the concern.
Conclusion
There needs to be a clear and comprehensive dialogue between agencies deploying A.I. technologies and their critics, as only multi-stakeholder discussions will reduce the risk of catastrophic failures. Still, if the Government plans on using facial recognition, it needs to institute adequate safeguards, establish redressal mechanisms, and create speedy dispute resolution forums to address any violations that may arise.
Indian citizens deserve care and deliberation before being subjected to an extremely flawed and biased technology that currently lacks any safeguards. Until there has been public consultation and laws providing privacy safeguards are in place, the Government should withdraw policies that pry into citizens’ lives and focus on passing the privacy law it promised its citizens in the first place.