This is the second part of a two-part article by Ankush Rai, a 3rd year student at NALSAR University of Law. Part 1 can be found here.
In a recent case, the Delhi High Court accepted that summons could be served by WhatsApp and also stated that a ‘double tick’ would prima facie imply that summons have been duly delivered. This case serves as an example of how courts in India have gradually allowed for summons to be served through various electronic means. Additionally, this case also brings forth two larger points for consideration. Firstly, law and society are constantly in flux and one should adapt to the changes in the other. In this case, the law has adapted to the technological changes in society with the help of courts. Secondly, technology can be used to fulfil the larger objectives of law and justice in an effective and efficient manner. Herein, sticking to the ancient and rigid means of delivering summons would have further delayed the disposal of the case. By accepting WhatsApp messages as summons the Court fulfilled the larger objective of an efficient and speedy trial.
This post has been authored by Unmekh Padmabhushan, a final year student of National Law University, Jodhpur.
Machine learning is the process by which a piece of software is able to expand upon its capabilities and knowledge in a self-driven manner without any significant human input. This technology has been used, for example, in disaster warning systems and in driverless cars. Another scholarly use of such technology allows machines to derive patterns and significant correlations from enormous databases of texts in a manner impossible for human beings. This has led to an explosion in the ability of those working in the field of the humanities to analyse data like their natural sciences counterparts have done for years.
This post has been authored by Aryan Babele, a final year student at Rajiv Gandhi National University of Law (RGNUL), Punjab and a Research Assistant at Medianama.
On 23rd October 2019, the Delhi HC delivered a judgment authorizing Indian courts to issue “global take down” orders to Internet intermediary platforms like Facebook, Google and Twitter for illegal content uploaded, published and shared by users. The Delhi HC delivered the judgment on the plea filed by Baba Ramdev and Patanjali Ayurved Ltd. requesting the global takedown of certain videos which were alleged to be defamatory in nature.
This brief introduction to regulation of autonomous vehicles has been authored by Khushi Sharma and Aarushi Kapoor, second year students of Hidayatullah National Law University (HNLU), Raipur.
[Ed. Note: This article was written before the 2019 Personal Data Protection Bill had been made public. Click here for the new Bill.]
The post is the second part of a two-part series that undertakes an analysis of the technical standards and specifications present across publicly available documents on Account Aggregators. In the previous post, the authors looked at the motivations for building AAs and some consumer protection concerns that emerge in the Indian context.
Account Aggregators (AA) appear to be an exciting new infrastructure for those who want to enable greater data sharing in the Indian financial sector. The key data being shared will be extensive personal information about individuals like us, detailing our most intimate and sensitive financial transactions and potentially non-financial data too. This places individuals at the heart of these technical systems. Should the systems be breached, misused or otherwise exposed to unauthorised access, the immediate casualty will be the privacy of the people whose information is compromised. Of course, this will also have an impact on data quality across the financial sector.
It is heartening to note that the AA framework builds in a layer of software to operationalise “consent”, ostensibly to overcome consumer data protection and privacy concerns. As we set out in our previous post, consent is a necessary but not sufficient condition for effective data protection. AAs must have strong data accountability systems and access controls that operate independent of consent, to truly offer Privacy By Design. This will be crucial in engendering trust in this new system.
In this post, we introduce the concept of Privacy by Design (PbD) and use that framework to assess certain technical specifications that have been released for the AA architecture. From our analysis, we attempt to identify whether the PbD principles have been applied to the designing process of the technical architecture of AAs.
Privacy by Design (PbD) is a widely recognised approach that addresses the emerging systemic effects of Information and Communication Technology (ICT). Ideally, all information infrastructure must be designed in a privacy-protecting manner that does not create trade-offs between the privacy of the individuals and the efficiency of the infrastructure. Principles to enable such technological design are incorporated in the PbD framework (Cavoukian, 2011).
The PbD framework comprises seven foundational principles that are intended to make IT-interacting entities privacy-assuring as a default mode of operation. These principles are as follows:
1. Proactive not Reactive; Preventative not Remedial
2. Privacy as the Default Setting
3. Privacy Embedded into Design
4. Full Functionality – Positive Sum, not Zero Sum
5. End-to-End Security – Full Lifecycle Protection
6. Visibility and Transparency – Keep it Open
7. Respect for User Privacy – Keep it User-Centric
While PbD principles are not an authoritative privacy framework within which information systems can be assessed, this framework can be operationalised through several technological practices and techniques to enhance privacy of the information systems.
In order to assess how AAs fare against the PbD principles, it is necessary to understand the major flows of information within the AA ecosystem. These are between the Financial Information Providers (FIP), AAs, Financial Information Users (FIU) and consumers (users).
Figure 1 and Table 1 below seek to represent the flows that occur systematically between different entities of the AA ecosystem in a simplified form.
This figure shows our representation of three sets of queries and three sets of data flows that are necessary in the AA system. These are summarised in Table 1 (mapping to the corresponding numerals in Figure 1 above).
Table 1: Query flows and data flows in the Account Aggregator ecosystem
|Query Flows|Data Flows|
|---|---|
|1. FIU queries the AA for FI|4. User provides AA with Consent|
|2. AA queries the User for Consent|5. FIP transfers information to the AA|
|3. AA queries the FIP for FI|6. AA transfers information to FIU|
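These flows can also be sketched as a minimal walk-through in code. The transcript below follows the temporal sequence (the user's consent, flow 4, precedes the AA's query to the FIP, flow 3). The entity abbreviations come from the table; the message texts and function names are our own illustration and are not drawn from the actual AA APIs:

```python
# Illustrative sketch of the six query/data flows in the AA ecosystem.
# Numbers match the table; the list order is temporal, so flow 4
# (consent) comes before flow 3 (AA queries the FIP).
FLOWS = [
    (1, "FIU -> AA", "query: request FI about the user"),
    (2, "AA -> User", "query: request consent"),
    (4, "User -> AA", "data: signed consent artefact"),
    (3, "AA -> FIP", "query: request FI (with consent artefact)"),
    (5, "FIP -> AA", "data: encrypted FI"),
    (6, "AA -> FIU", "data: encrypted FI"),
]

def run_flows():
    """Return the transcript of one successful information request."""
    return [f"{n}. {route}: {msg}" for n, route, msg in FLOWS]

for line in run_flows():
    print(line)
```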
In the AA system,
In the following section, we identify whether the PbD principles have been applied to the designing process of the technical architecture of the AAs.
When a customer approaches a new financial institution for a financial product, that financial institution (or FIU) can query the AA for relevant information about such customer. The FIU creates a request for consent, or a query to obtain FI from the consumer. In this query, the FIU must provide specifications on the required information such as account types, asset types, transaction types etc. and the duration and period in which the FIU will use this FI (NeSL Asset Data Limited, 2018).
The IT measures stipulated for this query flow are that the application programming interface (API) calls between the FIU and AA must take place over a Secure Sockets Layer (SSL) connection (Hypertext Transfer Protocol Secure (HTTPS) at the application layer in this case). Each of these API calls must be digitally signed and encrypted. All internet-facing services (such as the FIU portal provided by the AA) should be placed in a demilitarized zone (DMZ) (NeSL Asset Data Limited, 2018).
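The stipulation that each API call be signed can be illustrated with a short sketch. For simplicity this uses an HMAC over the JSON payload as a stand-in for the public-key digital signatures the specifications require, and every field name below is hypothetical:

```python
import hashlib
import hmac
import json

# A shared secret stands in for the signer's private key in this sketch;
# the AA specifications require proper digital signatures over HTTPS.
SECRET = b"demo-shared-secret"

def sign_payload(payload: dict) -> dict:
    """Attach a detached signature to an API payload (illustrative)."""
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return {"body": payload, "signature": sig}

def verify_payload(envelope: dict) -> bool:
    """Recompute and compare the signature before trusting the payload."""
    body = json.dumps(envelope["body"], sort_keys=True).encode()
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["signature"])

# Hypothetical FIU -> AA consent request
request = sign_payload({"fiu_id": "FIU-001", "fi_types": ["DEPOSIT"],
                        "duration_days": 30})
print(verify_payload(request))  # True for an untampered payload
```

Any modification of the body after signing makes verification fail, which is the property the signed-API-call requirement is after.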
Purpose specification appears to be a requirement for requesting FI. ReBIT provides Purpose Definitions which include the categories Personal Finance, Financial Reporting and Account Query and Monitoring with further subcategories (ReBIT, n.a.). It can be assumed that the request for consent created by the FIU stipulates the relevant Purpose Definition(s).
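Purpose specification can be enforced mechanically by validating each request against a whitelist of purpose categories. The category names below come from ReBIT's published definitions as described above; the code structure and identifiers are our own illustration, not the official ReBIT codes:

```python
# Purpose categories named in ReBIT's Purpose Definitions; the keys
# used here are placeholders, not the official ReBIT purpose codes.
ALLOWED_PURPOSES = {
    "PERSONAL_FINANCE": "Personal Finance",
    "FINANCIAL_REPORTING": "Financial Reporting",
    "ACCOUNT_QUERY_MONITORING": "Account Query and Monitoring",
}

def validate_purpose(request: dict) -> bool:
    """Reject any consent request whose purpose is not whitelisted."""
    return request.get("purpose") in ALLOWED_PURPOSES

print(validate_purpose({"fiu_id": "FIU-001", "purpose": "PERSONAL_FINANCE"}))  # True
print(validate_purpose({"fiu_id": "FIU-001", "purpose": "MARKETING"}))         # False
```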
In this flow, the privacy and security measures appear to point to the PbD principles of Positive Sum not Zero Sum and End-to-End Security – Full Lifecycle Protection.
Upon receiving the request for consent from the FIU, the AA performs the function of notifying the user that their information is being queried. The user is notified of the FIU, and the classes of information that this FIU is seeking. The AA provides a user interaction front-end (such as a web portal or a mobile device app) to the user through which they may receive these notifications (ReBIT, 2019; NESL Asset Data Limited, 2018).
The IT security measures for this query flow stipulate that the user be required to log-in to a web portal (provided by the AA) or the mobile app to either accept or reject the request for consent from the AA. This web portal or mobile app should be placed in DMZ and must be firewalled. It is also stipulated that a one-time password (OTP) be used for authentication for giving or revoking consent by the user (NESL Asset Data Limited, 2018).
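The OTP step can be sketched as follows. The validity window, the single-use behaviour and all helper names are our assumptions for illustration; the specifications stipulate only that an OTP be used:

```python
import secrets
import time

OTP_VALIDITY_SECONDS = 180  # assumed window; the specs do not fix one here

_issued = {}  # user_id -> (otp, issued_at)

def issue_otp(user_id: str) -> str:
    """Generate a 6-digit one-time password for a consent action."""
    otp = f"{secrets.randbelow(10**6):06d}"
    _issued[user_id] = (otp, time.time())
    return otp

def verify_otp(user_id: str, otp: str) -> bool:
    """Accept the OTP only once, and only within the validity window."""
    record = _issued.pop(user_id, None)  # pop: one attempt per OTP
    if record is None:
        return False
    issued_otp, issued_at = record
    fresh = (time.time() - issued_at) <= OTP_VALIDITY_SECONDS
    return fresh and secrets.compare_digest(issued_otp, otp)

code = issue_otp("user-42")
print(verify_otp("user-42", code))  # True on first use
print(verify_otp("user-42", code))  # False: OTP already consumed
```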
In this flow, the privacy and security measures appear to point to the PbD principles of Positive Sum not Zero Sum and Visibility and Transparency – Keep it Open.
If the user approves the request for consent from the AA, a digitally signed artefact is generated and sent to the FIP along with the request for information initiated by the FIU. The FIP may store the consent artefacts to validate against future requests of information with respect to the same user and FIU (ReBIT, 2019; NESL Asset Data Limited, 2018).
It is stipulated that this query flow, and all other internal networks be firewalled (NeSL Asset Data Limited, 2018).
In this flow, the privacy and security measures appear to point to the PbD principle of Positive Sum not Zero Sum.
When the user receives the notification from their AA web portal or mobile app, the user may approve or reject such a request. If the user approves the request for consent, the consent artefact thus created is sent to the AA (Reserve Bank of India, 2018).
The IT security measures for this data flow stipulate that all payloads be digitally signed by the requester (in this case, the consent artefact should be digitally signed by the user). The AA client never sees the account of the user or directly participates in consent generation (ReBIT, 2019). All personally identifiable information (PII) (such as account numbers, phone numbers, personal identifiers etc.) present in the User Identifier section of the consent artefact must be tokenised using virtual IDs. The internet-facing services (the AA front-end portal provided by the AA) must be placed in the DMZ (NeSL Asset Data Limited, 2018).
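The requirement that PII in the User Identifier section be tokenised using virtual IDs can be sketched with a keyed, deterministic token. The keyed-hash construction below is our illustration; the specifications do not prescribe a particular tokenisation scheme:

```python
import hashlib
import hmac

# Key held by the tokenising entity; never shared with FIUs.
TOKEN_KEY = b"demo-tokenisation-key"

def virtual_id(pii_value: str) -> str:
    """Replace a PII value (account no., phone no., ...) with a token.

    A keyed hash is deterministic, so the same account always maps to
    the same virtual ID, but the token cannot be reversed without the key.
    """
    return hmac.new(TOKEN_KEY, pii_value.encode(), hashlib.sha256).hexdigest()[:16]

artefact_user_section = {"account_no": "1234567890", "phone": "9800000000"}
tokenised = {k: virtual_id(v) for k, v in artefact_user_section.items()}
print(tokenised)  # no raw PII leaves the User Identifier section
```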
In this flow, the privacy and security measures appear to point to the PbD principles of Positive Sum not Zero Sum, End-to-End Security – Full Lifecycle Protection and Visibility and Transparency – Keep it Open. However, it is difficult to state whether the principle of Respect for User Privacy – Keep it User-Centric has been fulfilled, given the treatment of consent as a sufficient safeguard for user privacy.
Once the AA forwards the consent artefact to the FIP, the FIP validates the same and responds with the information requested for in the consent artefact. When the FIP notifies the AA that the required information is available, the AA pulls the information from the FIP and stores it in its transient store. The required information must be returned in real-time.
The IT security measures for this data flow stipulate that API calls between the AA and the FIP take place over SSL, that the information be transferred only once the consent artefact has been validated, and that the returned information be encrypted, digitally signed and in a machine-readable format. Any information stored in the transient store of the AA must also be encrypted (NeSL Asset Data Limited, 2018).
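The rule that information be transferred only once the consent artefact has been validated can be sketched as a gate on the FIP side. The field names and the expiry check are our assumptions for illustration, not fields from the actual artefact schema:

```python
import time
from typing import Optional

def validate_consent_artefact(artefact: dict) -> bool:
    """FIP-side gate: release FI only for a valid, unexpired artefact."""
    required = {"user_vid", "fiu_id", "purpose", "expires_at", "signature_ok"}
    if not required.issubset(artefact):
        return False
    if not artefact["signature_ok"]:  # stands in for signature verification
        return False
    return artefact["expires_at"] > time.time()

def transfer_fi(artefact: dict, fi_record: bytes) -> Optional[bytes]:
    """Return the (already encrypted) FI only when the artefact passes."""
    return fi_record if validate_consent_artefact(artefact) else None

ok = {"user_vid": "ab12", "fiu_id": "FIU-001", "purpose": "PERSONAL_FINANCE",
      "expires_at": time.time() + 3600, "signature_ok": True}
print(transfer_fi(ok, b"<encrypted FI>") is not None)    # True
print(transfer_fi({**ok, "signature_ok": False}, b"x"))  # None
```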
In this flow, the privacy and security measures appear to point to the PbD principles of Positive Sum not Zero Sum and End-to-End Security – Full Lifecycle Protection.
After pulling the information from the FIP and storing it in the transient store, the AA notifies the FIU that the required information is ready. The FIU then fetches this information from the AA.
The IT security measures for this data flow stipulate that the users’ information never be decryptable while in the transient store of the AA. The data-in-transit (FI) must also be encrypted. The AA is required to stipulate a time frame within which the FIU must pull the information from the transient store of the AA. The information may be stored in the AA for a maximum of 72 hours. Finally, it is also required that once the information has been pulled by the FIU, the AA must delete the information from its transient store (NeSL Asset Data Limited, 2018).
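The transient-store rules described above (a 72-hour retention cap, and deletion once the FIU pulls the data) can be sketched as a small store. The class and method names are ours; only the retention behaviour is taken from the stipulations:

```python
import time
from typing import Optional

MAX_RETENTION_SECONDS = 72 * 3600  # 72-hour cap stipulated for the AA

class TransientStore:
    """Holds encrypted FI briefly; deletes on fetch or on expiry."""

    def __init__(self):
        self._items = {}  # session_id -> (ciphertext, stored_at)

    def put(self, session_id: str, ciphertext: bytes) -> None:
        self._items[session_id] = (ciphertext, time.time())

    def fetch(self, session_id: str) -> Optional[bytes]:
        """Return the data once, then delete it (delete-on-read)."""
        item = self._items.pop(session_id, None)
        if item is None:
            return None
        ciphertext, stored_at = item
        if time.time() - stored_at > MAX_RETENTION_SECONDS:
            return None  # expired data is dropped, never served
        return ciphertext

store = TransientStore()
store.put("sess-1", b"<encrypted FI>")
print(store.fetch("sess-1"))  # b'<encrypted FI>'
print(store.fetch("sess-1"))  # None: already pulled and deleted
```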
In this flow, the privacy and security measures appear to point to the PbD principle of End-to-End Security – Full Lifecycle Protection.
Our assessment of the AA framework reveals some causes for optimism, and some for concern.
Theoretically, the DEPA framework provides the user with complete control over how their information is used. Users can also provide, pause and revoke their consent through the web portal or mobile app of the AA.
However, this approach assumes that consent is a sufficient condition to provide user privacy. It also assumes that the user is comprehensively aware of the consequences of the consent they are providing, which is often not the case. Consent is important in providing agency and autonomy of choice to users, but cannot be considered a holistic, user-based privacy manager. The insufficiencies of consent have been discussed in the previous blog of this series.
From our analysis, we observed that some PbD principles appear to be fulfilled, at least partially if not completely. The PbD principles of Positive Sum not Zero Sum, End-to-End Security – Full Lifecycle Protection and Visibility and Transparency – Keep it Open appear to be accounted for in the architecture of the AAs.
However, other significant principles of Proactive not Reactive; Preventative not Remedial, Privacy as the Default Setting, Privacy Embedded into Design and Respect for User Privacy — Keep it User-Centric do not seem to be fulfilled in this architecture. While there may be aspects in the query and data flows that may point to these principles, on an overarching level, it cannot be said that these principles have been incorporated.
For instance, this is reflected in the fact that there are no technical safeguards to ensure that FIU entities do not misuse the personal data they receive from AAs. This does not incorporate the principle of Proactive not Reactive; Preventative not Remedial. We must ensure that FIUs have sound cyber security practices and codes of conduct before they are integrated into the AA ecosystem.
A central issue these flows force us to ask is: how will data be protected after an FIU receives it from an AA and it disappears into the FIU’s systems? It appears that the first time a consumer’s data is shared through AAs, it would take place in an ecosystem designed in a secure and privacy-protecting manner. If the data is then ejected into an FIU with an insecure, privacy-depleting system, it would defeat the entire purpose of secure data-sharing infrastructures.
The accountability of FIUs and FIPs with respect to the personal data they receive through the AA system needs clarification. Mere legal threats of enforcement will be insufficient – strong technical solutions to address this risk must be developed to ensure privacy is protected after entities receive information from AAs.
These concerns need to be addressed before a full-fledged public release into the market.
It appears that most cybersecurity and privacy-protecting practices for large data infrastructures have been proposed for the AA ecosystem (European Union Agency for Network and Information Security, 2015). Practices of external security audits and suspicious transaction pattern monitoring have also been stipulated. However, safeguards on the use of FI after it has been obtained by the FIU, and other such provider obligations, cannot be seen. If an FIU were to be considered a data fiduciary, it is unclear whether they would be obliged to notify data breaches occurring internally in their organisation. There appear to be no technological safeguards hardwired into the architecture of the ecosystem that prevent misuse of FI by FIUs; legal obligations on FIUs are deemed to be sufficient.
From our analysis of the AA architecture using the PbD framework, we see that some principles of the framework have been incorporated partially, but there appear to be several disjoint points where user privacy does not take priority. This is especially important to consider given that most Indians would be first-time users of such technology, or even of enabling technological infrastructures such as mobile phones or the internet. Separately, the draft PDP bill stipulates Privacy by Design as a Transparency and Accountability measure that must be implemented by every data fiduciary.
While the Data Empowerment and Protection Architecture (DEPA) makes for a commendable first step towards empowering Indians with control over their personal data, it must be noted that consent cannot operate independent of appropriate accountability measures for providers and data fiduciaries. Legal obligations are important for this, but there needs to be strong technological backing, in terms of codifying concepts of purpose limitation and collection limitation, in order to safeguard consumers and their personal information.
Cavoukian, A. (2011). Privacy by Design – The 7 Foundational Principles. Retrieved from Information and Privacy Commissioner of Ontario: https://www.ipc.on.ca/wp-content/uploads/resources/7foundationalprinciples.pdf
European Union Agency for Network and Information Security. (2015, December). Big Data Security: Good Practices and Recommendations on the Security of Big Data Systems. Retrieved from European Union Agency for Network and Information Security (ENISA) : https://www.enisa.europa.eu/publications/big-data-security/at_download/fullReport
MEITy. (2018). The Personal Data Protection Bill, 2018. Retrieved from Ministry of Electronics and Information Techonology: https://meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill,2018.pdf
NeSL Asset Data Limited. (2018, June 28). Request for Proposal for Selection of Vendor for Design, Development, Installation, Integration, Configuration, Support. Retrieved from National e-Governance Services Limited (NeSL): https://www.nesl.co.in/wp-content/uploads/2018/06/NADL_RFP_26062018-update.pdf
ReBIT. (2019). Account Aggregator API. Retrieved from Swagger: https://swagger-ui.rebit.org.in/?url=https://s3.ap-south-1.amazonaws.com/api-spec-prod/api_specifications/account_aggregator/AA_1_1_1.yaml#/Consent%20Flow/post_Consent
ReBIT. (n.a.). Account Aggregator Purpose Definition. Retrieved from ReBIT: https://api.rebit.org.in/purpose
Reserve Bank of India. (2018, February 23). Master Direction- Non-Banking Financial Company – Account Aggregator (Reserve Bank) Directions, 2016 (Updated as on February 23, 2018). Retrieved from Reserve Bank of India: https://www.rbi.org.in/Scripts/NotificationUser.aspx?Id=10598&Mode=0
Zuppo, C. M. (2012). Defining ICT in a Boundaryless World: The Development of a Working Hierarchy. International Journal of Managing Information Technology (IJMIT), 13-22. Retrieved from https://s3.amazonaws.com/academia.edu.documents/38212933/4312ijmit02.pdf
 Information and communications technology (ICT) is an extensional term for information technology (IT) that stresses the role of unified communications and the integration of telecommunications (telephone lines and wireless signals) and computers, as well as necessary enterprise software, middleware, storage, and audiovisual systems, that enable users to access, store, transmit, and manipulate information (Zuppo, 2012).
 It must be noted that this table is only representative of the use-case of the FIU seeking information of a user from the FIPs that have the information of the user. It does not cover the other proposed functions of AAs such as providing users with consolidated views of their FIs, users seeking their own data from FIPs, allowing users to manage previously given consents etc. (NESL Asset Data Limited, 2018; Reserve Bank of India, 2018).
 For this analysis, we refer to the following available documentation: Master Direction- Non-Banking Financial Company – Account Aggregator (Reserve Bank) Directions, 2016 released by the RBI (Reserve Bank of India, 2018), Account Aggregator Key Resources on the Sahamati website and the Request for Proposal (RFP) for Selection of Vendor for Design, Development, Installation, Integration, Configuration, Support & Maintenance of Account Aggregation software by NESL Asset Data Limited (NeSL Asset Data Limited, 2018).
 Application Programming Interfaces or APIs are a set of definitions or protocols used for building or integrating different technological platforms to provide convenience in their interaction for services and/or data. For instance, Uber uses the publicly available API of Google Maps on their interface.
 Demilitarized zone, or DMZ is a sub-network that provides an additional layer of protection to the local network of a given organisation or system. The DMZ is located behind the firewall and is available to the public, i.e., the public can access resources and services present in the DMZ without being able to penetrate the local network.
 Other Purpose Codes include Personal Finance (for wealth management services and obtaining customer spending patterns, budgets or other reporting), Financial Reporting (for aggregated statements) and Account Query and Monitoring (for periodic or one-time consent for monitoring of accounts) (ReBIT, n.a.).
 Tokenisation refers to the process of replacing sensitive data or personally identifiable data with unique identification symbols (referred to as the token) that retain all the essential information about the data without compromising its security. For instance, in 2018, the Unique Identification Authority of India (UIDAI) issued a circular stating that Aadhaar holders could start using temporarily-generated virtual IDs (tokens) in lieu of sharing their Aadhaar numbers for authentications for availing various services.
 Data Fiduciary is a term used in the draft Personal Data Protection Bill, 2018 to refer to any person, including the State, a company, any juristic entity or any individual who alone or in conjunction with others determines the purpose and means of processing of personal data (MEITy, 2018).
 S(29), Chapter VII, draft Personal Data Protection Bill, 2018 (MEITy, 2018).
The Reserve Bank of India (RBI) released Master Directions on Non-Banking Financial Companies – Account Aggregators (Master Directions) in September 2016, and licences for India’s first Account Aggregators (AAs) were issued last year. From these guidelines and related documents, we understand that the purpose of an Account Aggregator (AA) is to collect and share:
Given that the AA infrastructure is aimed at harnessing the value of consumers’ personal data, does it sufficiently protect them during the data sharing process? We will consider some answers in this two-part blog series. In this post, we consider the motivations for AAs, and specifically look at the consumer protection concerns that arise if consent becomes the main strategy for user protection in a data sharing infrastructure.
The key motivation for AAs appears to be to break down silos of data and enable encrypted sharing of data between firms, with the consent of the consumer. The appeal of AAs is that they can provide one-tap access to information for financial service providers. This can drive down the supply-side costs of disbursing credit. These costs include transaction costs relating to customer identification, data gathering and due diligence, as well as related costs of staff and offices to undertake data gathering activities (George & Sahasranaman, 2013). Building this kind of infrastructure for sharing consumer data could, to a certain extent, reduce the costs of credit disbursement that arise from physical documentation and manual customer data collection, for every additional unit of credit (i.e. for every loan). This cost saving could apply to all financial products, since most require some degree of customer verification and due diligence documentation.
From the consumers’ perspective, the swifter sharing of their information may enable quicker service delivery. Financial benefit for consumers will depend on whether entities pass on the cost savings they may make or use data effectively to provide more suitable products and services.
Institutions who are seeking to use the AA system will need to invest resources to enable the kind of data-sharing that is envisioned. Additionally, AAs themselves will need to be built securely and effectively. The incentive for entities to participate in this system will be dependent on the quality of the data as well as how secure the AA ecosystem is. To protect consumers whose data is being shared, it is also important to consider if the architecture envisioned in the AA is sufficiently accountable, privacy-protecting and consumer-friendly.
The AA system adopts a system of obtaining user consent before data-sharing. This system is called the Data Empowerment and Protection Architecture (DEPA).
The DEPA framework, developed by Indiastack, allows users to control the sharing and usage of their personal data amongst various services and entities. The framework also maintains the necessary safeguards for the privacy of the data, i.e. the data is used and shared only to the extent that the user allows it for a stipulated purpose. The framework is designed to be transparent, and all transactions will be made traceable and auditable (Indiastack, n.a.).
From the description above, it appears that the DEPA framework seeks to put individuals on notice that their data is being requested. It then asks for their granular consent on whether particular information records about them can be shared or not shared. This form of permission-based data sharing is known as the “Notice and Consent” model in data protection scholarship (Kemp, 2017). Under this approach, the consumer is provided with notice prior to their data being collected, informed of how it might be used in the future, and asked for their consent to such use.
Although this might appear to grant agency and autonomy to users, in practice the Notice and Consent model has many failings: it does not adequately protect users’ interests and, in most cases, presents a false choice to users.
Personal data is intangible and its use results in benefits and harms which are not immediately apparent to users. Research is beginning to show that there are severe cognitive limitations that impair individuals’ ability to make informed and rational choices about the costs and benefits of consenting to the collection, use, and disclosure of their personal data (Solove, 2013). Acquisti (2004) has highlighted how immediate gratification bias — due to which individuals overvalue the immediate period as compared to all future periods – can cause individuals to make suboptimal privacy decisions. As a result of this bias, individuals have been shown to choose to receive an immediate gain from data sharing (or avoid immediate costs of protecting data) and discount costs of possible future risks. They are unable to process the effect of cumulative risk over all future periods (Acquisti, 2004).
A major structural issue today is the binary nature of the choice consumers face: they can either “agree” to the terms on which their data is collected under providers’ privacy notices, or disagree and be denied the service. This “take it or leave it” scenario leaves consumers with no real choice and impacts how individuals behave when agreeing to the terms under which their personal data is collected, processed and shared (Bailey, Parsheera, Rahman, & Sane, 2018).
The concept of Account Aggregators solves a significant problem of financial data aggregation for individuals and small and medium enterprises alike. It also appears to take consumer data protection and privacy seriously for the first time, unlike previous large architectures. However, to be effective it must be supported by strong accountability systems and access controls that operate independent of consent. Relying solely on consent is not a good idea: a wealth of data protection and consumer protection scholarship has shown that consent is necessary but not sufficient for data protection.
A broader concern is that a highly smartphone-dependent user interface such as DEPA’s may not address the needs of the substantial feature-phone audience in India, who may not have access to reasonably good internet connections and electricity. As Account Aggregators gain traction in the financial community, deliberations about their impact on different demographics of the population become important, especially as this comes during a time of increased promotion of a more digital India. Consequently, it is important to improve on consent interfaces for feature phones. This must be done with the knowledge that consent is a necessary but insufficient safeguard for users’ data.
One significant lesson from our last decade of building public data infrastructures in India has been that such projects must carefully consider potential vulnerabilities in the system that could expose people to harm if their data is compromised, misused or inaccurately recorded. In India, we would do well to learn from our past experience with creating large public infrastructure that collates personal information of individuals. The implementation of a new large data infrastructure must ensure that data protection & privacy concerns that have been raised in the past are not replicated, or worse, magnified (Omidyar Network, 2018; Khera, 2019; Sharma, 2019).
It is heartening to see that DEPA is one step towards doing so, but many more features would be necessary to ensure privacy and security by design. How does the AA infrastructure fare when analysed against Privacy By Design principles? We will analyse this in our next post.
In the second post of this series, we undertake an analysis of the technical standards and specifications present across publicly available documents on Account Aggregators. We map the technical standards to the seven principles of Privacy by Design (PbD) and deliberate on the privacy and data protection afforded by the architecture of AAs.
Bailey, R., Parsheera, S., Rahman, F., & Sane, R. (2018, December 11). Disclosures in privacy policies: Does “notice and consent” work? Retrieved from NIPFP Working Paper Series: https://www.nipfp.org.in/media/medialibrary/2018/12/WP_246.pdf
George, D., & Sahasranaman, A. (2013, April). Cost of Delivering Rural Credit in India. Retrieved from Dvara Research: https://www.dvara.com/research/wp-content/uploads/2013/04/Cost-of-Delivering-Rural-Credit-in-India.pdf
Indiastack. (n.a.). ABOUT DATA EMPOWERMENT AND PROTECTION ARCHITECTURE (DEPA). Retrieved from Indiastack: https://indiastack.org/depa/
Kemp, K. (2017, August 22). Big Data, Financial Inclusion and Privacy for the Poor. Retrieved from Dvara Research: https://www.dvara.com/blog/2017/08/22/big-data-financial-inclusion-and-privacy-for-the-poor/
 According to section 3(1)(ix) of the Master Direction- Non-Banking Financial Company – Account Aggregator (hereafter, NBFC-AA Master Directions), financial information (FI) is defined as, “information in respect of the following with financial information providers: a) bank deposits including fixed deposit accounts, savings deposit accounts, recurring deposit accounts and current deposit accounts, b) Deposits with NBFCs c) structured Investment Product (SIP) d) Commercial Paper (CP) e) Certificates of Deposit (CD) f) Government Securities (Tradable) g) Equity Shares h) Bonds i) Debentures j) Mutual Fund Units k) Exchange Traded Funds l) Indian Depository Receipts m) CIS (Collective Investment Schemes) units n) Alternate Investment Funds (AIF) units o) Insurance Policies p) Balances under the National Pension System (NPS) q) Units of Infrastructure Investment Trusts r) Units of Real Estate Investment Trusts s) Any other information as may be specified by the Bank for the purposes of these directions, from time to time” (Reserve Bank of India, 2016).
 Section 3(1)(xii) of the NBFC-AA Master Directions.
 Section 3(1)(xi) of the NBFC-AA Master Directions.
 The Report of the Committee on Financial Inclusion, released in January 2008, identifies the cost of transactions for small credit accounts (of up to INR 25,000) as a percentage of the loan amount. Broadly, the process steps contributing to this cost include (i) selection of applicants, (ii) carrying out post-sanction inspections, (iii) establishment costs, and (iv) documentation costs. The transaction cost of credit as a percentage of the loan amount was found to be 12.95% and 8.62% in observations from Central Bank of India and ICICI Bank respectively (C. Rangarajan Committee on Financial Inclusion, 2008).
 Immediate gratification bias is closely related to, but slightly different from, hyperbolic discounting. Under hyperbolic discounting, the rate at which future utilities are discounted declines the further away the period is, while under immediate gratification bias individuals disproportionately overvalue the present period and discount all future periods uniformly.
On 20th August, 2019, the Attorney General of India, K.K. Venugopal, submitted to the Supreme Court that there was a need to link the social media profiles of users with their Aadhar numbers, and, if required, to have platforms like Facebook and WhatsApp share this number (which acts as a unique identity) with law enforcement agencies to help detect crimes. This, he argued, was needed to check fake news, defamatory articles, anti-national content, etc. This post examines the legality of this proposed move in light of the Puttaswamy decisions, as well as the fundamental rights enshrined in Articles 19 and 21.
Briefly characterised, social media refers to any interactive, computer-mediated technology that enables the creation and dissemination of ideas, information, opinions, career interests, and other kinds of expression through virtual communities and networks. Almost every brand, educational institution and government has a social media presence today, which facilitates direct, one-on-one communication with users.
The Supreme Court, in the Puttaswamy v. Union of India case of 2017 [2017 (10) SCALE 1], declared the right to privacy as a fundamental right under Article 21 of the Constitution. This was unanimously held by a 9-judge bench, which overruled previous decisions that had held the right to privacy as outside the scope of Part III of the Constitution.
This case arose as a challenge to the Aadhar framework, a biometric-based identity scheme which the government wished to make compulsory for persons seeking government benefits and services. Puttaswamy asserted that the Aadhar scheme violated the right to privacy. The Attorney General argued, however, that the right to privacy is not a fundamental right in India, citing Kharak Singh v. Uttar Pradesh and M.P. Sharma v. Satish Chandra. A nine-judge Constitution bench was set up to decide the privacy question conclusively.
The bench unanimously overruled Kharak Singh and M.P. Sharma to the extent that they are inconsistent with the present judgment – and held that the right to privacy was a fundamental right under Article 21. The court noted that the right to bodily integrity, autonomy over personal decisions, and protection of personal information – all fall within the right to privacy. At the same time, the court also noted that this right was not absolute – it permitted exceptions, should there be a legitimate aim of the state, and the invasion of privacy was proportional to the object sought to be achieved.
In a 2018 petition by Puttaswamy himself [2018 (4) SCALE 541], the constitutional validity of the Aadhar framework (under the Aadhar Act, 2016) again came into question. The majority opinion held that the Act was legal and intra vires the Constitution in most respects. However, it was clarified that only those benefits and services that were in the nature of a ‘subsidy’ or a ‘government welfare scheme’ could require linking of Aadhar. Institutions like CBSE, NEET, JEE, UGC etc. could not be permitted to make Aadhar linking mandatory, since their services do not qualify as a subsidy or government scheme.
It was observed that the Aadhar issue presented a point of intersection between two fundamental rights under Article 21 – the right to personal autonomy and privacy, and the right to live with dignity. Human dignity has three elements – Intrinsic Value, Autonomy and Community Value – and these are to be looked at by the court in ‘hard cases’. The balancing of both the above rights is important and this function lies with the courts. The excessive infringement into personal autonomy for ensuring socio-economic dignity of the community cannot be permitted.
There is a strong case against the proposed linking of social media accounts with the Aadhar scheme. In the 2017 Puttaswamy decision, the test laid down to determine the legitimacy of any invasion of privacy was twofold: first, there must be a legitimate state aim justifying the invasion of privacy, and second, the infringement must be proportional to the object sought to be achieved. The first limb can readily be satisfied, since curbing fake news, defamatory content, etc. does come across as a legitimate goal of the State. The proportionality of the measure, however, cannot easily be justified: linking social media accounts to Aadhar would necessarily involve a highly intrusive presence of the state in our daily lives, and make it difficult for people to express their opinions without fear of incarceration. With Aadhar details linked to a social media profile, any expression of disaffection with the government of the day can be traced back to the individual making the statement, who would then be liable to prosecution. Such a constant presence of the state in the lives of individuals can only be justified in a totalitarian state.
Also, under the 2018 Puttaswamy decision, the Aadhar ecosystem cannot be linked to social media services. It was clearly laid down here that only services in the nature of a subsidy or a government welfare scheme can be linked, and social media falls in neither of these categories. Therefore, the linking of Aadhar numbers to social media accounts would be in violation of the court’s ruling.
A notable point is the precarious situation that such linking would create for the right to freedom of speech and expression, guaranteed under Article 19(1)(a) of the Constitution, in light of the recent amendments to the Unlawful Activities (Prevention) Act, which received presidential assent on August 8, 2019. Any individual believed by the government to have committed or participated in acts of terrorism, prepared for terrorism, promoted terrorism, or otherwise been involved in terrorism, can now be declared a terrorist and taken into custody without being charged with any offence. Additionally, the amendment places the burden of proof on the person so declared to show that he or she is not a terrorist. This amendment, coupled with the linking of social media profiles to Aadhar, would create a draconian situation in which a person’s posts on social media could directly result in indefinite custody, a harm that cannot be remedied even if the person is later proved innocent. This would effectively render the right under Article 19(1)(a) illusory and cosmetic.
This move would also bolster the already-rampant use of the unconstitutional section 66A of the Information Technology Act. In Shreya Singhal v. Union of India [(2015) 5 SCC 1], it was held that this section – which prohibited the dissemination of information with the intention to cause annoyance, inconvenience or insult – was violative of Articles 19 and 21 of the Indian Constitution, and was struck down as unconstitutional from its insertion in 2009. However, a working paper of the Internet Freedom Foundation shows that even today, pending prosecutions under this section have not been terminated, and police across India still include it in FIRs. Linking social media accounts to Aadhar would increase such use of section 66A, as tracing content and information back to individuals would become easier and more persons could be charged under the section.
The proposed move would be completely unconstitutional, and a blot on the rights to privacy, free speech and expression, and a life of dignity. It is essential that social media be a ‘free’ platform, where individuals can speak their minds without the fear of being incarcerated for it. The growing popularity of social media has made millions of Indians its regular users, and a lot of people’s daily communication with each other takes place through social media. Social media being policed in the proposed manner would create a draconian atmosphere and go far beyond the intended purpose of checking fake news, pornography, seditious material, etc. It would further bolster the culture of repression that exists within our country and also potentially result in many individuals being detained and tried in gross violation of their right to life.
With the advent of smartphones and numerous interactive mobile applications, listening to music through apps has become a common phenomenon across the world. However, this has created a number of intellectual property issues in various jurisdictions, including India. Section 31D of the Copyright Act, inserted through the Amendment of 2012, deals with statutory licensing for radio and television broadcasting of literary and musical works as well as sound recordings. Broadcasters are required to pay royalties to the copyright owner at a rate fixed by the Copyright Board. A broadcaster wishing to communicate a published work must notify the copyright holder in advance, with the notice containing information such as the broadcast content’s length and the region it will cover. The provision has been heavily criticised because it bars parties from entering into commercial negotiations to determine royalty rates. Moreover, copyright owners are given no mechanism to negotiate royalty terms with broadcasting agencies, which appears to violate Article 19(1)(g) of the Indian Constitution.
The objective of copyright law is to protect the public interest while disseminating knowledge. To encourage further creativity, the author needs to be rewarded; however, these interests must also be balanced against competition. Non-voluntary licensing is a statutory licensing system that increases the accessibility of works while maintaining the interests of the author. The Copyright (Amendment) Bill, 2010, introduced in the Rajya Sabha, added Section 31D to the statute at a time when radio broadcasters were suffering losses and a large section of the Indian population still lacked access to television broadcasting. The Indian Broadcasting Foundation indicated that pre-decided terms and conditions would give broadcasters certainty about terms and expenses, and that the number of disputes would decline, since authors could no longer impose unreasonable and arbitrary conditions. By contrast, organizations like the Indian Music Industry vociferously protested against such licensing. They argued that it is discriminatory, since it gives the copyright owner no decision-making power to fix royalty rates. They also pointed out that broadcasting facilities are in the private sector: radio broadcasters run at a profit and already receive concessions from the government, while television broadcasters are stable enough not to require such concessions.
The Committee, however, rejected these arguments, asserting that radio and television broadcasting had relied on unfair voluntary licensing. It noted the numerous conflicting decisions of different High Courts over the interpretation of Section 31 when dealing with statutory licensing. In a nation with a rapidly growing broadcasting sector, the Committee observed a pressing need for easy access to works. The amendment’s fundamental goal was to align the Indian copyright system with the international treaties to which India is a signatory. The amendment ensures that fair use remains applicable through specific clauses dealing with the digital era, in order to remain in sync with technological advances. However, with respect to online dissemination, the amendments go a step beyond the treaties in their treatment of statutory limitations.
Section 31D has been constitutionally challenged before the Supreme Court in Lahari Recording Company, as well as before the Calcutta High Court in Eskay Video Pvt Ltd, on grounds of being ultra vires Articles 14, 19(1)(g), 21 and 300A, as it does away with the commercial understanding between copyright owners and broadcasters.
The office memorandum (‘OM’) issued by the Department for Industrial Policy and Promotion (‘DIPP’) brings ‘internet broadcasting’ within Section 31D by relying on the definition of ‘communication to the public’. It even states that satellite communication and other methods of simultaneous communication to more than one household fall within this definition. It thus assumes that the section covers distinct broadcasting methods, in contrast to the plain reading, which covers distinct broadcasting classes. The DIPP also fails to take into account that all broadcasting methods do not follow the same set of rules: both the Act and the Rules distinguish between broadcasting methods in terms of notice delivery, the setting of distinct royalty rates, and so on. Owing to this arbitrariness, the DIPP’s logic that all internet broadcasters follow identical guidelines runs counter to Article 14.
The provision is also challenged on the grounds of Articles 19(1)(g) and 21, as it affects the parties’ freedom to contract. The petitioners further claimed that non-voluntary licensing imposes an unnecessary price control on licences. Price control mechanisms seem justified for essential commodities and for a public purpose; it does not seem, however, that copyrights meet either requirement. Nevertheless, it is argued that the aim of such licences is to improve access to copyrighted works and thereby serve a public purpose. This justification still does not establish that a copyright can be treated as an essential commodity.
For the Article 300A challenge, the petitioners argued that depriving the copyright owners of their property without due process does not fulfill any public purpose. However, this has been considered to be a weaker argument.
In a recent Bombay HC judgement regarding the Wynk music application, it was held that online streaming services do not fall within the purview of traditional broadcasters, especially under Section 31D. The plaintiff, Tips Industries, an Indian music label, owns the copyright over a significant music repository which was licensed to Wynk. The licence could not be renegotiated owing to payment issues, and Wynk invoked Section 31D, claiming that as a broadcaster it was entitled to access the repository. According to the Court, Section 31D does not cover works that are downloaded and purchased.
In addition, the Court stated that the OM released by the DIPP was of a directory nature and lacked statutory backing, so it could not supersede the explicit text of the statute. The Court noted that the Legislature was well aware of the existence of such broadcasters, yet the statutory scheme still did not include them, and the Court could not go beyond the legislative intent. Addressing more ancillary arguments, the Court indicated that the defendant could not invoke the provision because the Board had not fixed any royalty rates; Rules 29, 30 and 31 explicitly state that prior fixation of royalty rates is necessary for Section 31D to be invoked. It can also be argued that, because of inherent qualitative differences, ‘on-demand streaming services’ cannot fall within the broadcasting sphere: in on-demand streaming the user selects what to play and when, whereas in broadcasting the user can only access content when the network chooses to broadcast it.
Spotify India has also been entangled in a legal battle with Warner Chappell Music over the use of the latter’s music repository. For now, the Bombay HC has passed an interim order restraining Spotify from invoking Section 31D and directing it to deposit a large sum of money with the Court. The Court has permitted temporary use of the repository, with the deposited money to be offset against the final disposition. Since the Wynk case was adjudicated before the Bombay HC, there is speculation that the Spotify judgement will follow similar lines on ‘internet broadcasting’.
With disputes over whether internet broadcasting falls under Section 31D, the Department for Promotion of Industry and Internal Trade (‘DPIIT’) has responded by formulating the Copyright (Amendment) Rules, 2019. Once in force, these rules will make it easier for music streaming organizations to acquire and upload material. The DPIIT invited stakeholders’ comments on the draft, and many have responded negatively. It has been argued that neither Section 78 nor any other provision of the Act permits the Central Government to alter the scope of any provision; the draft Rules, by altering the scope of Section 31D, are therefore ultra vires the Copyright Act and void. Moreover, the draft rules raise constitutionality concerns similar to those expressed about the 2016 OM, since they too overstep the statute. These internet broadcasters cannot be seen as undertaking the function of ‘communication to the public’, as anyone can broadcast over the internet through these platforms. Such broadcasting cannot be equated with traditional broadcasting, as that would confer undeserved property rights.
There are further doubts regarding the rights contained in sound recordings in relation to the underlying musical works. As per IPRS v Aditya Pandey, musical works that are licensed for sound recordings have rights subsisting within those recordings; thus, no separate licence is required for the underlying work, making the recording a bundle of rights. If this principle is applied, Spotify could merely rely on such precedents instead of going through elaborate statutory licensing, which is worrisome. On the other hand, the Copyright Office has issued interim licences under Section 31D(1) to Kuku and Koyal Internet, Ludhiana, as per an order of the Punjab and Haryana HC. This highlights the divergence of opinion among the HCs with regard to such statutory licences, requiring SC intervention.
Although the Government is attempting to keep copyright law in tune with technological developments, there is a long way to go. The Legislature must frame the law in a manner that balances the demands of the music industry and copyright owners. The distinction between making a copy and communicating to the public is insignificant here, as information in any model needs to be copied; this opens up larger issues such as the country’s need for Digital Rights Management. In addition, other jurisdictions such as the USA draw a clear distinction between interactive and non-interactive licences: an interactive licence is required for services like Apple Music, where the user selects the music they want to listen to, while a non-interactive licence covers radio-style streaming. The American Congress adopted the Music Modernization Act (‘MMA’) last year, which made it simpler for copyright owners to obtain royalties when their creations are streamed online. Under the MMA, owners will be paid through a single mechanical licensing database overseen by songwriters and labels, with digital streaming services bearing the costs. To introduce fair pay and competition, India too should enact such legislation while keeping the law in line with technology. What is needed, therefore, is revision of the statutory structure itself, not particular regulations that would supersede the parent Act.
The internet treaties are the WIPO Copyright Treaty and the WIPO Performances and Phonograms Treaty.
Dani Deahl, The Music Modernization Act has been signed into law, October 11, 2018, available at https://www.theverge.com/2018/10/11/17963804/music-modernization-act-mma-copyright-law-bill-labels-congress (Last visited on August 17, 2019).
Post-mortem privacy is defined as the right of a person to preserve and control what constitutes his or her reputation after death, and is inherently linked with the idea of dignity after death. One school of thought asks how there can be any threat to the reputation of a person who no longer exists. Another school argues that when a person’s public persona or reputation is harmed after death, it may not be the deceased who is defamed, but the ante-mortem person. A further question arises: when a person dies, do the surviving interests of the deceased become the interests of others, do they remain the deceased’s interests alone, or both?
Private law justification of post-mortem privacy
There is an English principle, “Actio personalis moritur cum persona”, which means that a personal cause of action dies with the person, implying a negative attitude towards post-mortem claims. However, certain EU states following the civilian tradition have allowed protection of the data of the deceased. Article 40(1) of the French Data Protection Act regulates the processing of data after an individual’s death: individuals can give instructions to data controllers providing general or specific indications about the retention, erasure and communication of their personal data after their death.
In Fairstar Heavy Transport N.V v. Adkins, Justice Edwards-Stuart considered whether a right to property could exist over the contents of an email. The case dealt with an employer’s request to access, on the personal computer of an ex-employee, the content of emails relating to the business affairs of the company. The question before the court was whether the claimant had any proprietary rights over the content of the emails. The court held that the contents of an email cannot be subjected to proprietary rights, and that the employer therefore had no enforceable proprietary claim over them. In deciding whether such a right could exist, the court considered five possible ways of construing it. First, title over the content of the email remains throughout with the creator or his principal. Second, upon an email being sent, title to the content passes to the recipient (by analogy with the vesting of title in a letter under the principles of transfer of property). Third, the recipient of an email has a licence to use its content for any legitimate purpose consistent with the circumstances in which it was sent. Fourth, the sender has a licence to retain the content and use it for any legitimate purpose. Fifth, title over the content is shared between the sender and all the recipients in the chain. The court analysed the viability of each of these methods of construing a proprietary right over information.
The court held that the implication of adopting the first method would be that the creator of an email could assert his title to the content against the whole world. This, the court opined, would be strange and would have far-reaching impractical consequences: if title over the content of an email remained with the creator, the creator must be able to exercise that title in all its forms, including by requiring recipients down the chain to delete the content. Such an exercise of title is neither feasible nor practical, rendering the option redundant. The court also rejected the second method, on the ground that if an email were forwarded to multiple recipients, the question of who held title over its content at any given point of time would become hopelessly confusing. The third and fourth methods conflate the existence of a proprietary right over the content of an email with the nature of the use of that information, that is, whether it is used for legitimate or illegitimate purposes; the court held that the nature of use should not bear on the exercise of a proprietary right of control. The fifth option was likewise rejected on the ground of compelling impracticality.
The advent of digital will in India: future of data protection of deceased individuals?
Section 1(4) of the Information Technology Act, 2000, read with its First Schedule, provides that the IT Act does not apply to a will as defined under clause (h) of section 2 of the Indian Succession Act, 1925, including any other testamentary disposition. Among foreign jurisdictions, the most discussed legislation on digital wills is the Fiduciary Access to Digital Assets and Digital Accounts Act, enacted by Delaware, which became the first state in the United States to give the executor of a digital will the same authority to take control of a digital asset. The 2016 Delaware Code revolves around the concept of a ‘digital asset’ and the idea of a ‘fiduciary’ as someone who can be trusted with that asset. The legislation defines a ‘digital asset’ as data, text, emails, audio, video, images, sounds, social media content, health care records, health insurance records, computer source codes, computer programs and software, user names, and passwords, created, generated, sent, communicated, shared, received or stored by electronic means on a digital device. It defines a ‘fiduciary’ as a personal representative appointed by a registrar of wills or an agent under a durable personal power of attorney, and provides that a fiduciary may exercise control over any and all rights in the digital assets and digital accounts of an account holder, to the extent permitted under state or federal law.
Data Protection Bill
The Data Protection Bill, 2018 provides for the “right to be forgotten” under Section 27. It refers to the ability of individuals to limit, de-link, delete, or correct the disclosure of personal information on the internet that is misleading, embarrassing, irrelevant, or anachronistic. When an individual passes away, his sensitive personal data remains online; absent regulation, his rights may be infringed repeatedly by data fiduciaries, with no remedy available, as the Bill does not take deceased individuals into consideration. The dynamic nature of data is such that it is not deleted on its own once the person is dead. The provisions that exist for living individuals could be applied to deceased individuals as well. Under Section 10 of the Personal Data Protection Bill, 2018, a data fiduciary can store data only for a limited period of time and use it only for the purpose for which it was collected. If the data principal wants to amend or remove any information, he has the right to do so, and the data fiduciary cannot, without legal sanction, prevent him from doing so. The current data protection regime fails to recognise and fulfil the need for protection of digital rights after death. It is worth considering whether the concepts of a “digital asset” and a “fiduciary”, as present in the Delaware legislation, can be emulated in India. Protection of data post death involves questions of digital succession as well as inheritable intellectual property rights, which must be taken into consideration while framing legislation on post-mortem privacy. The number of internet users in India was estimated at 566 million as of December 2018, registering annual growth of 18%. Considering this growth of internet use in India, it is pertinent to have a proper legal framework for the protection of the data of deceased individuals.
In re Estate of Ellsworth, No. 2005-296, 651- DE (Mich. Prob. Ct. May 11, 2005).
Fairstar Heavy Transport NV v. Adkins [2012] EWHC 2952 (TCC).