[The following post has been authored by Yashaswini Santuka, a third year student of NALSAR University of Law. This essay is part of an ongoing collaboration between r – TLP and the NALSAR Tech Law Forum Blog and is the second post in the series. The first entry can be found here, and the rest of the series is available here.]
Female healthcare and the technology built around it, like other women-centric issues, are often suppressed and kept away from the spotlight. This is the result of years of direct and indirect suppression of women and their autonomy (bodily or otherwise), which has, paradoxically, fuelled the popularity of technology aimed at “empowering” women. However, if the goal of tech-enabled health tracking apps is to enable people to make informed medical choices, femtech companies have built apps that stray well beyond it. They have blurred the line between healthcare and technology, going so far as to design apps primarily for men and to violate the privacy of the very people they were meant to benefit. This article addresses the blatantly discriminatory nature of these apps, the privacy issues that come with entering data into them, and the legal protection that users are entitled to.
The Origin, Growth and Impact of Femtech Companies
Femtech apps for fertility solutions, women’s sexual wellness, pregnancy and nursing care have arguably existed since the advent of mobile apps. However, they have only gained prominence in the past decade with the introduction of apps like Glow and Clue.
While these apps were initially seen as mediums to educate people about women’s health needs and to provide personalised resources, they have taken a sinister turn, evolving into purely capitalist ventures bent on exploiting users’ personal data. Alongside their stated goal of educating users about women’s health, these apps have built an ecosystem premised on the extraction of sensitive data, including the duration of users’ cycles, information about their sexual activity and positions, and the physical and emotional symptoms experienced through a cycle.
Professor Karen Levy identified this type of surveillance as ‘intimate surveillance’, stating that “Every technology of measurement and classification legitimates certain forms of knowledge and experience, while rendering others invisible.” In doing so, Professor Levy pointed to the reductive and highly discriminatory nature of these apps, which are built for a certain body and health type, rendering all other body types invalid for effective use of the app.
Women’s Health or Privacy?
The discriminatory nature and practices of these apps aside, it is the privacy concerns that this article seeks to identify and address. As mandated, these apps explicitly ask for permission before any data is entered. While this consent requirement nominally protects users, the uses to which the information is put remain opaque. The user knows that personal data is being collected, owing to the very nature of the app; however, she is rarely aware of the privacy policies that govern how such apps function.
Firstly, these apps claim in their privacy policies to anonymise whatever data they collect. Anonymisation, however, serves more as a tactic to deflect blame than as a safeguard, and it remains ineffective even when carried out with precision. This is because most apps require users to sign in with pre-existing email IDs or other accounts; once such an identifier is linked to the “anonymised” records, the data can still be traced back to its origin, revealing the user’s location and previously entered details like contact information and medical history. A significant case study is the research conducted by the Norwegian Consumer Council in 2020, which concluded that multiple apps – including popular applications like Clue – had been transferring personal data to third parties, who would in turn create digital profiles of users based on this data.
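To make the re-identification risk concrete, here is a minimal, hypothetical Python sketch. The data fields and email addresses are invented; the point is only that a hashed identifier is not true anonymisation, because anyone who already holds a list of candidate emails can reverse it.

```python
import hashlib

# Hypothetical: an app "anonymises" users by hashing their email
# address before sharing cycle data with a third party.
def pseudonymise(email: str) -> str:
    return hashlib.sha256(email.lower().encode()).hexdigest()

shared_record = {
    "user": pseudonymise("jane.doe@example.com"),
    "cycle_length_days": 29,
    "symptoms": ["cramps", "fatigue"],
}

# A third party with its own list of email addresses (from a data
# broker, a mailing list, a breach) can re-identify the record by
# hashing each known address and comparing.
known_emails = ["john@example.com", "jane.doe@example.com"]
lookup = {pseudonymise(e): e for e in known_emails}
print(lookup.get(shared_record["user"]))  # -> jane.doe@example.com
```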
A solution to this issue lies in encrypting the data that users log, which, though expensive, is a reasonable demand given the nature of the data being collected. Glow uses this method to comply with HIPAA. As with most remedies, encryption is not an infallible safeguard when compared to more substantive solutions such as secure servers or a clear privacy policy determining how the data is to be used. However, it would amount to a step in the direction of enforcing privacy rights with respect to these apps.
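As a rough sketch of what such client-side encryption could look like, consider the following example using the widely available Python cryptography library. The data fields are invented, and this illustrates the general technique only, not any particular app’s implementation.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Hypothetical sketch: encrypt a cycle-log entry on the user's device
# before it reaches the app's servers, so the provider stores only
# ciphertext it cannot read without the user's key.
key = Fernet.generate_key()          # kept on the device, never uploaded
cipher = Fernet(key)

entry = b'{"date": "2021-03-01", "symptoms": ["cramps"], "flow": "light"}'
ciphertext = cipher.encrypt(entry)   # what the server would store

# Only the key holder (the user) can recover the plaintext.
assert cipher.decrypt(ciphertext) == entry
```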
Secondly, when corporate employers themselves track the wellness of their employees to save on health insurance and the like, the disclosure of ever more personal details becomes inevitable. Of late, apps have also been created for men to track women’s health details and medical records. Unsurprisingly, these apps purport to indicate a woman’s mood, so as to help men avoid her during that period. This does little more than propagate the sexist stereotype of women as moody and irrational; such apps are blatantly discriminatory and immoral.
The Legal Angle and Liability
In countries like the USA, these data issues are the subject of robust discussion owing to legislation like the Health Insurance Portability and Accountability Act (HIPAA). HIPAA is a 1996 federal law that limits how and with whom healthcare providers can share a consumer’s health information. However, femtech apps are not governed by this legislation, as they are not run or owned by healthcare providers. They are products of large corporations that are free to set their own policies subject to minimal regulation. Such lacunae in the law make it convenient for these apps to continue using data without facing any legal repercussions.
Additionally, these apps do not offer the safeguards required to serve as channels for collecting professional health information. Over time, policymakers and state attorneys general took cognizance of this situation and acted. Members of Congress sent letters to the large corporations that own these apps, asking them to remove any period trackers that obtain users’ information without explicit notice. There have also been letters requiring the companies behind these apps to explain why they collect the data they ask for, and why they do so without seeking users’ explicit permission.
In contrast with the situation in the USA, the Indian government is seeking to create a digitised healthcare system through the National Digital Health Mission. How the law will translate into practice is not something we can comment on as of now. Digital literacy in India remains the preserve of the extremely privileged, and in the absence of specific sectoral regulations, it is general privacy law that governs the functioning of these applications. This makes the rapid promulgation of special laws addressing this niche issue unlikely.
Privacy-By-Design: A Potential Solution
Given how issues surrounding the use of personal data have taken the spotlight in recent times, it is vital that the insidious and sexist nature of femtech apps is highlighted with more urgency. It is ironic that some femtech companies spread awareness about user privacy while themselves lacking a robust privacy policy or what have come to be known as “privacy by design” measures. The adoption of such measures should be encouraged, as it embeds privacy throughout the design of the app; similar methods are already used by most corporates to protect their own data from exploitation. The benefits of this approach include end-to-end data encryption and the prioritisation of user privacy over all else. Such an approach to securing users’ privacy would enable users to trust the system and use it to their benefit more holistically.
Incorporating privacy-by-design measures would leave the user better informed about what data is being collected and who can access it. It would also give users the right to access their data, and to edit or remove it as and when they like. It is only through such measures that these apps can regain users’ trust and convince them that their data is not being misused without their permission.
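As a rough illustration of these user rights, here is a hypothetical Python sketch of a data store exposing access, rectification and erasure. The class and method names are invented for the example and do not reflect any existing app.

```python
# Hypothetical sketch of privacy-by-design data controls: the user can
# inspect, correct and erase every record the app holds about them.
class UserDataStore:
    def __init__(self):
        self._records: dict[str, dict] = {}

    def export(self, user_id: str) -> dict:
        """Right of access: return everything stored about the user."""
        return self._records.get(user_id, {}).copy()

    def update(self, user_id: str, field: str, value) -> None:
        """Right of rectification: let the user edit their own data."""
        self._records.setdefault(user_id, {})[field] = value

    def erase(self, user_id: str) -> None:
        """Right to erasure: delete all data on request."""
        self._records.pop(user_id, None)

store = UserDataStore()
store.update("u1", "cycle_length_days", 29)
print(store.export("u1"))   # {'cycle_length_days': 29}
store.erase("u1")
print(store.export("u1"))   # {}
```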
For a country like India, still in the nascent stages of developing a data governance system, the privacy-by-design model offers a viable solution. As the country sets about building a comprehensive data governance framework, it would be wise to learn from the issues previously faced in other countries. The benefits would be two-fold: a somewhat stronger data privacy regime that caters better to the demographic, and a reduced need to revamp the framework later to make it safe for use.
Conclusion
A significant instance of data being used as a medium to subjugate women, both trans women and cisgender women, is the datafication of their bodies. This was observed in India through Aadhaar and the biometric IDs assigned to each cardholder. Mass collection of data on gender would inevitably lead to the invisibilisation of certain groups, the most vulnerable among them being those whose identities are still not widely accepted by society.
As far as the Indian scenario is concerned, education remains the most important tool: users must be made to understand the nature of the violation, while companies must be encouraged to build tools into their apps that give consumers more control over how their data is harvested. This allows companies to continue providing their services while acknowledging that not all users are comfortable divulging such sensitive information.