[Sreaans Shukla is a second-semester B.A. LL.B. (Hons.) student at the West Bengal National University of Juridical Sciences (WBNUJS), Kolkata. In this piece, the author explores the overlooked physical consequences of digital rights, specifically the “Right to be Forgotten” under the DPDP framework. By drawing provocative parallels between information theory and the laws of thermodynamics, the author argues that the process of data erasure in large-scale AI models is not a “neutral” act, but one that carries a significant environmental and energy cost.
The analysis moves beyond traditional privacy discourse to examine whether the mandates for absolute data deletion are sustainable in an era of climate crisis. It concludes by proposing a more nuanced “energy-aware” approach to privacy—one that balances the necessity of data protection with the physical realities of the computational power required to execute it.]
Introduction
The point of conflict between digital privacy law and environmental protection law is creating an unprecedented, and largely overlooked, regulatory dilemma. The Right to Erasure (the Right to be Forgotten) under Section 12 of the Digital Personal Data Protection (DPDP) Act, 2023 clashes with the energy conservation mandates imposed by the Energy Conservation Act, 2001 and the Bureau of Energy Efficiency (BEE). With Artificial Intelligence (AI) systems, in particular Large Language Models (LLMs), no longer an experimental curiosity but an established part of economic infrastructure, administering data privacy rights is no longer a trivial administrative concern. It has become a thermodynamic process: computationally exhaustive and energy-consuming. When applied to the probabilistic weights of neural networks, the right to be forgotten requires machine unlearning or retraining the entire model. Such processes turn legal compliance into thermodynamic work, consuming gigawatt-hours of electricity and generating massive carbon emissions. This blog examines a fundamental Green Governance Gap by analysing the DPDP Rules, 2025 and Landauer’s principle of thermodynamics. It argues that, absent a redefinition of “technical feasibility” to encompass environmental feasibility, privacy legislation risks becoming a significant contributor to anthropogenic climate change.
The Legal Framework of Data Erasure
Under Section 12 of the DPDP Act, 2023, users have the right to correction and erasure of their personal data. The provision entitles the Data Principal (the user) to demand that the Data Fiduciary (the service provider) erase their personal information, and it imposes a binding duty on the Data Fiduciary to carry out that erasure. The DPDP Act’s exceptions are restrictive in scope compared to Article 17(3) of the General Data Protection Regulation (GDPR), which balances erasure against freedom of expression, public health, and scientific research; the DPDP Act mainly carves out statutory retention requirements.
The Act specifies no grounds, such as environmental impact or energy consumption, on which an erasure request may be denied. The Ministry of Electronics and Information Technology (MeitY) has announced the DPDP Rules, 2025, which flesh out the mechanics of this right. Rule 8 is especially consequential for the energy footprint of compliance. It requires a Data Fiduciary to delete personal data when the Data Principal withdraws consent, when the purpose for which the data was collected has been fulfilled, or when the Data Principal exercises the right under Section 12.
Additional Burden
Rule 8(2) of the DPDP Rules adds a notification condition: the Data Fiduciary must notify the Data Principal at least 48 hours before the planned erasure. This safeguard is meant to eliminate the risk of accidental data loss, but it adds computational and communication overhead to every deletion. Rule 8(3) then introduces a counter-force: mandatory retention. Data Fiduciaries are required to retain the associated data for one year after deletion of the primary data, in case it is needed for a cyber investigation or audit.
This retention creates “zombie data”: storage devices remain active and consume energy even after the primary data is deleted. The energy cost of storing petabytes of logs for 1.4 billion Indians cannot simply be dismissed, and it runs directly counter to the data minimisation principle, i.e., the deliberate practice of storing only data that is relevant and avoiding unnecessary retention. Minimisation matters beyond mere efficiency; it extends to climate and sustainability, and it is a foundational principle of sustainable data practice. Zombie data defeats this purpose: hoarding unnecessary data not only reduces efficiency by cluttering systems, but also carries a substantial environmental cost, since the retained data must eventually be destroyed in addition to the principal data.
Landauer’s Principle and the Physics of Erasure
The energy cost of computation is not merely an engineering limitation; it is a law of physics. Landauer’s Principle, formulated by Rolf Landauer in 1961, states that the minimum energy needed to erase one bit of information is E = kB·T·ln 2, where kB is the Boltzmann constant and T is the absolute temperature. At room temperature this is about 2.9 × 10⁻²¹ joules per bit, and real-world hardware dissipates roughly a billion times this theoretical minimum. When an AI model must forget information, it carries out billions of complex, irreversible operations: recomputing gradients, updating parameters, and effectively destroying information about the removed data points. Each parameter update entails billions of bit erasures, multiplying the thermodynamic price across trillions of operations. The principle is revealing: forgetting is a deeply entropic process. Erasing information reduces uncertainty, which demands energy input and dissipates heat. Across very large AI models (hundreds of billions of parameters), the cumulative effect of single-bit erasures adds up to power demands measured in megawatts.
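The Landauer floor can be computed directly from the formula above. As a rough sketch (the 175-billion-parameter model and 32-bit weight width below are illustrative assumptions, not figures from this piece):

```python
import math

# Landauer limit: minimum energy to erase one bit, E = k_B * T * ln 2
k_B = 1.380649e-23          # Boltzmann constant, J/K (exact SI value)
T = 300.0                   # approximately room temperature, K

E_bit = k_B * T * math.log(2)
print(f"Landauer limit per bit: {E_bit:.2e} J")   # ~2.87e-21 J

# Hypothetical scaling: a 175-billion-parameter model stored in
# 32-bit weights, each bit rewritten once during unlearning.
n_bits = 175e9 * 32
print(f"Theoretical floor for one full rewrite: {E_bit * n_bits:.2e} J")
```

Notably, the theoretical floor itself is minuscule; the megawatt-scale costs discussed in this blog arise because practical hardware operates orders of magnitude above the Landauer limit, and unlearning multiplies that overhead across trillions of operations.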
The Entropy of AI Models
Traditional databases store data deterministically, and deleting a row is a low-entropy operation. Generative AI models, by contrast, store data probabilistically. A given user’s personal information does not sit at a fixed address; it is diffused across the network’s weights during training. Deleting such data under Section 12 of the DPDP Act means ensuring the model can no longer recreate or output it. This entails reversing the entropy reduction achieved during learning. Training organises the model (reducing its internal entropy relative to the dataset); unlearning specific patterns disrupts that organisation. The perturbation demands substantial energy to compute new weight configurations that satisfy the forgotten state while preserving the model’s utility on other tasks. The tension between keeping the model intelligent (order) and eliminating particular data (disorder/erasure) creates a high-energy optimisation problem.
Quantifying the Carbon Cost
The scale of large language models reveals the astronomical price of digital forgetting. The transition from GPT-3 to GPT-4 reportedly saw a roughly 50-fold increase in energy consumption, signalling a massive surge in the power demand of large-scale AI workloads. India currently has a data centre capacity of 1.2 GW, which is expected to explode to 17 GW by 2030. These facilities already account for about 2 percent of India’s total electricity demand, a share projected to grow to 6 percent by 2030. The energy cost of forced retraining is a national issue, with AI workloads potentially consuming 35–50% of data centre power by 2030. The electricity cost of a single training run already reaches into the millions of dollars, with projections of frontier-scale runs carrying energy bills in the range of $1–3 billion.
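The growth projections cited above can be sanity-checked with simple arithmetic:

```python
# Back-of-envelope check on the projections cited above:
# capacity 1.2 GW -> 17 GW, demand share 2% -> 6% by 2030.
capacity_now_gw, capacity_2030_gw = 1.2, 17.0
share_now, share_2030 = 0.02, 0.06

capacity_growth = capacity_2030_gw / capacity_now_gw
share_growth = share_2030 / share_now

print(f"Implied capacity expansion: {capacity_growth:.1f}x")  # ~14.2x
print(f"Implied demand-share growth: {share_growth:.1f}x")    # ~3.0x
```

A fourteen-fold capacity expansion against a three-fold rise in demand share underscores how much of the new build-out is expected to serve computationally intensive AI workloads rather than general demand growth.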
The Energy Conservation Mandate
Statutory protection for the climate is provided by the Energy Conservation Act, 2001 (amended in 2022). The Bureau of Energy Efficiency (BEE) categorises energy-intensive industries as Designated Consumers (DCs). Such organisations must comply with strict Specific Energy Consumption (SEC) norms and undergo mandatory energy audits. Although the regime traditionally covered sectors such as steel and cement, the definition of DCs is being extended, and data centres are expected to face penalties if their energy efficiency, measured as Power Usage Effectiveness (PUE), does not improve. The Ministry of Power’s 2025 notification established the Renewable Consumption Obligation (RCO). This requirement replaces the earlier Renewable Purchase Obligation (RPO) and applies directly to Designated Consumers. RCO compliance stipulates that roughly 43% of incremental consumption must come from renewable sources. Data centres must therefore procure renewable energy at scale to offset the privacy-compliance load. If renewable supply remains constrained, they will either miss RCO targets or force the grid to burn additional coal to cover the baseload, pushing up the cost of green energy for other industries.
The Proposed Solution
Resolving the regulatory tension between privacy and climate requirements calls for a new jurisprudence of Green Privacy: reformulating the “technical feasibility” exemption under Section 12(3) of the DPDP Act, 2023 into a “thermodynamic feasibility” exemption. As drafted, the legislation risks equating erasure with the complete removal of every trace of data, which, in the case of Large Language Models (LLMs), requires full retraining: hundreds of tonnes of carbon dioxide and gigawatt-hours of costly energy to execute even simple compliance requests.
A sustainable solution requires a two-fold regulatory change. First, MeitY should notify standards certifying Approximate Machine Unlearning techniques (such as gradient ascent or SISA) as legally adequate to fulfil Section 12. Unlike full retraining, these methods enable models to forget individual data points with roughly 99 percent effectiveness while consuming negligible energy (typically only kilowatt-hours), effectively decoupling privacy rights from carbon emissions.
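As a rough illustration of the gradient-ascent idea mentioned above, the following sketch uses a toy one-parameter model; the loss function, learning rate, and step count are hypothetical choices for demonstration, and production-grade unlearning additionally constrains updates so that performance on retained data is preserved.

```python
def loss(w, x, y):
    """Squared error of a one-parameter linear model y_hat = w * x."""
    return (w * x - y) ** 2

def grad(w, x, y):
    """Gradient of the loss with respect to w."""
    return 2 * x * (w * x - y)

# A weight that has (almost) memorised the point to be forgotten:
# the pair (x=2, y=4) is fit exactly at w = 2.0.
w = 1.9
forget_x, forget_y = 2.0, 4.0
lr = 0.05

initial_loss = loss(w, forget_x, forget_y)

# Gradient ASCENT on the forget example: step *up* the loss surface,
# pushing the model away from the memorised point instead of
# retraining from scratch.
for _ in range(5):
    w += lr * grad(w, forget_x, forget_y)

# The loss on the forgotten point has risen: the model has "unlearned" it.
print(w, loss(w, forget_x, forget_y) > initial_loss)
```

The energy argument follows from the loop’s size: a handful of targeted updates over the forget set, rather than a full training run over the entire corpus, is what brings the cost down from gigawatt-hours to kilowatt-hours.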
Further, this legal standard should be aligned with the Energy Conservation Act, 2001. The Bureau of Energy Efficiency (BEE) should incorporate algorithmic efficiency into the Carbon Credit Trading Scheme (CCTS), and data centres, now categorised as Designated Consumers, should be encouraged to adopt these low-carbon unlearning protocols. India could help establish a Sustainable-by-Design ecosystem by making Energy Impact Assessments (EnIAs), alongside Data Protection Impact Assessments (DPIAs), a legal requirement for Significant Data Fiduciaries, so that the cost of digital forgetting is not passed on to the physical environment. This balanced solution ensures the Right to Erasure does not turn into a Right to Pollute.
Conclusion
Balancing climate protection with the right to be forgotten is essential. Landauer’s Principle reminds us that information is physical: every bit of information erased carries an energy cost measured in joules, megawatts, and ultimately carbon emissions. India stands at a critical crossroads. As an aspiring global AI hub with ambitious climate goals, it cannot afford to treat privacy and energy policy as two distinct fields. The DPDP Act and the Energy Conservation Act should be better coordinated, in recognition of the fact that digital rights have a carbon footprint. The answer lies not in weakening privacy safeguards, but in legal and technical innovation. Machine unlearning standards, renewable energy requirements, and batch processing can preserve both individual autonomy and planetary boundaries. The carbon cost of forgetting must be treated not as an afterthought, but as a central consideration in how we regulate the digital era.