Artificial intelligence is often celebrated as a tool of sustainability - optimizing energy grids, predicting climate risk, and enhancing agriculture. However, it's the systems powering AI that generate a substantial carbon footprint. Training a single large AI model can emit hundreds of tons of CO₂, roughly equivalent to the lifetime emissions of five passenger cars. This hidden cost arises because contemporary AI relies on massive data centers and 24/7 cloud servers running on power-hungry GPUs. A 2023 OECD report notes that data centers currently account for approximately 1-2% of global electricity consumption (Masanet et al. 2020) and 2-3% of greenhouse gas emissions, a proportion that increases with each new model introduced.

The question then arises: can intelligence be genuinely intelligent if it lacks sustainability?

The carbon footprint of AI is largely invisible to users. Every AI query and model inference is supported by a data center, comprising power-consuming servers, air conditioning units for chip cooling, and standby backup generators. Approximately 40% of a data center's energy consumption goes to cooling. Even routine uses of AI incur a measurable cost: one analysis in a 2023 Columbia Climate School report finds that a single ChatGPT query uses about 0.14 kilowatt-hours and consumes a few liters of water for cooling. In 2021, Patterson et al. showed how this effect is amplified by large language models: training GPT-3 (175 billion parameters) required approximately 1,287 MWh and released approximately 502 metric tons of CO₂. Once deployed, inference (answering queries) draws power continuously; Google estimates that 60% of AI's energy goes into inference rather than training. The International Energy Agency has warned that data centers worldwide may need 945 TWh of electricity by 2030 as demand keeps rising. In short, there is now a real carbon cost associated with each model we train and each question we pose.
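The figures above can be combined in a simple back-of-envelope calculation. The sketch below uses only the numbers cited in the text (Patterson et al.'s GPT-3 estimates and the ~0.14 kWh per-query figure); the functions and their names are illustrative, not from any cited source.

```python
# Back-of-envelope carbon arithmetic using the figures cited above.

def carbon_intensity_kg_per_kwh(total_co2_tons: float, energy_mwh: float) -> float:
    """Effective carbon intensity of a training run, in kg CO2 per kWh."""
    return (total_co2_tons * 1000) / (energy_mwh * 1000)

def query_emissions_grams(energy_kwh: float, grid_kg_per_kwh: float) -> float:
    """CO2 emitted by a single inference query, in grams."""
    return energy_kwh * grid_kg_per_kwh * 1000

# GPT-3 training (Patterson et al. 2021): ~1,287 MWh, ~502 t CO2.
intensity = carbon_intensity_kg_per_kwh(502, 1287)
print(f"Training carbon intensity: {intensity:.2f} kg CO2/kWh")  # ≈ 0.39

# A ~0.14 kWh query on a grid at that same intensity:
print(f"Per query: {query_emissions_grams(0.14, intensity):.0f} g CO2")
```

At roughly 55 g per query under these assumptions, the per-interaction cost is small, but it compounds across billions of daily queries.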

Policymakers and researchers are racing to make AI's footprint measurable. Traditional data center metrics such as Power Usage Effectiveness (PUE) are being complemented by new indices that take carbon into consideration. For example, Carbon Usage Effectiveness (CUE) ties energy use to CO₂ output, in kg CO₂ per kWh, providing a more holistic view of a facility's carbon intensity. Life-cycle assessments are also being applied to AI hardware and software, capturing the emissions from chip manufacturing through model deployment. Early initiatives such as Stanford's ML CO₂ tracker and Microsoft's Emissions Impact Dashboard let developers estimate the carbon footprint of model training.
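The two metrics named above have simple definitions: PUE divides total facility energy by the energy reaching the IT equipment, and CUE divides the facility's total CO₂ emissions by that same IT energy. A minimal sketch, with invented sample numbers (the 10 GWh facility and 0.7 kg/kWh grid are hypothetical):

```python
# Sketch of the facility metrics mentioned above (PUE and CUE).

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total energy / IT energy (ideal = 1.0)."""
    return total_facility_kwh / it_equipment_kwh

def cue(total_co2_kg: float, it_equipment_kwh: float) -> float:
    """Carbon Usage Effectiveness: kg CO2 emitted per kWh of IT energy."""
    return total_co2_kg / it_equipment_kwh

# Hypothetical facility: 10 GWh total draw, 7 GWh reaching IT gear,
# supplied by a grid emitting 0.7 kg CO2 per kWh.
total_kwh = 10_000_000
it_kwh = 7_000_000
grid_kg_per_kwh = 0.7

print(f"PUE = {pue(total_kwh, it_kwh):.2f}")                       # 1.43
print(f"CUE = {cue(total_kwh * grid_kg_per_kwh, it_kwh):.2f}")     # 1.00
```

Note that with a single grid supply, CUE is just PUE multiplied by the grid's carbon intensity, which is why CUE falls both when cooling improves and when the power mix gets cleaner.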

Still, examples show the order of magnitude of the challenge. Training a single large model has been estimated to emit around 284 metric tons of CO₂, and GPT-3's training alone produced 502 tons; for comparison, a medium-sized car emits roughly 5-6 tons annually. Globally, data centers consumed about 460 terawatt-hours in 2022, according to the IEA, and this amount is likely to rise to 1,000 TWh by 2026. In India, TERI expects domestic data center capacity to double from 1 to 2 GW by 2026 and reach 17 GW by 2030, roughly 8% of India's electricity, so this footprint is expanding quickly. Such growth could overwhelm the power grid and capsize climate targets if unchecked. A 2023 IFC estimate puts data centers' share of India's electricity at about 6% by 2030.

The good news is that technological and policy solutions are starting to appear. Industry leaders are pledging to use renewable energy: Google's data centers now run entirely on renewable energy, and Microsoft and other cloud providers plan to have 100% carbon-free data centers by 2030. By providing incentives for power purchase agreements and green procurement, governments could accelerate this transition. A similar strategy might require new AI hubs to lock in long-term renewable contracts or storage, much as Delhi's data center policy already requires a 12-hour battery backup, according to a MeitY 2020 report.

Innovations in software and hardware also help. Waste heat is being reduced by advanced cooling techniques such as liquid immersion and AI-driven airflow; Masanet et al. reported in 2020 that some new Indian facilities achieve PUEs as low as 1.4. Energy consumption can be further reduced by waste heat recovery and free cooling, which uses seawater or natural air. On the computing side, edge AI (inference executing on local devices) can reduce the need to transmit data to distant clouds. Looking ahead, emerging solutions such as grid-scale storage and green hydrogen could provide reliable 24×7 clean power for AI clusters (TERI 2022), just as they are being tested in Europe's data hubs.
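The effect of a lower PUE on total energy is easy to quantify: for a fixed IT load, total facility energy scales linearly with PUE. In the rough illustration below, only the PUE of 1.4 comes from the text; the 1.8 baseline and the 50 GWh/year IT load are hypothetical.

```python
# Rough illustration of how cooling efficiency (a lower PUE) cuts total
# facility energy for the same IT load.

def facility_energy_mwh(it_load_mwh: float, pue: float) -> float:
    """Total facility energy implied by a given IT load and PUE."""
    return it_load_mwh * pue

it_load = 50_000  # MWh/year of IT load (hypothetical mid-size campus)
legacy = facility_energy_mwh(it_load, 1.8)  # assumed older air-cooled design
modern = facility_energy_mwh(it_load, 1.4)  # PUE cited for newer facilities

saved = legacy - modern
print(f"Energy saved: {saved:,.0f} MWh/year "
      f"({saved / legacy:.0%} of the legacy total)")
```

Under these assumptions the move from 1.8 to 1.4 saves 20,000 MWh a year, over a fifth of the legacy facility's consumption, without touching the IT workload itself.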

In the Indian context, data and power policy are inextricably linked. Experts are therefore urging that India's infrastructure push, including the IndiaAI Mission, explicitly link compute expansion to clean energy. A TERI-industry coalition has already estimated that greening data centers can cut as much as 88% of their emissions. There are further opportunities in Digital Public Infrastructure: for instance, data can be hosted on government cloud servers powered by renewables. India's states have set strong incentives for setting up data centers, which should be matched by clear green standards, for example mandating on-site solar or net-zero roadmaps for new AI campuses.

If AI is to be sustainable, policy must catch up. At the international level, there are views that AI's energy use should enter climate commitments and monitoring. The OECD notes that global stocktakes, for example under the Paris Agreement, currently ignore compute emissions, and that integrating data centers into national inventories would help bring their impact into view. Common accounting is also required: regulators could set benchmarks, for instance CUE or water-use limits, for large AI facilities. Indeed, countries are already acting: the imminent AI Act in the EU will force large AI systems to report energy and resource usage, and in the US new proposals would require AI data centers to publish annual carbon and water metrics.

India's policymakers can set a good example. Recommendations include widening the ambit of environmental clearances to include data farms: a 2023 proposal from Rao, for instance, would categorize greenfield AI centres of more than 5 MW as "Category A" projects, entailing a complete EIA with disclosed PUE, WUE, and CUE. Under the Energy Conservation Act, large data centers could be declared "designated consumers" (like heavy industries), subject to mandatory audits and efficiency standards. Public sector adoption can compel market change towards "green AI": for instance, just as the UK digital strategy already mandates 100% renewable energy for official IT, government AI procurements can incorporate certified net-zero or carbon-labeled computing services. Similar to how businesses report Scope 1/2 emissions, India should bring AI infrastructure under BRSR/ESG regulations.

Global cooperation is also paramount. In forums such as COP, the G20, and BRICS, nations should discuss a climate-aligned data economy, for instance channelling climate finance to build renewable-powered compute in the Global South. India can champion programs that enable developing countries to leapfrog to energy-efficient AI: shared GPU pools, open-source model libraries, and training on clean grids. Much of the R&D will require collaboration; partnerships between academia, industry, and government can speed up "green software" practices and innovation in cooling or power. Initiatives like PAT (Perform, Achieve, Trade) have proven the market for energy savings certificates, and a similar "Green AI" certification could reward data center operators for improvements. Complementary tools such as life-cycle assessment and carbon labelling could help keep pace with sustainability goals.

AI's potential is greatest when it enhances, rather than endangers, the planet. As one expert summed up, we have to reach a phase "where we're aware of the energy usage" of our models and factor it into decisions. In other words, true intelligence will be judged by its sustainability. "Green AI", the creation of models and infrastructure with low carbon emissions, is an ethical and ecological requirement rather than an optional enhancement. India and the world can ensure that the next wave of innovation is climate-positive by integrating strict metrics, clean energy, and circularity into AI. Success will be measured by how lightly technology treads on the earth, rather than by speed or accuracy.