
AI's Dirty Secret: The Hidden Cost of its Environmental Impact

by Vik Bogdanov, January 17th, 2024

Too Long; Didn't Read

As AI's carbon footprint expands, the environmental impact becomes an increasingly critical issue. Check out 5 ways to build a sustainable AI-powered future.


As AI's carbon footprint expands, the environmental impact becomes an increasingly critical issue. The challenge lies in harnessing AI's transformative power while ensuring its development and deployment do not exacerbate global ecological problems.


Artificial Intelligence (AI) has gained significant traction over the past few years, especially with advanced language models like ChatGPT. However, as AI becomes more embedded in our daily lives and business operations, its environmental impact, particularly concerning the carbon footprint and electronic waste, is increasingly scrutinized.

The Energy Cost of AI

A critical concern with AI is its energy consumption. Did you know that training an AI model to recognize a car can involve processing millions of images, requiring significant computational power? Data centers, which are crucial for such processes, contribute 2-4% of global CO2 emissions, a figure comparable to the aviation industry.


In 2019, the University of Massachusetts Amherst found that training a single AI model could emit over 626,000 pounds of CO2, equivalent to the lifetime emissions of five cars. This stark comparison underlines the substantial environmental impact of AI, which extends beyond the training phase to ongoing operations.



A recent study found that training a large neural network with 175 billion parameters consumed 1,287 MWh of electricity, resulting in carbon emissions of 502 metric tons, equivalent to driving 112 gasoline-powered cars for a year.
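
These figures are roughly consistent with each other. As a back-of-envelope check, assuming an average grid carbon intensity of about 0.39 kg CO2 per kWh and roughly 4.6 metric tons of CO2 per gasoline car per year (both assumptions of mine, not values from the study):

```python
# Back-of-envelope check of the training-emissions figures quoted above.
# Both constants are rough, assumed averages, not values from the cited study.
GRID_INTENSITY_KG_PER_KWH = 0.39   # assumed average grid carbon intensity
CAR_TONS_CO2_PER_YEAR = 4.6        # assumed annual emissions of one gasoline car

training_energy_kwh = 1287 * 1000  # 1,287 MWh

emissions_tons = training_energy_kwh * GRID_INTENSITY_KG_PER_KWH / 1000
car_years = emissions_tons / CAR_TONS_CO2_PER_YEAR

print(f"Estimated emissions: {emissions_tons:.0f} t CO2")      # ~502 t
print(f"Equivalent cars driven for a year: {car_years:.0f}")   # ~109, close to the 112 quoted
```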


In the United States, data centers where AI models are trained are already major consumers of electricity, representing approximately 2% of the nation's total usage. These centers demand significantly more energy than standard office spaces, requiring 10 to 50 times more power per unit of floor area. Another study highlights the resource needs of AI models like ChatGPT, likening its water consumption to "drinking" a 500ml bottle for every 20-50 interactions it handles, with its successor, GPT-4, demonstrating an even higher demand.


Generative AI models, notable for creating realistic images and texts, are particularly energy-intensive. These models are larger and more complex, requiring extensive knowledge bases. For instance, generating 1,000 images with a powerful AI model, like Stable Diffusion XL, emits as much CO2 as driving an average car for 4.1 miles.
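
In per-image terms, assuming an average gasoline car emits roughly 400 g of CO2 per mile (an assumption on my part, not a figure from the study), that works out to only a couple of grams per image, which adds up quickly at the scale these models operate:

```python
# Rough per-image CO2 estimate derived from the "1,000 images ≈ 4.1 miles" comparison.
GRAMS_CO2_PER_MILE = 400   # assumed average gasoline-car emissions per mile

images = 1000
equivalent_miles = 4.1

total_grams = equivalent_miles * GRAMS_CO2_PER_MILE
per_image_grams = total_grams / images

print(f"~{total_grams:.0f} g CO2 for {images} images")       # ~1,640 g
print(f"~{per_image_grams:.2f} g CO2 per generated image")   # ~1.64 g
```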


The cost of generating content varies with the user's prompt, making it difficult to predict these models' running and scaling costs. The necessity for high computing power to maintain service availability drives up infrastructure costs.

On top of that, the environmental impact of electronic waste (e-waste) from AI technology is a significant concern. This waste includes harmful chemicals like mercury, lead, and cadmium, which can leach into the soil and water, posing risks to human health and the ecosystem.


According to World Economic Forum (WEF) predictions, e-waste will exceed 120 million metric tonnes by 2050.


Managing e-waste responsibly and recycling it is crucial to prevent environmental damage and limit the release of toxic substances. Stricter regulations and ethical disposal methods are necessary to handle and recycle e-waste associated with AI safely, thereby mitigating its adverse environmental impacts.

Financial and Environmental Implications of Generative AI

The financial and environmental costs of generative AI are significant. Conservative estimates place the cost of running ChatGPT at around $100,000 per day, or approximately $3 million per month. With increasing usage, these costs could soar to $40 million per month.
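
A quick back-of-envelope conversion of those figures (the daily query volume below is purely hypothetical and only used to illustrate what a per-query cost might look like):

```python
# Back-of-envelope operating-cost conversion based on the ~$100,000/day estimate above.
daily_cost_usd = 100_000
monthly_cost_usd = daily_cost_usd * 30    # ~$3 million per month, matching the estimate
annual_cost_usd = daily_cost_usd * 365

# Hypothetical daily query volume, chosen purely for illustration.
assumed_queries_per_day = 10_000_000
cost_per_query_usd = daily_cost_usd / assumed_queries_per_day

print(f"Monthly: ${monthly_cost_usd:,}, annual: ${annual_cost_usd:,}")
print(f"Cost per query under the assumed volume: ${cost_per_query_usd:.3f}")
```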


The ICT sector, which includes AI infrastructure, accounts for about 2% of global CO2 emissions. As generative AI models grow, their carbon emissions are expected to increase proportionately.

Mitigating AI's Carbon Footprint

The challenge of mitigating AI's carbon footprint and associated costs is multifaceted, requiring a combination of technological innovation, efficient practices, and strategic planning. According to Gartner, here are five ways to develop more sustainable AI:


  1. Make AI as Efficient as the Human Brain: Adopting composite AI, which uses network structures similar to the human brain, can increase efficiency. This approach uses knowledge graphs, causal networks, and symbolic representations to effectively solve a broader range of business problems.
  2. Put Your AI on a Health Regimen: Monitoring energy consumption during machine learning is crucial. Training should be stopped when improvements plateau and no longer justify the energy costs (a minimal sketch of this idea follows the list). Other practices include keeping data for model training local while sharing improvements centrally, reusing already trained models, and using more energy-efficient hardware and networking equipment.
  3. Run AI in the Right Place and at the Right Time: Managing the timing and location of AI workloads can significantly impact carbon emissions. This involves considering the carbon intensity of local energy supplies, balancing data center workloads for optimal energy production and water efficiency, and using energy-aware job scheduling with carbon tracking and forecasting services (see the second sketch after the list).
  4. Buy New Clean Power Where You Plan to Consume It: Organizations should consider power purchase agreements (PPAs) or sourcing renewable energy certificates (RECs) that add new renewable energy to the grid in the areas where they consume electricity. Preparing for future protocols and building a detailed plan for clean power by location and time of day can also help develop a sustainable strategy.
  5. Make Environmental Impact a Key Factor in AI Use Cases: Integrating environmental impacts into the AI strategy is essential. This means moving forward with use cases that create more value than they destroy, improving the energy efficiency of existing AI initiatives, and avoiding investments in AI use cases that could harm business value or the environment.
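
To make point 2 concrete, here is a minimal sketch of energy-aware early stopping: training halts once the validation improvement bought by each additional kilowatt-hour falls below a threshold. The callables and the threshold are placeholders, not a specific framework or vendor API.

```python
# Sketch: stop training when accuracy gains no longer justify the energy spent.
# `read_energy_kwh` stands in for whatever metering is available
# (e.g., a rack-level meter or a GPU power counter integrated over time).
def energy_aware_training(model, train_one_epoch, validate, read_energy_kwh,
                          min_gain_per_kwh=1e-4, max_epochs=100):
    best_score = validate(model)
    for epoch in range(max_epochs):
        energy_before = read_energy_kwh()
        train_one_epoch(model)
        score = validate(model)
        epoch_energy_kwh = read_energy_kwh() - energy_before
        gain_per_kwh = (score - best_score) / max(epoch_energy_kwh, 1e-9)
        if gain_per_kwh < min_gain_per_kwh:
            # Each additional kWh no longer buys a meaningful improvement.
            print(f"Stopping at epoch {epoch}: {gain_per_kwh:.2e} score/kWh")
            break
        best_score = max(best_score, score)
    return model
```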
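
And a similarly hedged sketch for point 3: given an hourly carbon-intensity forecast for the local grid (normally obtained from a carbon tracking service; the numbers below are invented), a deferrable training job can simply be scheduled into the cleanest window.

```python
# Sketch: choose the lowest-carbon start time for a deferrable AI workload.
from datetime import datetime, timedelta

def best_start_slot(forecast_g_per_kwh, job_hours):
    """Return (start index, total intensity) of the cleanest contiguous window."""
    best_start, best_total = 0, float("inf")
    for start in range(len(forecast_g_per_kwh) - job_hours + 1):
        total = sum(forecast_g_per_kwh[start:start + job_hours])
        if total < best_total:
            best_start, best_total = start, total
    return best_start, best_total

# Invented hourly forecast for the next 12 hours, in gCO2/kWh.
forecast = [430, 410, 380, 250, 190, 180, 220, 340, 420, 450, 460, 440]
start_slot, total = best_start_slot(forecast, job_hours=3)
start_time = datetime.now() + timedelta(hours=start_slot)
print(f"Run the 3-hour job starting around {start_time:%H:%M} "
      f"(avg {total / 3:.0f} gCO2/kWh vs {sum(forecast) / len(forecast):.0f} if run at random)")
```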


By incorporating these strategies, the AI industry can significantly reduce its environmental impact while maintaining its growth and innovation potential. As AI continues to evolve, striking a balance between technological advancement and environmental responsibility will be critical to sustainable development. And corporations should set an example here.



Big Tech Takes Action

Google emphasizes the need for a collaborative approach involving policymakers, urban planners, business leaders, and individuals to unlock AI's full potential. Policymakers are particularly important, as they can facilitate AI's role in climate action by promoting data sharing, making technology accessible, and supporting initiatives for technology and climate-related skills development in businesses.


Google's strategies to diminish AI's carbon footprint include efficient practices that can reduce the energy needed to train AI models by up to 100 times and lower associated emissions by as much as 1,000 times. The company points out that its data centers are over 1.5 times more energy-efficient than typical enterprise data centers, with an average annual power usage effectiveness (PUE) of 1.10, compared to the industry average of 1.55.
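
For context, PUE is total facility energy divided by the energy delivered to IT equipment, so a lower value means less overhead spent on cooling and power distribution. A quick comparison for a hypothetical 1 MW IT load (the load figure is mine, used only to illustrate the difference):

```python
# PUE = total facility energy / IT equipment energy.
# Overhead comparison for a hypothetical 1 MW IT load.
it_load_kw = 1000

for pue in (1.10, 1.55):
    total_kw = it_load_kw * pue
    overhead_kw = total_kw - it_load_kw
    print(f"PUE {pue:.2f}: {total_kw:.0f} kW total, {overhead_kw:.0f} kW overhead "
          f"({overhead_kw / it_load_kw:.0%} of IT load)")
```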


The tech giant also mentions its climate-aware approach to cooling data centers and its commitment to responsible water use. In its ongoing efforts to apply AI for environmental benefits, Google is experimenting with a project in Greater Manchester that employs AI to decrease stop-and-go traffic.


While the future energy requirements of AI remain uncertain, one aspect is evident: Microsoft's leadership in generative AI in 2023 has propelled the entire tech industry forward, necessitating increased energy consumption in one form or another.


A representative from Microsoft expressed the company's ongoing commitment to achieving a future where zero-carbon sources entirely fuel the world's power grids.

In a recent interview with KUOW, Microsoft's Chief Sustainability Officer Melanie Nakagawa discussed how the company's AI advancements align with its decarbonization objectives. Despite AI's short-term energy demands, Nakagawa emphasized its potential to uncover innovative methods for reducing Microsoft's carbon footprint, including using AI to enhance renewable energy access.

Microsoft is exploring the use of AI to simplify the regulatory hurdles associated with launching new nuclear power plants in the U.S., as The Wall Street Journal reported. This initiative forms part of Microsoft's broader strategy to incorporate nuclear power and nuclear fusion into its sustainability and AI development plans. The company has partnered with Helion, a fusion startup based in Washington, and has committed to purchasing fusion power from Helion by 2028, marking a potentially groundbreaking agreement in the energy sector.


In parallel, Microsoft is focusing on making AI training more efficient, for example, by utilizing textbooks rather than extensive internet text databases.


Wrapping up, as the industry moves forward, it is clear that the energy demands of AI, particularly in the realm of generative AI, will shape the future of technological development.

The responsibility lies not only with the tech giants but also with policymakers, business leaders, and individuals to ensure that this progress does not come at the cost of our planet's health. 2023 marked a pivotal moment in this journey, pushing the tech industry towards a more energy-conscious future and highlighting the imperative for all stakeholders to actively create a sustainable path for AI innovation.


And what’s your take on this?


Don’t forget to check out my previous article on EV revolution’s hidden challenge.