A reckoning is coming, thanks to some pretty amazing advances in technology over the past several years. Artificial intelligence (AI) isn’t new, but it has evolved remarkably, and as its abilities have grown, so have its demands for power and resources. Those demands have already begun to strain the environment, but groups are working on innovative solutions that reconcile AI’s resource needs with environmental limits while keeping it running smoothly.
Before we can discuss AI’s insatiable appetite for electricity and water, let’s look at the different types of AI, as not all AI is created equal, nor does all AI perform the same functions.
Two types of AI
Traditional (discriminative) AI
Traditional AI models focus primarily on discriminating between categories or predicting specific outcomes based on input data. They learn to map inputs to outputs by identifying patterns and relationships within existing datasets. Discriminative AI’s goal? Answering questions about the data, like “What is this?” or “Is this likely?”
Training for traditional AI is generally less computationally intensive than training complex generative models, especially for well-defined, narrower tasks (a brief code sketch follows the list below). Use cases include:
- Image classification: Does a picture contain a dog, cat, or car?
- Spam detection: Classifying emails as spam or not spam
- Fraud detection: Flagging suspicious transactions
- Recommendation systems: Algorithms that suggest movies or products based on past user behavior (like Netflix recommendations)
- Predictive analytics: Forecasting sales, stock prices, customer churn
- Medical diagnosis: Helping doctors identify diseases from medical images or patient data
- Natural language processing (NLP) for classification: Sentiment analysis (determining if a text is positive, negative, or neutral), named entity recognition (identifying names of people, places, organizations in text)
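To make the idea of mapping inputs to outputs concrete, here is a minimal, purely illustrative sketch of a discriminative model: a tiny spam classifier built with scikit-learn. The example messages and labels are invented for illustration; a real system would train on thousands of labeled emails.

```python
# Minimal discriminative AI sketch: classify messages as spam or not spam.
# The training examples below are invented purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

messages = [
    "Win a free prize now",                   # spam
    "Limited offer, claim your reward",       # spam
    "Meeting moved to 3pm tomorrow",          # not spam
    "Can you review the quarterly report?",   # not spam
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = not spam

# Learn a mapping from input text to an output label (discrimination).
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(messages, labels)

# The trained model answers "is this likely spam?" for new inputs.
print(model.predict(["Claim your free reward today"]))   # likely [1]
print(model.predict(["Lunch with the design team?"]))    # likely [0]
```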
Generative AI
Unlike traditional AI, generative AI (or GenAI) focuses on generating new data instances that resemble the data on which it was trained. It learns the underlying patterns and structure of the input data to create novel, original outputs. Its goal? To answer “how?” or “what else?” questions by creating something new.
GenAI typically requires very large, diverse, and high-quality datasets for effective training; large language models (LLMs) have billions of parameters. GenAI is often referred to as a “black box” because its deep neural network architectures are so complex that it’s hard to fully understand how it generates specific outputs. Its use cases (illustrated by a brief sketch after this list) include:
- Content creation: Generating articles, emails, marketing copy, music, poetry, and scripts
- Image and video generation: Creating realistic or stylized images from text prompts (text-to-image), generating video clips, or modifying existing media
- Code generation: Writing, completing, or debugging software code
- Synthetic data generation: Creating artificial datasets for training other AI models, especially when real data is scarce or sensitive
- Drug discovery: Designing novel molecular structures with desired properties
- Product design: Creating new design concepts for products or architectures
- Chatbots and virtual assistants: Developing more human-like, conversational AI experiences that can generate nuanced responses
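By contrast, here is a minimal generative sketch using the open-source Hugging Face transformers library. It loads GPT-2, a small, freely available model chosen only because it is compact (not because any product mentioned in this article uses it), and produces new text from a prompt instead of assigning a label.

```python
# Minimal generative AI sketch: produce new text instead of classifying it.
# GPT-2 is used only as a small, freely available stand-in for larger LLMs.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# Instead of answering "what is this?", the model creates something new.
result = generator(
    "A short product description for an energy-efficient data center:",
    max_new_tokens=40,
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```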
Consider this analogy: Imagine a student who studies thousands of pictures of cats and dogs so that, for any new picture, they can tell you whether it shows a cat or a dog. The student learns the difference between the two categories. That’s traditional (discriminative) AI.
Now, imagine an artist who studies thousands of paintings. The goal isn’t to identify whether a painting is a landscape or portrait, but to create an entirely new, original painting in a similar style to what the artist has learned. That’s GenAI.
While distinct, traditional and GenAI are often used in conjunction with each other. A GenAI model might create personalized marketing content, and a traditional AI model might then analyze the effectiveness of that content to optimize future campaigns. A traditional AI model could detect and classify sign language from video, and GenAI could then add context and nuance to provide a more optimized translation or even generate voice output.
The problem with GenAI
The challenge lies in balancing the energy needs of GenAI with the very real benefits it offers. The rapid expansion and use of this technology have resulted in a substantial environmental footprint. GenAI requires immense computational power, resulting in considerable energy consumption and substantial water usage for cooling data centers.
GenAI models, especially LLMs like OpenAI’s GPT-4, have billions of parameters. Training these models requires enormous amounts of electricity, which translates into increased CO2 emissions, particularly when fossil fuels provide that energy. Even after training is complete, deploying and fine-tuning these models for real-world applications consumes substantial energy, as millions of people use them daily.
Electricity requirements
Data centers provide the backbone of GenAI. These temperature-controlled facilities house massive arrays of servers, storage drives, and networking equipment. They require a tremendous amount of power to operate. And the need is rising. In North America, the electricity needs of these facilities surged from an estimated 2,688 megawatts at the close of 2022 to 5,341 megawatts by the end of 2023. Globally, data centers’ electricity use reached 460 terawatt-hours in 2022.
According to the Organization for Economic Co-operation and Development, this level of consumption positioned data centers as the world’s 11th-largest electricity consumer that year, between Saudi Arabia (371 terawatt-hours) and France (463 terawatt-hours). Forecasts suggest that global electricity consumption by data centers may reach 1,050 terawatt-hours by 2026. If that holds, data centers would become the fifth-largest electricity consumer worldwide, surpassing Japan and falling just behind Russia.
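A quick back-of-envelope check, using only the figures cited above, shows how steep that trajectory is: the North American numbers imply demand nearly doubled in a single year, and growing from 460 to roughly 1,050 terawatt-hours between 2022 and 2026 works out to about 23% compound growth per year.

```python
# Back-of-envelope growth check using the figures cited above.
na_2022_mw, na_2023_mw = 2_688, 5_341          # North American demand (MW)
global_2022_twh, global_2026_twh = 460, 1_050  # global use (TWh); 2026 is a forecast

na_growth = na_2023_mw / na_2022_mw - 1
cagr = (global_2026_twh / global_2022_twh) ** (1 / 4) - 1  # 4 years: 2022 -> 2026

print(f"North America, 2022 -> 2023: +{na_growth:.0%}")      # roughly +99%
print(f"Implied global annual growth to 2026: {cagr:.0%}")   # roughly 23% per year
```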
Water requirements
Data centers, especially those running high-performance AI hardware, generate immense heat. To prevent overheating and maintain optimal performance, these facilities rely on cooling systems that often use chilled water. Every kilowatt-hour of energy consumed by a data center can require nearly two liters of water for cooling.
The scale of this water consumption? Huge. According to the World Economic Forum, “just 1 megawatt data center can use up to 25.5 million liters of water annually for cooling, equivalent to the daily water consumption of about 300,000 people.” By 2027, global AI demand could lead to water withdrawal of 1.1 to 1.7 trillion gallons — that’s 4-6 times Denmark’s total annual water withdrawal.
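Those figures line up with a simple back-of-envelope estimate. Assuming, purely for illustration, that a 1-megawatt facility runs at full load around the clock, the roughly two liters per kilowatt-hour cited above already implies more than 17 million liters a year, the same order of magnitude as the World Economic Forum’s estimate.

```python
# Back-of-envelope water estimate for a hypothetical 1 MW data center
# running at full load around the clock (an illustrative assumption).
facility_mw = 1
hours_per_year = 24 * 365   # 8,760 hours
liters_per_kwh = 2          # ~2 L of cooling water per kWh, as cited above

annual_kwh = facility_mw * 1_000 * hours_per_year
annual_liters = annual_kwh * liters_per_kwh

print(f"{annual_kwh:,} kWh/year -> ~{annual_liters / 1e6:.1f} million liters of water")
# Roughly 17.5 million liters, the same order of magnitude as the WEF figure above.
```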
Tech companies have acknowledged the effect: Microsoft’s water use increased 34% and Google’s rose 20% in 2022, largely thanks to AI-driven expansion.
A drive for innovative solutions
What, if anything, can be done about GenAI’s thirst for electricity and water? In 2024, there were 11,800 data centers globally. As of March 2025, the U.S. led with the most data centers (5,426); the next-closest country, Germany, had 529. Data center demand keeps growing: by 2030, North American hyperscalers are projected to account for 60% of global data center capacity.
Globally, researchers and companies are developing strategies to enhance data center sustainability. Those solutions include:
- Limiting how much power is available and choosing more efficient hardware
- Training smaller, less computationally expensive AI models, using domain-specific models customized to particular fields, and distributing AI computations across multiple time zones to align workloads with peak renewable energy availability
- Powering data centers with renewable energy (solar, wind) and shifting workloads to areas with more green grids
- Designing software able to recognize and adjust for carbon emissions and impact throughout the day (see the scheduling sketch after this list)
- Requiring the tech companies building these massive new data centers to pay for powering them
- Optimizing cooling systems by implementing more water-efficient cooling technology, including liquid cooling, and maximizing water reuse
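To illustrate the carbon-aware ideas in the list above, here is a minimal sketch of a scheduler that picks the region and hour with the lowest forecast grid carbon intensity before launching a flexible AI workload. The regions, intensity values, and the run_training_job call mentioned in the final comment are hypothetical placeholders; a real deployment would pull live data from a grid-carbon API and hand the job to an actual scheduler.

```python
# Minimal carbon-aware scheduling sketch (all data and names are hypothetical).
# Idea: run flexible AI workloads where and when the grid is cleanest.

# Hypothetical forecast of grid carbon intensity (gCO2 per kWh) by region and hour.
forecast = {
    "us-west":  {2: 210, 10: 150, 18: 300},
    "eu-north": {2: 60,  10: 45,  18: 90},
    "ap-south": {2: 480, 10: 410, 18: 520},
}

def pick_greenest_slot(forecast):
    """Return the (region, hour, intensity) with the lowest forecast carbon intensity."""
    return min(
        ((region, hour, intensity)
         for region, hours in forecast.items()
         for hour, intensity in hours.items()),
        key=lambda item: item[2],
    )

region, hour, intensity = pick_greenest_slot(forecast)
print(f"Schedule training in {region} at hour {hour} (~{intensity} gCO2/kWh)")
# A real scheduler would then submit the job, e.g. run_training_job(region, start_hour=hour)
```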
Some tech companies, including Microsoft, Nautilus Data Technologies, and Subsea Cloud, have launched underwater and floating data centers that rely on the ocean to handle the cooling. Norway’s Green Mountain data centers rely on the country’s cold fjord water to keep them cool, while the Lefdal Mine Datacenter operates underground in a former mine.
Given GenAI’s proliferation across industries (and in our own personal use), developing and deploying it sustainably has become even more critical. We can’t ignore its massive appetite for electricity and water or the e-waste it creates. Researchers, industry, and policymakers are working to ensure that its development doesn’t harm the planet or those dependent on its resources. Focusing on sustainability now helps ensure GenAI remains a powerful force for good, a “partner” in solving global problems rather than a cause of them.
Are you a commercial real estate investor or looking for a specific property to meet your company’s needs? We invite you to speak with the professionals at CREA United, an organization comprising CRE professionals from 92 firms representing all disciplines within the CRE industry, including brokers, subcontractors, financial services, security systems, interior designers, architects, movers, IT specialists, and more.