Data centers are facing a power crisis. AI and other compute-intensive workloads can demand roughly ten times the energy of traditional workloads, straining existing data centers and even halting new data center projects.
A recent report from the U.S. Department of Energy (DOE) estimates that data center load has tripled over the past decade and is projected to double or triple again by 2028. That is a dramatic upward trajectory of energy demand, and one that the U.S. public grid is not equipped to accommodate.
Much of the U.S. power grid was built in the 1960s and ‘70s, and the infrastructure is showing its age. Some states are already struggling to keep up with energy demand, and multiple challenges—from an increasing number of severe weather events to cyberattacks and the integration of renewable energy—are putting pressure on the existing grid.
In this article, we’ll examine the challenges surrounding data center electricity consumption and how to overcome them.
As compute-heavy technologies such as AI proliferate, the demand for data centers to support their operations is escalating. A recent McKinsey report highlights that the U.S. needs more than 50 gigawatts (GW) of data center capacity by 2030, an infrastructure expansion that would cost more than $500 billion.
As new data centers pop up and operators replace existing infrastructure with more power-intensive processing units, the demand for energy in this sector is expanding rapidly. In 2022, global data centers consumed approximately 460 terawatt-hours (TWh) of electricity, accounting for about 2% of worldwide electricity usage. Projections indicate that this consumption could more than double to over 1,000 TWh by 2026, underscoring the pressing need for sustainable energy solutions.
That spike in demand would be roughly equivalent to adding the power consumption of an entire European country, such as Germany. In Ireland, data centers could account for 32% of all power consumption by 2026, given the high number of new builds planned there.
A dramatic uptick in demand for power to support data centers is clearly emerging around the world—and posing several challenges for power providers as they try to meet those needs. Here are a few common issues data centers face as they expand operations while the public grid struggles to build out the needed power infrastructure.
Throttling is the intentional reduction of power consumption or computing performance to prevent system overload, overheating, or excessive energy use. It can occur at various levels, including:
Power Throttling
Data centers may reduce power consumption during peak demand periods due to infrastructure limitations, high energy costs, or demand response programs. Because it keeps components from exceeding their limits, power throttling helps data centers maintain system stability.
CPU/GPU Throttling
When cooling systems cannot keep up, servers automatically lower processing speeds to avoid overheating. This safety mechanism helps prevent damage to CPUs and GPUs, but it may also interrupt or delay data processing and affect performance.
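The safety mechanism described above can be sketched as a simple control loop. This is an illustration only, not vendor firmware; the temperature threshold and frequency steps are hypothetical values chosen for the example.

```python
# Minimal sketch of a thermal throttling loop (illustrative only).
# Real CPUs/GPUs implement this in firmware; values here are hypothetical.

MAX_TEMP_C = 85.0                           # throttle above this temperature
FREQ_STEPS_MHZ = [3000, 2500, 2000, 1500]   # available clock speeds

def next_frequency(current_mhz: int, temp_c: float) -> int:
    """Step the clock down when hot, back up once the chip has cooled."""
    i = FREQ_STEPS_MHZ.index(current_mhz)
    if temp_c > MAX_TEMP_C and i < len(FREQ_STEPS_MHZ) - 1:
        return FREQ_STEPS_MHZ[i + 1]    # overheating: throttle down one step
    if temp_c < MAX_TEMP_C - 10 and i > 0:
        return FREQ_STEPS_MHZ[i - 1]    # comfortable margin: recover one step
    return current_mhz                  # hold steady

# Example: a hot reading forces a downclock; a cool one restores full speed.
freq = 3000
freq = next_frequency(freq, 92.0)   # -> 2500 (overheating, throttle)
freq = next_frequency(freq, 70.0)   # -> 3000 (cooled off, recover)
```

The performance cost mentioned above falls directly out of this loop: every downclock step slows processing until temperatures recover, which is why better cooling reduces throttling.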
Network Throttling
Network throttling involves intentionally reducing the bandwidth to regulate traffic and prevent congestion. This can happen at the local network level or by an internet service provider (ISP).
While throttling is often a necessary precaution, it can cause several issues for data center operations: slower processing, interrupted or delayed workloads, and degraded performance for end users. An adequate, regulated power supply greatly reduces the need for throttling.
So-called bad harmonics occur when non-linear electrical loads distort the standard sinusoidal waveform of alternating current (AC). Excessive energy demand from data centers can lead to bad harmonics. The distorted wave patterns can then flow into surrounding homes and businesses in the community, causing issues such as overheated transformers and motors, nuisance tripping of circuit breakers, and malfunctions in sensitive electronic equipment.
The best ways for data centers to avoid creating bad harmonics include load balancing, analyzing power quality regularly, using harmonic-reducing equipment, and incorporating passive and active harmonic filters.
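The power quality analysis mentioned above usually boils down to measuring total harmonic distortion (THD): the combined RMS magnitude of the harmonic components relative to the fundamental. A minimal sketch with made-up measurement values:

```python
import math

def thd(fundamental_rms: float, harmonic_rms: list[float]) -> float:
    """Total harmonic distortion: sqrt(V2^2 + V3^2 + ...) / V1."""
    return math.sqrt(sum(v * v for v in harmonic_rms)) / fundamental_rms

# Hypothetical measurement: 230 V fundamental with 3rd, 5th, and 7th
# harmonic components (values invented for illustration).
ratio = thd(230.0, [18.4, 11.5, 6.9])
print(f"THD = {ratio:.1%}")   # roughly 10%, above the ~5-8% voltage THD
                              # limits commonly applied under IEEE 519
```

A reading like this is what would trigger the load balancing and harmonic filtering steps listed above.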
The future of data centers remains uncertain, even for large companies like Microsoft and Amazon, which have halted projects due to concerns about the lack of power or infrastructure. There is a great need for new data centers and better Tier 3 and Tier 4 data centers to accommodate the pace of new technologies—but until solutions to the looming power crisis are found and pursued, data centers can only scale in a limited way.
Higher energy consumption often correlates with increased carbon emissions, especially in regions reliant on fossil fuels, raising environmental concerns. As a result, governments are starting to place limits on data center energy usage. In the EU, recent regulations introduced new carbon emissions restrictions for data centers larger than 500 kW. These restrictions, coupled with corporate sustainability goals and benchmarks, pose another hurdle for companies that want to build data centers.
While data centers face significant challenges around power access and usage, many are implementing mitigation methods to keep things running smoothly. Let’s look at a few of these solutions:
Liquid cooling and other innovations can prevent the overheating that triggers CPU/GPU throttling. A few solutions in this sphere include direct-to-chip liquid cooling, which applies cold plates to components that generate heat, and immersion cooling, an effective approach for high-density computing in which servers are submerged in dielectric fluid. Rear-door heat exchangers also use liquid to absorb heat from exhaust air around the server racks. Other cooling systems for data centers include advanced air cooling, airflow management, and free cooling, which relies on outside air and water when the facility is located in a cold climate.
While these methods improve the efficiency of cooling systems, they do not significantly reduce overall energy consumption. As computing workloads increase, especially with the rise of AI and high-density computing, the total energy demand continues to grow, outpacing the savings achieved through improved cooling techniques.
AI-driven energy optimization can balance loads and reduce unnecessary throttling. Optimization requires real-time monitoring of power consumption and automatic adjustment of power distribution based on pre-defined triggers and policies. These criteria might include power capping and dynamic voltage and frequency scaling (DVFS). Intelligent power management systems can also provide data-driven insights such as predictive analytics to help operators optimize capacity planning and improve efficiency. Ultimately, a successful management system helps to boost reliability and minimize downtime.
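The power-capping and DVFS policy described above can be sketched as a monitoring loop that steps servers through voltage/frequency operating points. This is a simplified illustration; the cap, readings, and operating points are hypothetical, not any vendor's management API.

```python
# Minimal sketch of a power-capping policy using DVFS operating points.
# All values are hypothetical, for illustration only.

POWER_CAP_W = 400.0          # per-server power budget (hypothetical)
VOLT_FREQ_STATES = [         # (volts, GHz) DVFS operating points
    (1.10, 3.0), (1.00, 2.4), (0.90, 1.8), (0.85, 1.2),
]

def apply_power_cap(state_idx: int, measured_watts: float) -> int:
    """Pick the next DVFS state: step down when over the cap,
    step back up when there is comfortable headroom."""
    if measured_watts > POWER_CAP_W and state_idx < len(VOLT_FREQ_STATES) - 1:
        return state_idx + 1            # lower voltage/frequency
    if measured_watts < POWER_CAP_W * 0.8 and state_idx > 0:
        return state_idx - 1            # headroom: restore performance
    return state_idx

# Example: two readings above the cap push the server down two states,
# then a low reading lets it recover one.
idx = 0
for watts in (430.0, 415.0, 310.0):
    idx = apply_power_cap(idx, watts)
print(VOLT_FREQ_STATES[idx])
```

In a real system this loop would run against live telemetry, and the predictive analytics mentioned above would tune the cap and the recovery threshold rather than leaving them fixed.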
An intelligent power management system can also help data centers achieve better load balancing. Distributing workloads efficiently across multiple servers can prevent any single machine from being overloaded. Power management software acts as a traffic director, applying AI-powered algorithms to quickly determine the best way to distribute traffic between servers at a given moment. Constant usage monitoring and load-balancing efforts can help data centers optimize their resource utilization and performance.
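At its simplest, the "traffic director" role described above is a scheduling decision: send the next request to the least-loaded server. A minimal sketch, using a least-connections policy with hypothetical server names; production balancers also weigh CPU, memory, and health checks.

```python
# Minimal sketch of least-connections load balancing (illustrative only).

def pick_server(active_connections: dict[str, int]) -> str:
    """Route the next request to the server with the fewest active connections."""
    return min(active_connections, key=active_connections.get)

# Hypothetical snapshot of current load across three racks.
servers = {"rack-a": 12, "rack-b": 7, "rack-c": 21}
target = pick_server(servers)
print(target)             # rack-b carries the lightest load
servers[target] += 1      # record the new connection against it
```

Running this decision continuously against live usage metrics is what keeps any single machine from being overloaded.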
Small modular reactors (SMRs), a type of compact nuclear reactor, may be an option for providing reliable, cleaner onsite energy to data centers. Microsoft and Amazon, among others, are harnessing nuclear energy to supply power to data centers in the eastern U.S. Nuclear power generates no carbon emissions during operation, and SMRs promise reliable baseload power to meet the growing energy demands of data centers. Because of their smaller size and built-in safety features, SMRs pose fewer safety risks than traditional nuclear reactors, but adopting the technology may still prove challenging because of regulatory compliance concerns, public perceptions of nuclear energy, and the issue of waste disposal.
Data centers are also incorporating energy-efficient components that reduce power consumption. These include low-power servers designed to balance performance with energy usage and programmable data planes that allow for more efficient data processing, and systems that re-use waste heat. Despite these improvements, the overall increase in data processing and storage needs often leads to a net rise in total energy consumption.
Many data centers already use renewable energy sources, such as on-site solar installations. They may also have partnerships with energy providers to purchase power from renewable sources, reducing their reliance on fossil fuels. While these sources help data centers move toward more sustainable operations, they typically provide only a fraction of the total energy data centers require.
While IBM reported that 74% of the electricity consumed in its data centers came from renewable sources in 2023, the International Energy Agency noted the overall proportion of power generated by renewable sources remains relatively small, accounting for just 5% of the total electricity mix.
Solutions like e2Companies’ Virtual Utility can help mitigate throttling risks, ensuring a stable and scalable power supply. Traditional uninterruptible power supplies (UPS) provide instant backup power to data halls but are ill-suited to the increased power density required for liquid-cooled chips in an AI-driven, high-compute environment. The R3Di® system is a state-of-the-art solution for data centers looking to achieve energy independence.
It locates the UPS outside the data halls, freeing space for liquid-cooled AI chips to enhance processing capacity while instantaneously meeting volatile power demand. For data centers that must keep pace with the hyper-dynamic processing demands of next-generation AI chips such as those from Nvidia, AMD, and Intel, e2 provides a state-of-the-art marketplace solution.
Beyond just solving today’s power constraints, the R3Di® System future-proofs data centers, providing grid independence, instant scalability, and 24-hour monitoring and energy optimization.
As data centers scramble for solutions, only those that rethink energy at its core will stay competitive and operational.
Are you ready to take control of your energy future? Let’s talk. Schedule a discovery call today.