
How Power-Hungry Are Data Centers? | Energy Efficiency in the Cloud | Intel Technology


Do you know how much energy data centers consume?

In this What That Means video, Camille talks with Lily Looi, Intel Fellow. They discuss just how much energy data centers consume and why improving their energy efficiency is so crucial.

How Much Energy Data Centers Consume

Data centers consume a sizable portion of the world’s electricity, now up to 4%, compared with just 1% five years ago. This growth is expected to continue, driven by more powerful AI, cloud computing and cloud storage for business, and the many new technologies that rely on data centers to process data. As demand for data centers grows, so does the energy they consume, and so does the need to make that energy use more efficient.

Lily breaks down data center energy consumption into three main pieces: cooling, powering the servers, and running the platforms. Cooling here means both air conditioning for the data center buildings and cooling for the individual servers. Software also plays a role in how efficiently servers and platforms run. As data centers take on heavier processing workloads, they need more energy to operate.
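To make that breakdown concrete, here is a minimal back-of-the-envelope sketch in Python. The numbers are purely illustrative assumptions, not figures from the episode; it just splits a facility's draw into the three buckets Lily describes and derives a PUE-style overhead ratio.

```python
# Hypothetical example only; real facilities vary widely.
it_load_kw = 1_000          # power drawn by the servers themselves
cooling_kw = 350            # building air conditioning plus per-server cooling
platform_overhead_kw = 150  # power delivery, networking, lighting, etc.

total_kw = it_load_kw + cooling_kw + platform_overhead_kw

# Power Usage Effectiveness: total facility power / IT power.
# 1.0 would mean every watt goes to compute; less overhead pushes it toward 1.0.
pue = total_kw / it_load_kw

print(f"Total draw: {total_kw} kW, PUE: {pue:.2f}")
# -> Total draw: 1500 kW, PUE: 1.50
```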

Why Energy Efficiency in Data Centers Matters

Data centers have become an integral part of modern technology, and their role will only grow. Because of this, engineers have to get creative about energy consumption and energy efficiency. Ways data centers can already save electricity include turning servers off when they are not in use, dialing back server power when full utilization is not needed, and load balancing so servers and data centers run when renewable energy is readily available. Cooling innovations in data center buildings and liquid or immersion cooling for servers can also improve energy efficiency.
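The sketch below illustrates those kinds of power-saving decisions in Python. It is an assumption-laden toy policy, not Intel's implementation: idle servers are powered off, lightly loaded servers are power-capped, and work is allowed to run at full power when renewable supply is available.

```python
def server_power_state(utilization: float, renewables_available: bool) -> str:
    """Pick a coarse power state for one server from its utilization (0.0-1.0).

    Hypothetical thresholds chosen only to illustrate the idea.
    """
    if utilization == 0.0:
        return "power off"        # no work, so no reason to burn idle power
    if utilization < 0.5 and not renewables_available:
        return "low-power cap"    # throttle rather than run flat out on grid power
    return "full power"


# Example: three servers at different loads while renewable supply is scarce.
for util in (0.0, 0.3, 0.9):
    print(util, "->", server_power_state(util, renewables_available=False))
```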

Making data centers more energy efficient brings several benefits: lower operating costs, better performance and efficiency in the data center, and a reduced carbon footprint. Reducing that footprint takes a combination of optimizing energy efficiency in hardware, software, individual chips, platforms, and the data center as a whole. Lily emphasizes that the goal of these innovations is not to stop using data centers but to ensure energy efficiency is prioritized for better performance and sustainability.

Lily Looi, Intel Fellow


Lily Looi is currently an Intel Fellow in the Data Platform Group. She is also the Chief Power Architect of Intel’s Xeon product line. Lily’s history with Intel spans more than three decades, from her first role as a Component Design Engineer, through numerous senior and principal engineering positions, to today. She holds more than 60 patents and earned an engineering degree from the University of Michigan, Ann Arbor.

Check it out. For more information, previous podcasts, and full versions, visit our homepage.

To read more about cybersecurity topics, visit our blog.

#energyefficiency #datacenter #cloudcomputing

The views and opinions expressed are those of the guests and author and do not necessarily reflect the official policy or position of Intel Corporation.

—–

If you are interested in emerging threats, new technologies, or best tips and practices in cybersecurity, please follow the InTechnology podcast on your favorite podcast platform: Apple Podcasts or Spotify.

Follow our hosts Tom Garrison @tommgarrison and Camille @morhardt.

Learn more about Intel Cybersecurity and Intel Compute Life Cycle (CLA).