InTechnology Podcast

Uncovering Insights from Listeners’ Top Voices in Sustainability 2023 (182)

In this episode of InTechnology, Camille explores the most popular sustainability topics among our listeners in 2023. Starting off the roundup is Camille’s conversation with Asim Hussain, Director of Green Software and Ecosystems at Intel, where they unravel the mysteries of green software. Next up is Electricity Mapping, featuring Olivier Corradi, founder and CEO of Electricity Maps. Wrapping up this insightful roundup is Energy Efficiency in the Cloud, featuring Lily Looi, an Intel Fellow and the Chief Power Architect of Intel’s Xeon product line.

To find the transcription of this podcast, scroll to the bottom of the page.

To find more episodes of InTechnology, visit our homepage. To read more about cybersecurity, sustainability, and technology topics, visit our blog.

The views and opinions expressed are those of the guests and author and do not necessarily reflect the official policy or position of Intel Corporation.

Follow our hosts Tom Garrison @tommgarrison and Camille @morhardt.

Learn more about Intel Cybersecurity and the Intel Compute Life Cycle (CLA).

What That Means With Camille: Green Software (Ep 137)

When it comes to green software, Asim Hussain believes there are three ways to reduce carbon emissions. The first is energy efficiency, which emphasizes the need to understand and reduce electricity consumption. The second is hardware efficiency, which deals with the embodied carbon in devices. The goal is to make devices that last longer and software that can run on older hardware, reducing the need for constant upgrades and cutting down on waste.

Finally, there is carbon awareness, which involves creating applications that adjust their intensity based on how clean the energy powering them is. Asim says this is one of the easiest ways for organizations to explore green software, since it is more a decision about when and where workloads run than a re-architecting of the application.
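In practice, carbon awareness can be as simple as pausing deferrable work while grid carbon intensity is high and resuming when it drops. The sketch below is a minimal illustration only; the carbon_intensity() lookup and the 200 gCO2eq/kWh threshold are hypothetical placeholders, not values or APIs discussed in the episode.

```python
import time

# Illustrative cutoff, not a standard value
CLEAN_THRESHOLD = 200  # gCO2eq/kWh


def carbon_intensity(zone: str) -> float:
    """Placeholder: return the current grid carbon intensity (gCO2eq/kWh) for a zone.

    In a real system this would query a grid-data provider.
    """
    raise NotImplementedError("wire this up to a real grid-data source")


def run_when_clean(jobs, zone: str = "DK-DK1", poll_seconds: int = 600) -> None:
    """Work through `jobs` (a list of callables), pausing while the grid is dirty."""
    pending = list(jobs)
    while pending:
        if carbon_intensity(zone) <= CLEAN_THRESHOLD:
            job = pending.pop(0)
            job()                      # clean power available: do the work now
        else:
            time.sleep(poll_seconds)   # dirty power: wait and re-check
```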

Listen to the full episode (EP 137) – WTM: Green Software with Asim Hussain.

How Green Is Your Electricity? | Electricity Maps (EP 147)

According to Olivier Corradi, Electricity Maps seeks to answer key questions about the electricity used in an area: where it comes from and what carbon emissions are associated with consuming it at a specific time and place. To truly decarbonize the electricity grid, Olivier says it's important to use low-carbon energy sources everywhere at all times. However, the intermittency of renewable energy sources like wind and solar makes this challenging.

Olivier envisions a future where flexible devices, such as dishwashers, can be automated to schedule their operation during periods when renewable energy is available, thereby contributing to a more stable grid. Commercially, electricity mapping can be valuable in use cases like grid-connected batteries and data centers with significant electricity demands. Olivier suggests, for instance, that data centers can act like virtual batteries by scheduling energy-intensive AI tasks during periods when renewable energy is abundant.
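As a rough sketch of that virtual-battery idea, a deferrable workload such as an AI training run could be slotted into the lowest-carbon window of an hourly intensity forecast. The helper below is illustrative only; the fetch_forecast call in the comment is a hypothetical placeholder rather than a real API.

```python
from datetime import datetime
from typing import List, Tuple


def cleanest_start(forecast: List[Tuple[datetime, float]], duration_hours: int) -> datetime:
    """Pick the start hour that minimizes average carbon intensity over the job's duration.

    `forecast` is a list of (hour, gCO2eq/kWh) tuples, one entry per hour,
    e.g. the next 24 hours for the grid zone the data center sits in.
    """
    best_start, best_avg = None, float("inf")
    for i in range(len(forecast) - duration_hours + 1):
        window = forecast[i:i + duration_hours]
        avg = sum(intensity for _, intensity in window) / duration_hours
        if avg < best_avg:
            best_start, best_avg = window[0][0], avg
    return best_start


# Example: schedule an 8-hour training job against a (hypothetical) 24-hour forecast
# forecast = fetch_forecast(zone="US-CAL-CISO")   # placeholder, not a real API call
# start_at = cleanest_start(forecast, duration_hours=8)
```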

Listen to the full episode (EP 147) with Olivier Corradi – How Green Is Your Electricity?

What That Means With Camille: Energy Efficiency In The Cloud (148)

In this episode, Lily Looi breaks down energy consumption in servers and data centers. She says roughly a third of the power goes to cooling, another third goes to powering the servers and CPUs, and the final third goes to the broader platform, such as networking and storage. This breakdown points to different opportunities to optimize consumption in data centers.

To optimize energy usage for the CPU, Lily suggests turning off unused cores and running the system at lower frequencies during periods of light usage to conserve power. For cooling the CPU and the data center, she recommends more efficient fans, liquid or immersion cooling, better airflow, and smarter air conditioning.
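On a typical Linux server, both of those levers are exposed through standard sysfs interfaces: CPU hotplug and the cpufreq scaling governor. The snippet below is an illustrative sketch rather than guidance from the episode; exact paths, available governors, and permissions vary by kernel and frequency driver, and the writes require root.

```python
from pathlib import Path

CPU_ROOT = Path("/sys/devices/system/cpu")


def offline_cpu(n: int) -> None:
    """Take core n offline, the software equivalent of turning off the lights.

    Requires root; cpu0 is typically not hot-pluggable.
    """
    (CPU_ROOT / f"cpu{n}" / "online").write_text("0")


def set_governor(n: int, governor: str = "powersave") -> None:
    """Switch core n to a lower-frequency policy for periods of light usage."""
    (CPU_ROOT / f"cpu{n}" / "cpufreq" / "scaling_governor").write_text(governor)


# Example (run as root): park cores 8-15 and put the remaining cores in powersave
# for n in range(8, 16):
#     offline_cpu(n)
# for n in range(8):
#     set_governor(n, "powersave")
```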

Taking a broader view of energy efficiency, many major data centers strategically choose locations with abundant, affordable renewable energy. As Lily points out, this includes leveraging solar power in sunny regions and scheduling work around periods of high renewable-energy availability.

Listen to the full episode (EP 148) – WTM: Energy Efficiency In The Cloud with Lily Looi.

Asim Hussain, Intel Director of Green Software and Ecosystems


Asim has over 20 years of experience in software development and sustainability, preceded by a degree in computer science from the University of Leeds. His exciting career path has included global organizations such as the European Space Agency, JPMorgan Chase, Morgan Stanley, and Microsoft. In 2022, Asim joined the Intel team to lead Intel’s Green Software Strategy and to grow the Green Open Source Ecosystem. Asim also co-founded the Green Software Foundation in 2021 and now serves as Chairperson and Executive Director.

Olivier Corradi, Founder and CEO of Electricity Maps


Olivier Corradi is the Founder and CEO of Electricity Maps, a company that organizes the world's electricity data and drives the transition towards truly decarbonized electricity. He formerly worked as a Guest Lecturer at the Technical University of Denmark (DTU). Before this, he worked at notable organizations such as Connected Cars, Google, and IBM Zürich Research Lab. Olivier holds a BSc in Mathematics and Technology and an MSc in Industrial Mathematics (Elite Program) from DTU. He also has an MSc in Mathematics, Management, Marketing & Economics from CentraleSupélec.

Lily Looi, Intel Fellow and Chief Power Architect of Intel’s Xeon product line


Lily Looi is an Intel Fellow in the Data Platform Group, as well as the Chief Power Architect of Intel’s Xeon product line. She has been with Intel for over 30 years, starting as a Component Design Engineer in 1990 and serving in several engineering roles. Lily has an engineering degree from the University of Michigan and holds over 60 patents.


Camille Morhardt   00:12

Hi, I'm Camille Morhardt, host of the InTechnology podcast. As we head toward the close of 2023, we're going to revisit a few of our most popular listener topics. Today we're going to talk about sustainability. We're gonna look at three of your favorite conversations on this topic from the past year: one on green software, one on mapping electricity use, and the last one on energy consumption (and reducing energy consumption) in the data center.

Let's start with green software. You might think this is software to make the world more sustainable, for example by helping farmers grow food more sustainably. There is definitely a role for that. But green software really focuses on reducing the emissions that software itself is responsible for.

So back in January, I spoke with Asim Hussain, Director of Green Software Engineering at Intel and also co-founder of the Green Software Foundation. He said factoring in software emissions is a shift for many software developers, but it's an important one.

Asim Hussain  01:14 

There's really only three ways of reducing carbon emissions when it comes to software. The first is energy efficiency. For me, this was actually quite a surprise. I've been in the software development space for two decades now, and I never really knew how electricity was made. What are the components of it? How is it bought and sold? How does it differ country by country? Which I now find kind of an almost ridiculous position to be in, because fundamental to everything that I do is the consumption of electricity.

You need to understand, if you want to be a green software practitioner, you need to understand what electricity is all about. And the main issue is that electricity is the single biggest emitter of carbon emissions in the world. About 80% of all the world's electricity is still made through the burning of coal. And we hear a lot about renewables and resources, but most of the world burns coal. And in fact, they burn very dirty coal. We used to burn much cleaner coal, and then about 20 years ago, people discovered that cleaner coal causes acid rain. And so everybody kind of flipped over to burning a much dirtier coal that has much greater emissions.

So, just looking at it from that perspective, we call energy a proxy for carbon, because you can draw a straight line between energy and carbon emissions. And therefore, if your goal is to be carbon efficient, you must also be energy efficient.

The other angle we talk about is called hardware efficiency, because there's also what we call embodied or embedded carbon. So looking at this mobile phone: it emitted carbon when it was manufactured; all the little components, the case, the chips, all of it emitted carbon. And it will also emit carbon when it's disposed of, even very responsibly. All of that is called the embedded or embodied carbon. If you're an application developer, what do you do with that information? The carbon is already out there. What's your responsibility there?

And then we talk about this idea of hardware efficiency, which is: use the least embodied carbon possible. For most end-user devices, that's all about increasing the usable lifespan of the device. So this is my old phone, and I was forced to upgrade it to my new phone not because there's anything wrong with it. It didn't break. It's only three years old, but the software I needed to use stopped working on it. And that's called software-driven obsolescence. That's one angle: as a software person, you can make your software run on older hardware and therefore reduce the pressure that's constantly there to keep on buying new devices.

And there's a final one, which is called carbon awareness. Somebody said it in a really interesting way recently, which I love: energy efficiency is about using less energy, and carbon awareness is about using the right energy. So carbon awareness is about how you create software that does more when the electricity is clean and less when the electricity is dirty. For instance, I live in a country which has a fairly decent amount of wind power. If you can build software which does more when electricity is coming from cleaner sources, then when the wind stops blowing and the sun stops shining, can you make your software do less? And that's one way in which you can reduce your carbon emissions.

There's a huge amount of interest in carbon awareness right now. And one of the reasons people are really excited about it is it's actually one of the easiest ways for organizations to explore green software, because you have to change some things, but it's much more a decision about when and where you run things rather than re-architecting an application, which is quite an involved process.

Camille Morhardt   04:50

That was Asim Hussain, Director of Green Software Engineering at Intel, and co-founder of Green Software Foundation.  In our full conversation we also talked about tools to help developers and companies create and adopt green software.  Check out the show notes for the link to that full episode.

We just heard Asim Hussain talk about developers needing to know about electricity consumption. But where do they get that information?  Well, turns out another popular episode from 2023 helps answer that question.  Back in April, co-host Tom Garrison and I spoke with Olivier Corradi, founder and CEO of Electricity Maps.  The company actually organizes electricity data by grid with a mission to drive the transition to a decarbonized electricity system.

As the name of Olivier's company suggests, the Electricity Maps platform "maps" electricity usage globally. And we wanted to talk with Olivier because he's a perfect example of an applied use of software that can actually help people build systems differently and move towards sustainability.

Olivier Corradi   06:04   

The question that we're answering really is to say, "when you are using electricity at a particular time in a particular place, where did it physically come from? And what are the carbon emissions associated with your consumption at that moment?" When you click on a particular area, you can then see through a graph how much of the electricity is being produced locally and what these sources are; you know, it could be hydro, could be coal, could be gas, could be other things. It also looks at the imports and exports of electricity, because at a particular place, you can either produce the electricity locally or you can get it from neighboring areas, right? And these neighboring areas can also get it from other neighboring areas. But you can see all this in the map, and then all of these sources mixed together. And if you look at the mix of electricity, you can derive from it the carbon emissions that you would cause, essentially, by consuming electricity at a particular time in a particular area.

Camille Morhardt  07:02

Are you applying this in a kind of predictive kind of manner, so that you can presume which way, which kind of utilization is kicking in at any time for the resource?

Olivier Corradi  07:13 

Yeah, so some of our customers are using it for granular carbon accounting; so we're looking at the historical information in order to make sense of "what are my carbon emissions from the electricity usage of my data centers or my factories, or other things?" But you are correct, we also have a predictive element to it. My vision is that, you know, if we want to truly decarbonize the electricity grid, we will need to make sure that we can use low-carbon energy sources everywhere at all times. And right now the intermittency problems of wind and solar mean that we need to figure out what we do when the wind is not blowing or the sun is not shining. And so your dishwasher, for example, plays its part by getting a signal that, you know, in a couple of hours there's gonna be wind, but right now there's none. So maybe it should actually wait a couple of hours before it schedules itself. And if we could get that automation level built into any flexible device or appliance, we would really be helping the grid deal with this fluctuation of renewable electricity.

Tom Garrison  08:17

I’m just envisioning like a commercial user.  You said before, you’re using the information now just to do accounting sort of in the rearview mirror. Where did the electricity that we used come from?  That makes sense. The future use-case, though, do you see like batteries being used? The batteries during non-wind generation or sun, whatever, you rely on batteries and then when you do have sun and wind, you charge the batteries?  Is that kind of part of your vision or no?

Olivier Corradi  08:49

Yeah, absolutely. So if you look at more impactful use cases, such as, you know, grid-connected batteries, for example, then obviously you're gonna step into the next level of actionability on the grid and you're gonna help the grid on a different level. Now, an example that I like to showcase as well is to look at data centers. Data centers use so much electricity globally, and they have some flexibility. There are so many AI jobs that are trained and require so much electricity. These jobs can be scheduled with some flexibility. And so if your data center is starting to use electricity at the right time, at the right place, what you're doing essentially is having sort of a virtual battery, because you're avoiding using electricity at a particular time, and you're displacing that amount of electricity usage to a different time. So in a sense, you're actually storing that electricity in a virtual sense.

Camille Morhardt  09:41

That was Olivier Corradi, founder and CEO of the company Electricity Maps.

And in this final episode of the 2023 roundup on sustainability, we're going to look at addressing energy consumption in servers and data centers, where lots of servers get together to make a cloud. It makes sense there would be a lot of listener interest in the topic. With the increase in cloud computing since the pandemic, there's been more in the news about how much electricity data centers consume to operate the cloud. Add to that the explosion of AI tools and data-intensive LLMs that have come about in 2023; those require a huge amount of data center processing power, and a demand for more electricity along with it.

Lily Looi  09:54 

So 4% of global electricity is being used by data centers today; and that's up from 1% maybe five years ago. So it's growing.

Camille Morhardt  10:39

That’s Lily Looi.  She’s an Intel Fellow as well as Chief Power Architect of Intel’s Xeon product line. We spoke back in April about what’s being done to reduce energy consumption in data centers.

Lily Looi  10:51 

Most people that I’ve talked to are in favor of reducing our carbon footprint through more efficient use.  You can optimize the hardware, you can optimize through software; you can optimize on an individual chip or at the platform level or at the whole data center level. There’s many different ways to optimize, and depending on your point of view, you’ll optimize that piece. 

So if you look at a data center, it's got a big building, they have to cool it, air conditioning, all that; that is about a third of the data center's power. The rest of the power goes to the many, many servers that are in that data center. The CPU and the cooling of the CPU is another third of the power, and then the final third is for the remainder of the platform, so the networks and the storage.

Camille Morhardt  11:42

Well, take us through the CPU and the cooling of the CPU. Like, what is being done on that level to help address that or optimize power consumption?

Lily Looi  11:50 

Okay, sure. So a CPU is very complex. In a data center, each CPU has lots and lots of cores, and it has a memory interface and an IO interface and lots of cache. So it's very complex. And the thing is, first, you want it to be running most efficiently while it's in full operation. And for that, there are circuit techniques and there are process techniques; but a lot of the time, pieces of it are not being run at full speed or maybe not even being used. For example, you may not be using every single core all the time. So one of the very common ways to save power is to turn it off if you're not using it. Just like in your house, if you're not using the lights in a room, you turn off the light. So that is one power-saving method.

Another power-saving method is, if you're using it lightly, turn it down. So instead of running it at full frequency, maybe run at a lower frequency. Now, with all of these power-saving techniques, of course, you have to be careful not to just turn things off and turn things down to the point where it takes a while to turn them back on. It has to be weighed against whatever kind of performance impact there could be. You want the intelligence to know when it's okay to turn it off and when it's not. So like, you just left your room for a minute to go grab something and come back; you may wanna leave the light on.

Camille Morhardt  13:09

Okay. And then, uh, cooling was another third. Do you know how they reduced that? 

Lily Looi  13:16 

So there are two pieces of cooling. One is the cooling of the CPU, and so that gets lumped into that third. And there are different types of cooling. Like, you could use bigger, more efficient fans that would cool at a lower power, or liquid cooling, which is starting to come out. Or even, you know, the most extreme, immersion cooling. These are very efficient cooling methods for the CPU.

And then there's cooling at the data center level, which would be air conditioning, and that would mean having more efficient airflow in the room or more energy-efficient air conditioning. I have seen data centers where the airflow in the room is actually directed so that the cool air blows through and then out through a vent.

Camille Morhardt  14:02

Well, a lot of the large data centers are located in like physical geographic locations that have inexpensive, like renewable energy.  And that’s one way that they’re trying to–

Lily Looi   

Yeah. Yeah, that's true. I mean, you're still using the same amount of electricity, but it's green electricity. So for example, somewhere with a lot of sun, and then have solar. Even on that front, it's still the same amount of electricity, but it's not the same carbon that you're putting in the air. And then there's software orchestrating the data center and making it more efficient: if software knows the time of day, then it'll know that, oh, the data centers in this time zone have the most sun right now, so I'll schedule more work over there.

Camille Morhardt  14:50

… that takes us back to green software, and full circle on this special episode of InTechnology focused on our most popular episodes on sustainability in technology. You just heard from Lily Looi, Intel Fellow and Chief Power Architect of Intel's Xeon product line. She had much more to say about tackling energy consumption in data centers, so check out the full episode if you're interested. You'll find links to all three episodes we featured in this one in the show notes.

I'm Camille Morhardt. Thanks again for joining me today. In the coming weeks, be on the lookout for more year-end roundup episodes. We'll be featuring your favorite InTechnology podcast discussions on cybersecurity as well as artificial intelligence.

 
