InTechnology Podcast

#27 – What That Means with Camille: Carbon Neutral Computing

In this episode of What That Means, Camille dives into the topic of carbon neutral computing with Director of Strategy and Business Development in the Server Group at Intel, John Miranda.


The conversation is a fascinating one, and covers things like:

•  The three R’s of sustainable computing


•  The unpredictability of supply and demand + the problems it poses for renewable energy


•  Why you can’t store renewable energy


•  What some companies are doing to try to reduce their energy footprints


•  How a carbon-aware laptop might operate to use more green energy and less fossil fuel


•  How time shifting can help with sustainability, while saving companies and consumers money


•  Space shifting


…and a lot more. This is one fascinating convo you don’t want to miss. Have a listen!


Here are some key take-aways:

•  In the future, our laptops and other devices in the home may be carbon-aware and able to assertively reduce fossil-fuel use and increase renewable energy use.


•  One of the challenges in sustainable compute is that with renewable energy, there’s less consistency and predictability in terms of supply and demand.


•  At scale, it’s too expensive to store renewable energy. But we can use time shifting and space shifting to ‘chase’ the sun and wind.


•  The three R’s of sustainable compute: reduce, reuse, recycle.


Some interesting quotes from today’s episode:

•  “Because at the end of the day, it’s not how much energy compute requires; it’s how much fossil energy does compute burn. ‘Cause that’s what’s creating the carbon footprint.” 


•  “As you make the grid substations more kind of intelligent, if you will, you can start forecasting weather conditions, and you can start empowering IoT devices, data centers, and so on and so forth, to understand what are the upcoming energy conditions — where they can then optimize their operations accordingly.”


•  “It’s use it or lose it. If you can’t generate demand, the grid cannot accept energy.”


•  “Imagine the idea that you can get paid to use power at certain times of the day. So now, if you can make your operations more agile and carbon-aware, it can translate into an OPEX savings.”



Camille: Hello, everybody. On today’s show we are going to be discussing Sustainable Computing–and carbon neutral is one element of this. Joining us today is John Miranda. He’s Director of Strategy and Business Development in the Server Group at Intel. And he has a background in computer science and business.
He’s actually a guest lecturer at the University of Arizona’s Eller Business School. And his job is to understand and track market and technology trends that will affect the high-tech and compute ecosystem, to uncover strategic opportunities and gaps. And then he works with Intel’s executive leadership to shape those investments.
So John, we would love to have you give us a definition in under three minutes of what sustainable computing is.

John M: Sure. And thank you for having me. So when we think about sustainable computing, the way we frame the conversation is we’re thinking about the three Rs, which are reduce, reuse, and recycle.
So for reduce, the examples there are compute that reduces the carbon footprint of running its operations, right? It uses less electricity, or you reduce the amount of water. So it’s reducing what’s used to support computation and the industry.
Reuse is the notion that you’re trying to extend the life of the things, you know, there’s embodied carbon in producing a laptop or producing something. So if you can extend the life of that device and give it a secondary life, then you’re also making compute more sustainable because that asset is performing longer for you.
And then at the end of the life is recycle. It’s thinking about things like a circular supply chain; it’s thinking about things like removing toxic ingredients from motherboards and compute equipment that make it harder, and add friction, to recycle and disposition those assets, so that, thinking about a circular supply chain, that end of life can be new materials for some new part, let’s say. So in terms of definition, those are the three camps that we broke it down into.

Camille: So reduce, reuse, recycle the same three things that are written on my, uh, Trader Joe’s grocery bag is how we’re breaking down the compute sustainability industry. I like it. I can, I can rock that.

Let’s dive a little deeper. Can you paint us a landscape of what’s going on in the energy space right now, before we think about sustainability within it?
John M: Yeah. So, you know, there’s projections that a material amount of the total global energy requirement will be to drive compute across the world. And that footprint is growing. Many of the largest technology companies are starting to realize that we need to flatten that curve to use a current term and think about ways on how can we reduce that energy footprint.
On the plus side, what’s happening is there’s a transformation happening in terms of the electrification of the world. So if you think about, you know, Tesla’s recent success and innovation in battery technology, and the amount of renewable energy being introduced, and the dropping prices of solar and so on. What’s happening in the energy world is we went from a world where you had relatively few large power-generating plants that produce the electricity. The electrons go one way, and then we consume it on the other end, in our homes and businesses.
But increasingly it’s becoming a distributed model where we’re all harvesting our energy with the solar panel on our roof or whatnot. We can store the energy. If you have a Tesla and it’s plugged in in your garage, it’s part of the energy grid at that moment. Right? So we have batteries, we can start storing it. And then of course we can consume it. But one of the challenges is that the traditional power plants produce energy in a very even, predictable way; whereas renewable energy pops up when the sun shines or the wind blows, right, and then it goes away. So you have this kind of variability.
And one of the challenges that we’re starting to see globally is the fact that the energy grid has to keep equilibrium between supply and demand all the time. They have to be equal. So what happens when all of a sudden the sun is shining, but there isn’t demand for that energy? We’re essentially having to strand the energy. It’s essentially unplugging that solar panel, because the grid cannot accept it.

Camille: And there’s no way to, can you not store energy in a battery with the grid?

John M: So at scale, it’s too expensive to store the energy that we’re talking about. Let’s say nationally, in the United States alone, it’d be over a trillion dollars of lithium-ion batteries. So it’s way too expensive. Now, batteries can chip away at it, but it’s not enough.
So you’re thinking, on one side, you’ve got compute using more and more power, and on the other end, it’s getting harder and harder to integrate more renewables onto the grid. And so can you solve two problems at once? That’s a scenario we’ve been thinking about, and it turns out you can.
So for example, in a data center, there’s this notion of time shifting. So figure out what workloads don’t have to run right away, and you defer them to 10 in the morning when the sun is strong, solar panels are producing, but the air conditioners aren’t yet running. So you can actually provide a service to the grid by adjusting when you run those workloads that can be deferred or brought in for that matter.
Some things you’ve got to run right away, but when you’re training a large AI model that might take two weeks, you can choose when to really push your hardware harder. So the continuum of everything around our lives can start becoming more carbon-aware, which in turn will allow greater introduction of renewable energy into our grid, and sustainable energy as a result. Because at the end of the day, it’s not how much energy compute requires; it’s how much fossil energy does compute burn. Right? ‘Cause that’s what’s creating the carbon footprint.
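The time-shifting idea John describes can be sketched as a tiny scheduler. This is only an illustration, not any real system: the carbon-intensity forecast, the workload names, and the deferrable flags are all invented for the example.

```python
# A minimal sketch of carbon-aware time shifting. The forecast (grams of
# CO2 per kWh, by hour of day) and the workloads are hypothetical.

def schedule(workloads, forecast):
    """Run urgent jobs now; queue deferrable jobs into the greenest hours."""
    green_hours = sorted(forecast, key=forecast.get)  # lowest intensity first
    plan = {}
    urgent = [w for w in workloads if not w["deferrable"]]
    deferrable = [w for w in workloads if w["deferrable"]]
    for w in urgent:
        plan[w["name"]] = "now"
    # Pair each deferrable job with the next-greenest available hour.
    for w, hour in zip(deferrable, green_hours):
        plan[w["name"]] = f"{hour}:00"
    return plan

# Mid-morning sun makes 10:00 the greenest slot in this made-up forecast.
forecast = {8: 420, 10: 120, 13: 150, 18: 480}
workloads = [
    {"name": "web-serving", "deferrable": False},   # must run right away
    {"name": "ai-training", "deferrable": True},    # can wait for green power
    {"name": "batch-reports", "deferrable": True},
]
print(schedule(workloads, forecast))
```

The point is simply that the scheduler needs one extra input, a forecast of how green the grid will be, and one extra label per workload, whether it can wait.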
One other toolkit for energy grid operators is to motivate us on time of day. Right? So I’ll give you a better rate if you use your power at 10 in the morning. And so we’re seeing California, for example, refine time-of-use incentives. And in the EU, which has invested heavily in renewable energy–Germany, the UK, other countries–they’ve actually allowed energy prices to go negative. So imagine being told, “if you use electricity tomorrow at 10:00 AM, we’re actually going to pay you.”
Now, imagine you run a data center where your largest opex or your manufacturing plant, where you’re very energy intensive. Imagine the idea that you can get paid to use power at certain times of the day. So now, if you can make your operations more agile and carbon aware, it can translate into an OPEX savings.
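The OPEX effect John is describing comes through in back-of-the-envelope arithmetic. The load size, rates, and hours below are invented purely for illustration; real time-of-use tariffs vary widely.

```python
# Back-of-the-envelope OPEX comparison for a 1 MW facility, assuming a
# hypothetical flat rate versus a tariff with a negative-priced window.

def daily_cost(load_kw, hours_by_rate):
    """hours_by_rate: list of (hours, dollars-per-kWh) pairs covering a day."""
    return sum(load_kw * hours * rate for hours, rate in hours_by_rate)

flat = daily_cost(1000, [(24, 0.12)])                 # same rate all day
shifted = daily_cost(1000, [(16, 0.12), (8, -0.02)])  # 8 h paid-to-consume
print(flat, shifted)  # the day with the negative-priced window is cheaper
```

Eight hours a day at even a slightly negative rate turns part of the bill into revenue, which is exactly the “get paid to use power” scenario in the quote above.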

Camille: That is very interesting and very cool. And then on the other hand, I’m weighing the concern that the giant companies that can afford to, you know, shift their workloads can take advantage of that, but a consumer who really doesn’t have too much choice as to, you know, when the lights are on in their house or when they need to run the heat is kind of stuck with the higher bill. So how’s the industry considering that?

John M: You know, at the consumer level, when time-of-use rate programs are introduced, as they are in certain parts of the country, it actually empowers the consumer to save money. Like, you know, when I run my washer and dryer, whether I run it at one time or another, I’d just as soon achieve the savings if I can. If someone has a pool pump, um, they can adjust it. So there’s a lot of adjustments we can do.
We’re kind of doing it manually today, but I imagine as you fast-forward and think five, ten years from now, this is becoming a bigger and bigger problem in any geography that’s investing in renewable energy. The more they do, the harder it gets to put that asset to work, because of this variability and imbalance. I’m envisioning, as IoT proliferates and our homes become smarter, you know, they’ll have this notion of carbon-aware, and you can start letting the consumer choose when to use power to achieve savings overall, you know, adjusting a lot. I can preheat my house earlier in the day, when it’s more renewable-powered, and then kind of relax the heater later on; or the air conditioner, vice versa.

Um, cool the house a little extra when it’s green energy and then back off when it’s not. That’s an example.

Camille: What kind of changes or evolutions need to take place? I’m thinking, you know, software and hardware, or platforms that are negotiating this whole thing. Relatedly, you mentioned distributed systems, or distributed energy, and of course I go to, you know, distributed ledger technology, or blockchain. Our hardware right now, a lot of it is designed to optimize a balance between performance and energy consumption, so we still have battery life on our laptops and things like that.
What are the hardware and software changes needed in compute?

John M: As we start to harvest, store, and consume energy in a distributed way: the grid substations today are run by bespoke boxes. They’re very customized boxes that perform different functions in regulating that region of the grid, but we can envision them becoming software-defined, with more intelligent analytics layered on top that allow optimization of all these energy flows that are going every which way.
As you make the grid substations more kind of intelligent, if you will, you can start forecasting weather conditions, and you can start empowering IoT devices, data centers, and so on and so forth, to understand what are the upcoming energy conditions, where they can then optimize their operations accordingly.
But there are examples surfacing. For example, if you look at Google, they’ve showcased already applying concepts of what they call “time shifting” in their data centers. So they’re introducing, in their orchestration software, the ability to delineate which workloads need to run right away and which ones can wait; and the ones that wait, they’re running them when their source of input power is at its greenest. And they’re already making significant gains in having their data centers powered by increasingly renewable energy.
They’re also introducing, and they’re talking about, the notion of “space shifting.” If you have two data centers and you have replicated data, and the sun is still shining here but not there, you can direct the workload to run in the data center that’s currently powered by the most renewable energy. So you chase the sun, or you chase the wind. So those are examples in the data center. Um, on the client—

Camille: Client being like a computer or a laptop?

John M: Yeah. Yeah. Thanks for the clarification. On the laptop, for example: a future laptop, if it’s carbon-aware, may know to charge the battery fully at a certain time of day, or maybe when it knows that the green energy is coming from your roof; at that point, use it and be assertive about using it. It’s use it or lose it. If you can’t generate demand, the grid cannot accept energy. It’s literally shunted. So maybe your laptop runs full performance, flat-out, charges the battery, whatever, because that energy is coming from the sun anyway. And at 5:00 PM, if the user wants to set their machine accordingly, it’s more optimized for efficiency; where you still get a good experience, but maybe it’s thinking more in terms of efficiency versus raw performance, as an example.
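A carbon-aware laptop policy like the one John sketches might look roughly like this. Everything here is an assumption for illustration: the grid-mix signal, the 70% threshold, and the two modes are made up, not any shipping feature.

```python
# A toy carbon-aware client policy, assuming the device can read a signal
# for the grid's current renewable fraction. The 0.7 threshold is made up.

def power_policy(renewable_fraction, battery_pct):
    """Pick a charge/performance mode from the current grid mix."""
    if renewable_fraction >= 0.7:
        # Green power is use-it-or-lose-it: top up the battery, run flat-out.
        return {"charge": battery_pct < 100, "mode": "performance"}
    # Dirtier grid: stop charging and favor efficiency over raw speed.
    return {"charge": False, "mode": "efficiency"}

print(power_policy(0.9, 60))   # midday solar: charge, full performance
print(power_policy(0.2, 80))   # evening fossil mix: hold charge, run lean
```

The design choice mirrors the conversation: the device doesn’t need to know anything about the grid except one number, how green the current mix is, to decide when to be greedy with power.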

Camille: That’s very interesting. So you’ve mentioned sun and wind, and I know that there’s probably more discussion and controversy over, um, hydro-power because a lot of times to date that’s built on dams and nuclear power because there could be issues with that.
And I just always had this question. So I’m going to ask you: what about the ocean or, you know, tidal power or even, yeah, just the ocean for cooling? Are we using the–what does it cover? 70% of our earth. Are we using that in any way to help with this?

John M: There have been a couple of examples, and there’s a company out there that’s showcasing using ocean water to cool IT equipment. So they’re basically putting a server on a boat. I believe Microsoft dunked a data center, like a capsule, under the North Sea (or I think it was the North Sea) and let that cold water cool the equipment. So those kinds of models are being tested.
Another one that I think is interesting is around heat reuse. So if you think about a data center, you can have 20, 30 megawatts of power going into the data center. And yes, it’s doing compute, but beyond that, it’s just this one big, low-grade heater. All it does is emit that heat energy, which is a resource that’s being wasted. It vents it into the sky through the air conditioning process. Right?
Well, what if you can start recapturing that heat and doing something with it? So as chip temperatures rise, at some point we’re going to see the introduction and expansion of liquid cooling in the data center. And it can be cold plate, where the water flows on top of the chip but doesn’t come into contact with the chip–or the refrigerant doesn’t have to be water. Or it could be immersion, where you take the whole rack and you dunk it in a dielectric, and then you pump the liquid, which removes the heat much more efficiently than air.
Well, now you have this hot liquid. And so the question is, can you do something with this heat energy? And as you get the liquid past 50 and towards 60 Celsius, if you can get the liquid that hot, all of a sudden the possibility of things you can do with it really starts opening up. There are some nascent examples around the world. The U.S. Department of Energy, outside of Denver, has a supercomputer; they’re using liquid cooling, and it’s heating their labs. IBM worked on a project, I believe in Munich, where they’re heating various apartments and flats. So you’re recovering that energy and reapplying it.
There’s also a clever startup in Paris that is doing distributed high performance computing.
They’re putting these elegant, furniture-looking pieces in individual flats. And then you channel the amount and intensity of workload based on that person setting the temperature in their apartment. So if they’re cold and they dial it up, more work gets sent into that unit. They actually rendered the movie Minions 2 and heated a couple of thousand apartments in the process. So this is high-performance computing. I talked to them briefly; I should have asked the question, but I did not: what happens in the summer? I mean, obviously these use cases lend themselves more to colder climates, or climates where you’d have cooler temperatures, um, around the year.
But there’s also, you know, technology working the other direction. If you can get the water or liquid even hotter–the challenge is you’ve got to get it towards 70 Celsius–you can start enabling absorption chilling, which enables air conditioning. Or there are some startups exploring whether, if you can get to 70-plus Celsius, you can start converting it to electricity.
So there’s a gap that needs to be closed: how hot can you get that exhaust liquid, and how low can those technologies that convert that energy into something useful get, before they can take that power and use it effectively?

Camille: So to me, this really just feels like kind of blue sky thinking and a whole bunch of innovation and people who are interested in all kinds of things and making things work and figuring it out. Are there any major arguments in the space right now?

John M: Well, yeah, one could be regulatory. So if you’re a large data center–a CSP or something–it’s not lost on some of these largest companies that their energy footprint is growing enormously. There was an example, I think it was last year, in 2019: Amsterdam, which is a hub for data centers in the EU, put the brakes on new data center construction. And they had certain things that were concerning them, and one was: we want to see new projects start figuring out how to reuse that heat energy. And by the way, is that going to spread? Is that a nascent signal or not? I don’t know yet, but you’ve got to start thinking about those things. Where’s your industry headed in the next five to ten years?
The agitation on climate is rising, in concern and in regulatory pressure. So I think we’re going to see more actions, and we’re seeing more declarations. You can look at Facebook, you can look at Microsoft, you can look at Google, all the biggest players, Amazon; they’re increasingly putting more focus and more, um, intentional declarations on what they want to do. Microsoft wants to be carbon negative within 10 years. Amazon–Jeff Bezos just declared a $2 billion innovation fund for carbon.
So I think we’re seeing a tipping point in terms of the pursuit, and making some gains here that we didn’t see two, three, four years ago.

Camille: Very interesting. Um, I really want to thank you for your time, John. This has been just a fascinating conversation.
John M: Thank you for having me. This went too fast! I hope, uh, folks enjoyed it and, uh, that there’s a couple of nuggets of insight that came out.

Camille: So yeah, I think we might have to do another one and dive in.

John M: Thank you.

Camille: We are going to dissect more terms in the weeks ahead and for more discussions about cyber security, of course, be sure to catch the next episode of Cyber Security Inside, which is coming your way next week.
