Environment Variables
Fact Check: Colleen Josephson, Miguel Ponce de Leon & AI Optimization of the Environmental Impact of Software
May 31, 2023
In this episode of Fact Check, we ask the question: can AI always help us optimise the environmental impact of software? Host Chris Adams is joined by VMware's Colleen Josephson and Miguel Ponce de Leon to tackle this from their unique perspectives within the industry. They also talk all things sustainability in virtualization and networking, and how this begins with green software, and give us insight into how VMware is tackling decarbonization within their own company.

Learn more about our people:

Find out more about the GSF:


If you enjoyed this episode then please either:

Transcript Below:
Miguel Ponce de Leon: And the thing is, we're right in this maelstrom, this tornado of activity that's just got underway, and just seeing how they fit together, it's not a perfect fit, I would say. I couldn't give you "this is exactly the time horizon and this is how it's gonna happen," but I can tell from the level of funding, from governmental agencies, from companies themselves, from research institutes to lots of public bodies and developers in their own time.

It's a great time to be in and around this space of developing software, but specifically for the delivery of green technologies as we see it.

Chris Adams: Hello, and welcome to Environment Variables, brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show, you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software.

I am your host, Chris Adams.

Hello and welcome to Environment Variables, where we bring you the latest news and updates from the world of sustainable software development. I'm your host, Chris Adams, and in this episode we have two very special guests for an episode of Fact Check on Environment Variables. From VMware, we have Colleen Josephson and Miguel Ponce de Leon. Hey guys.

Miguel Ponce de Leon: Hey Chris. Hey.

Chris Adams: Okay, so for our listeners who are unaware, VMware has been a member of the Green Software Foundation since January 2022, and we'll be talking a little bit about AI and the environmental impact of cloud. We figured VMware has spent a lot of time working in this field. In fact, they're synonymous with virtualization, but they also work in a number of other fields.

One in particular is networking. So this is where our backgrounds come from. Before we dive in, let's actually give our guests a chance to introduce themselves properly, so we know who they are. Miguel, if you just introduce yourself and what you do, we'll hand over to Colleen afterwards.

Miguel Ponce de Leon: Very good. So my name is Miguel Ponce de Leon. I'm director of Distributed Edge Intelligence in the Office of the CTO here at VMware. It's very much looking at research and innovation that happens from cloud towards enterprises and towards telecommunications. So networking, which is a topic we're gonna talk about here, and very much looking at the impact that the edge of the network, sustainability, and connectivity to the cloud have had on our products and services.

Chris Adams: Thank you, Miguel and Colleen. I'll give you a bit of space to introduce yourself as well. For folks who missed your inaugural podcast last year.

Colleen Josephson: Thank you very much. Yes, I'm Colleen Josephson. I'm a senior research scientist at VMware, on the same team as Miguel, and a bit of news is that I'm actually transitioning to a full-time academic position at UC Santa Cruz, where I'll be continuing to research sustainability, particularly in the space of low power and distributed systems.

Last time I was a guest on the podcast, I was the org lead for VMware, and I'm very pleased to share that I've passed the torch on to Miguel. He's very qualified and excellent, so I'm really excited to be doing this podcast today with him.

Chris Adams: Thank you, Colleen. All right, so for folks who are new to this format, Fact Check is a format we use where we take a statement that people put into use in discussions around sustainability and software, and we dive into it a little bit more to examine some of the assumptions, because a lot of the time it really helps to understand some of the nuances behind it.

And today, the fact check statement we are looking at is this one: can AI always help us optimize the environmental impact of software? And Miguel, I might invite you to talk a little bit about this part first, actually, because this is something I think you've heard at least one time before.

Miguel Ponce de Leon: More than one time, actually. Just as part of the introduction, to say: look, I've been in and around telecommunication systems over the last 25-odd years, from analog systems to 2G, 3G, 4G, 5G. I know all these Gs that people hear about, but essentially it is about optimizing the way the networks are going to be deployed and used in the future. Very much we're seeing a huge uptick in the use of artificial intelligence and machine learning to help with this optimization, and it is true to say that it can help. But I think there are a number of factors to consider in order to ensure that, across what you use for putting the model together, for deploying that AI model, and for optimizing the network, you are getting the totality and that end-to-end picture of sustainability and energy usage from all of it.

I think there are a lot of factors that we need to dig into when we discuss that.

Chris Adams: Okay, so we've got a series of factors here, and maybe if we start with one or two dimensions or key things that have an influence here, that'll help us frame some of our discussion. Colleen, maybe if I hand over to you: if there's one of the bigger levers you would pick, what might it be?

Colleen Josephson: Training. Training is a very expensive process nowadays, whether it's in the telco space or elsewhere, because again, VMware, as you hinted, has a long history of virtualization in cloud, but that has also become very relevant to telecommunications. We need to train the models that we use to make these decisions to try and save energy, and the process of having so much data and training on it can be really power-consuming.

So I think one of the things that stands out to me is: what is your anticipated energy savings once you've deployed this model? How long do you anticipate that this model will be good for, and do you need to retrain it? You want to have some idea of all of those, so that you can calculate whether or not it was worth the energy to train this model in the first place.

Miguel, do you have anything to add?

Miguel Ponce de Leon: Yeah, I absolutely agree when it comes to training the model. But what I'd love to really highlight to everybody listening is that these systems are becoming more open for everybody to get involved, for more developers to get involved. Again, I talked about all those Gs before, but they were very much closed to certain vendors and certain companies that built them.

We now have Open RAN, this open radio access network, which means that we can use more AI models, and people's training of those models within systems, and deploy them more readily. But it also means that we have to have newer understandings. Again, when we talk about wireless, we talk about power amplifiers.

We talk about controlling power amplifiers with AI models, and about how we're going to make sure that we train those systems in an energy-efficient way, before we even talk about turning those radio head ends on and off to save energy. More often than not you save it when you don't have so many people within an area, and therefore the radio doesn't need to be pumping out that signal the whole time.

Again, immediately you can see what the benefit is. But if we end up spending so much energy trying to train the model in the first place, then again, have we achieved our actual overall goal?
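The payback question Miguel raises can be sketched as a quick calculation. All the figures below (training energy, per-day savings, model lifetime) are hypothetical placeholders for illustration, not measurements from any real network:

```python
def training_breakeven_days(training_kwh, savings_kwh_per_day):
    """Days of deployment needed before the energy saved by the
    model's decisions pays back the energy spent training it."""
    if savings_kwh_per_day <= 0:
        raise ValueError("model must save energy once deployed")
    return training_kwh / savings_kwh_per_day

def is_worth_training(training_kwh, savings_kwh_per_day, lifetime_days):
    """True if the model earns back its training energy before its
    useful lifetime ends and it needs retraining."""
    return training_breakeven_days(training_kwh, savings_kwh_per_day) <= lifetime_days

# Hypothetical figures: 5 MWh to train, 40 kWh/day saved in the RAN,
# model stays accurate for 90 days before retraining is needed.
print(training_breakeven_days(5000, 40))  # 125.0 days to break even
print(is_worth_training(5000, 40, 90))    # False: retrained before payback
```

If retraining comes due before the break-even day, the "optimization" is a net energy loss, which is exactly the trap both guests are warning about.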

Colleen Josephson: For the listeners tuning in who might be less familiar with the structure of wireless networks, it might be worth giving a little bit of background on this power consumption. For telecommunications specifically, the biggest energy challenges are the radio access network or RAN, as Miguel was just talking about, and also data centers.

And the way that I like to think of it is that wireless communication is basically shouting energy into the void at high enough power so that the receiver can decode it. It's inherently consuming a lot of energy. So a lot of the challenge in this space is trying to look at how can we use various tactics, whether they're AI informed or not, to save power for this radio access network.

And also increasingly data centers.

Chris Adams: So there's one thing that you mentioned there, Colleen. So you spoke about most of the kind of areas where people are focusing on are either the data center or the radio access network, but not really the kind of pipes between data centers or the kind of backbone. Is that one thing there where there's already a fair degree of efficiency or where you don't see that much change at that part?

Colleen Josephson: I think the pipes between data centers, that's inclusive of data centers; I'm sorry if I was not clear about that. Yes, the inter-data-center communication is definitely a part of the end-to-end energy consumption of the system that we need to consider.

Chris Adams: Okay, but there's a chunk about radio, and there's one area which is relatively new to us, this radio access network, because we've seen so much growth in cellular and things like that, and it wasn't growing so fast before. This has been one thing where we're now looking to use tools like AI to make things work a little more efficiently than people manually switching things on and off. Maybe we could explore some of the levers you actually have here, because it's not clear to me why you might be using AI in the first place for this part specifically.

Miguel Ponce de Leon: I might say a little something about that, Chris, because again, what you're gonna find with radio, what you're finding today with 5G, and you'll find it in the future as well, is that there will be more aerials. They will have smaller power outputs, but there'll be more of them. And with more of them, it means that you have to network more of them.

And with networking that many nodes, you're going to have an optimization system in order to decide where to place them, when to place them, when to leave them on and when to actually turn them off. Because if you have fewer of them, what's happening is you're leaving the actual radio head ends and the amplifiers on constantly, sucking up all that energy and juice. Now we have more aerials, each with less coverage, but because you have more of them, essentially it's a more complex system than a couple of network engineers could monitor themselves, watching for issues as the aerials potentially break or have to be modified.

So it's a much wider range of, let's say, input variables that you actually have to cover off on.

Colleen Josephson: Just adding to your comment earlier, Chris, and Miguel's. He's absolutely right, we have a lot of input variables. But you were talking about data center network consumption and the backhaul, and I want to revisit this topic of the radio access network and wireless communication. It's gonna get a little bit down into the physics of it, but if you have a wire going from one end to the other, fiber optic, those have much less loss.

So you can get much more throughput for the same power consumption with a wired network compared to a wireless network. With wireless, you have all sorts of types of loss, and the channel conditions are changing drastically all the time. So this wireless aspect is really one of the things that makes the radio access part of the network a higher power consumer: you have this signal that you're sending out into the air, as opposed to a little cable with much more controlled conditions. You're gonna see something they call path loss when you're going from the transmitter to the receiver, and you're gonna see something called multipath, which is where multiple copies of the transmitted signal arrive at the receiver. These wireless networks have to be designed to overcome some of these challenges, and that's where a lot of the radio access network's extra power consumption comes from, if that makes sense.
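The contrast Colleen draws can be made concrete with the standard free-space path loss formula, a textbook simplification that ignores the multipath and changing channel conditions she mentions (real links lose more than this):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: how much a radio signal weakens
    between an isotropic transmitter and receiver with a clear line
    of sight. FSPL = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

# A 3.5 GHz 5G mid-band signal over 1 km already loses about 103 dB,
# while fiber attenuation is on the order of 0.2 dB per km. That gap
# is why "shouting into the void" costs so much more transmit power
# than pushing the same bits down a cable.
print(round(fspl_db(1000, 3.5e9), 1))
```

Doubling the distance adds roughly 6 dB of loss, so coverage planning (many small cells versus a few big ones) directly trades off against transmit power.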

Chris Adams: I think that does. So if I was to take a step back for folks who might be familiar with 3G and 4G: if that was a model where you have one or two very large transmitters or receivers, the shift to 5G, or possibly even 6G, is to many more, smaller ones.

And as a result, you have a kind of explosion of complexity. That's the thing you have to manage that you didn't have to manage before. And maybe the other thing I should ask you about, then, is this: it sounds really basic, but one of the reasons people talk about, say, 5G or 6G being potentially more efficient or greener is just that it's easier to turn parts of the system off, rather than having things blasting the entire time, 24/7. Is that one of the assumptions that you're looking at?

Miguel Ponce de Leon: That's exactly it, Chris. And just to explain the complexity as well, there are possibilities we're looking at, again with NextG and 6G systems, where there'll basically be a small aerial connected to your home, for example. But again, what you'll want to do is make sure that it's controlled efficiently.

That it's looked after optimally. So for us, all of that is creating this need to self-organize the network in some way; that's the terminology we use. There's a self-organization of where you put the frequencies on each one of the antennas, and things like that,

Chris Adams: Ah, okay.

Miguel Ponce de Leon: having that kind of control remotely can be interesting. It could be, for example, that your home aerial today is with operator number one. We won't name the operators, but keep in mind that in most countries we have two, three, four operators in a country. An operator could pay you for that aerial you've put on your home and could be optimizing it for your 6G signal.

Tomorrow, it could be operator B that's using your aerial in some way, shape or form, and they may have a different way of optimizing that same aerial for the signal it's producing and sending out there, and they pay for your time to use it. So again, that's why we see a much more open system, a kind of open ecosystem, of how telecommunications will actually be provided in the future.

Colleen Josephson: And adding to that, there are some really great opportunities here. I like to talk about the power of open systems and virtualization: software definition, rapid prototyping. This really gives us an opportunity that we haven't quite had yet in this space of traditional monolithic vendor stacks.

By opening it up to more members of the ecosystem, people can prototype these new clever applications, whether they be AI-based or otherwise, deploy them, get much quicker feedback on how much energy we're saving by trying this, and rapidly iterate on those sorts of developments.

Chris Adams: Ah, okay. I'm really glad you brought it back to the AI part here for the initial fact check. So basically, faced with all this complexity, there's been this assumption that if there's something complex, we're just gonna throw AI at it, like Google did when they bought DeepMind and then threw AI at their own data centers to twiddle the knobs instead of having humans do it.

And it turns out that maybe that isn't the best way, because there's a significant impact from the training in the first place, and that may be larger than the inference in this case, for example. That's where the complexity lies in some of this, by the sounds of it.

Colleen Josephson: Yeah, definitely.

Chris Adams: I think that helps break down some of this actually.

The idea being that, yeah, there is an inference part that we need to be thinking about and a training part that we need to be aware of. And if you spend your entire energy budget on the training part, then you need a huge amount of use to make up for the saving that you might actually get. And that's something we're not quite sure whether we'd actually see here.

And you also mentioned something quite interesting, Colleen, about a kind of decomposition of what might have been a quite monolithic stack into a larger number of small moving components. So rather than just having one vertically integrated system, there might be a number of different players involved, or some of their work might be done in a cloud somewhere else, or something like that?

Miguel Ponce de Leon: Yeah, so I will have to say a little something about how the networks are changing towards cloud native, right? Communication service providers, or telcos, as we might call them, are moving towards technologies like containerization, and the software within those containers is actually developed by different entities, by different developers, by different companies.

And the integration of them all still provides you with an end-to-end telecommunication system. What I think is really interesting here, especially for the Green Software Foundation, is the carbon impact of the software that's developed, and how that's tested in a CI/CD, a continuous integration, continuous deployment, environment, to see the impact of the overall delivery of these individual players who have been plugged together to provide a communication system as we go towards NextG. And this is something a lot of research is going into. Certainly communication service providers in Europe are saying, okay, I can get vendor A, B, and C, I'll plug them together, I'll put them in a containerized orchestration environment.

But they're also asking questions not just about performance, not just about security, but about the sustainability and energy impact. And if vendor B's software is not written in an energy-efficient way, and is not secure, well, I'm afraid vendor B will have to pop out, and they look at how you pop an alternative in.

And again, all the factors around performance, security, and sustainability are important factors in these products actually going online. So that's the type of research we're seeing happening at this moment in time: how do you do that? How do you measure the baseline around that, especially in a cloud native world?

How do you get the baseline, and then how do you take action? Because now that it's so pluggable and playable, like in the example I gave, I can put in A, B, and C, but then I can take out B and put a replacement in. We really have to be cognizant of the delivery of that service too.

Chris Adams: Okay, so it sounds like you're implying that, for want of a better term, as long as you are honoring a particular contract of an API, the idea would be that if you have a stack of technology, here's a chance to swap out one part of your stack and replace it with a greener part.

And hopefully that will result in a kind of more diverse, healthy ecosystem that you'd be working with here, for example, where there's a chance vendors would compete on transparency and compete on sustainability, instead of just on performance and cost. Because in many cases, that's the world we're living in now, right?

Miguel Ponce de Leon: Big time. Big time. And I do know of operators that are now looking at whether they can provide their clients with an energy-sustainable service. They'll allow their customers to actually choose: so you can choose your service and it has A, B, and C, or you can choose your service with A, Z, and C. One is more green.

One is using energy in a more efficient way, and if that's what your company decides to do, then they can offer it in that way. So that's also what I see from a research perspective: companies and entities around the communication space looking to address this. But I know Colleen has some examples too around this, right?

Colleen Josephson: Yeah, so bringing back the data center thread, one thing that I thought was really interesting in some of the work that VMware has contributed to is that we saw the data center portion of a network's power consumption double between 4G and 5G, and we're expecting that trend to continue and become even more pronounced going into 6G.

So that's why we say that the two big things to think about are the radio access network and also the data centers. And that brings to mind one of our data center success stories where it comes to AI and energy savings. One big source of emissions is power that's drawn for data center cooling in particular.

And to tackle this, we've actually partnered with Intel and a company called Clark Data on a solution called Deep Cooling. It uses big data and AI to model various physical parameters in large data centers, things like power load, heating, and required cooling, and it uses insights gained from this modeling to predict the results of changing computing workloads, then automatically adjusts the equipment parameters to optimize the system cooling. It's implemented right now in several large Chinese data centers, and it's been effective at helping customers significantly improve power usage effectiveness and reduce carbon emissions. I think the figures I have here are savings of 18 to 25% of the electricity used for cooling.

This is one example of kind of an AI success story. But again, you always have to think about when I go to train this model, when I go to use it, what is the story going to be when we consider end to end, not just the immediate usage.

Chris Adams: So this maybe might be a chance to talk about some of the metrics you might use with this, then, because one thing we spoke about in a previous episode was that there are various researchers talking about AI and the idea that you might track the energy embedded into a model.

Just the way we talk about embedded energy in, say, building physical hardware, there might be an idea of energy embedded into a model before you actually use it, for example, as a way of listing this stuff. And when people talk about that, they talk about the energy usage, but they also talk about the carbon impact of that part as well.

And this kind of speaks to the idea that there's maybe another lever, not just energy itself. Is that maybe something you might want to talk a little bit more about: the fact that it's not just energy, it's the kind of energy, how green or dirty the energy might be, and what levers you actually have there to affect that?

Colleen Josephson: Yeah, that sounds like it's getting a bit into some of the green load balancing or carbon aware workload migration that we talked about last time, and I'm pleased to share that our work on that has been progressing. It's still very much in the research phase. There's not much new that we can publicly share yet.

What I can say is that the calculations from the Mobile World Congress work we did a year ago, which found that you can have carbon emission savings of up to 50% by more intelligently placing your workload depending on where the municipal power is greener or less green, appear to match our prototype results.

So we're preparing to submit some research writings on this work, so stay tuned; hopefully much more will be publicly available soon. And we also have some exciting collaborations on this front, looking at how carbon-aware load balancing interacts with energy grids, and making sure that data centers that begin to implement these novel solutions remain good energy citizens and don't unintentionally negatively impact our energy systems.
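The placement idea Colleen describes can be sketched as below. The region names and gram-per-kWh intensities are invented for illustration (they are not VMware's figures), and a real scheduler would also weigh latency, data residency, and the "good energy citizen" grid-impact concerns she mentions:

```python
def pick_greenest_region(regions):
    """Choose the deployment region whose grid currently has the
    lowest carbon intensity (grams CO2e per kWh)."""
    return min(regions, key=regions.get)

def carbon_saving_pct(regions, baseline_region):
    """Emission saving from moving a fixed-energy workload out of
    baseline_region into the greenest available region."""
    baseline = regions[baseline_region]
    best = regions[pick_greenest_region(regions)]
    return 100 * (baseline - best) / baseline

# Hypothetical snapshot of municipal grid intensities, gCO2e/kWh:
grid = {"region-a": 520, "region-b": 410, "region-c": 240}
print(pick_greenest_region(grid))                      # region-c
print(round(carbon_saving_pct(grid, "region-a"), 1))   # 53.8
```

Even this toy snapshot lands in the same ballpark as the up-to-50% savings figure: the gap between the dirtiest and cleanest grid at a given moment is the whole lever.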

Chris Adams: So this sounds like we might be thinking about AI in a few other places then, because there's a phrase that I haven't heard people use that often: being a good energy citizen as a data center. Could you maybe explain that a bit more? Because most of us know that data centers use energy, but there are more qualities to the energy than that, and you might not know about the density of demand or load, for example, that might be worth explaining.

Miguel Ponce de Leon: So one of the things I can mention is that we are working with a grid utility in Ireland, and that grid utility also hosts a data center, as it so happens. We're also working with an accelerator program, a program that is helping startups to look at how you can not just link, but actually be able to take the correct measurements from the green sources, the wind farm locations, and the usage within the data center for its workloads. So again, here it's about leveraging not just the research, we'll say, that would come from research-performing organizations or from the Office of the CTO at VMware, but also looking at startups within the space, and linking this. And that is helping the utilities understand what type of usage they have.

And imagine it's a utility that has their own data center. So it's helping them be a good citizen, even within their own environment, being able to measure it and then being able to take action on it, right? Because that's the important thing: okay, you've got your baseline, but what can I change about what I'm delivering within that data center, even down to the containers?

How can I move my clusters and pods over, and maybe consolidate some of the pods? We're even extending some of that research to look at, even with the pods that are available, how many of the CPUs they're using within the cluster. So again, it's about being able to help data center owners be good citizens around that space.

Chris Adams: Okay, so there's one thing that came out of that. We spoke before about how, with 3G and 4G, you might have basically a series of very large antennas blasting stuff out all the time, but now you'd be shifting to a kind of constellation of smaller antennas, which you might have to spend some time and energy coordinating, to figure out which ones to turn off, so you can keep things working quickly but also more energy efficient.

It sounds like there's something like that at the data center level as well: where we might have had data centers running a steady 24/7 load, there's actually scope to scale it down or up a bit. Is that what you're proposing?

Miguel Ponce de Leon: That's exactly it, Chris. You're painting a great picture here of the interconnectivity of it all. But yes, because, again, as Colleen was saying earlier, we have two main parts to the network: there's the radio side and then there's the core side. What we're doing here as part of the VMware team, in collaboration with a number of other companies, is attacking it from both sides.

And again, looking at how you can really address that end-to-end element of actually delivering those potential energy savings, in order to reach some of the goals. As Colleen was saying earlier, the telecommunications world is really looking at reducing energy use by 2030, and even beyond then, by a number of factors from where it is today.

You need to look at all facets of how that's delivered. So yeah, that's what you hear in what we talk about when we're looking at the startups: we're looking at how to link the wind farm energy to the actual data center energy that's used.

Chris Adams: Okay, so I can see why people might just say, this is so complicated and there are so many moving parts. I don't wanna think about it. I'm just gonna let AI think about this. And that's why there's this assumption that, yeah, that's gonna be doing the optimizing, but there's an impact in its own right to do that.

And there may be other ways of doing this. Maybe we could talk a little bit about some of the projects out in the open that people might point to, that let people start playing around with some of this stuff and experimenting. Because, Miguel, you mentioned Open RAN, so my assumption is there's an open standard or some open source projects that people might play with, and I know we've spoken in previous episodes about some software and tools on the data center side. Miguel, on the Open RAN side of things, what projects on GitHub or GitLab might you point people to if they were interested in this kind of very dynamic new radio and data network world?

Miguel Ponce de Leon: Sure, and I'm sure we can give some links in the show notes as well; that's always the easiest thing, right? But again, to help control the radio access network, there's a thing called the RIC, that's the short name for it: the RAN Intelligent Controller. And there is an open source version of this.

And the RIC uses a thing called Kubeflow. This is a way of being able to host your machine learning model so that it can interact with the radio network. So there are a couple of open instances there: once you have a Kubernetes cluster and the open source RIC from the ONF, the Open Networking Foundation, you can develop, again with some in-house terminology here, what's called the xApp and the rApp.

So the xApp is the near-real-time application that can immediately, basically, turn the radio head ends on and off to help with that energy saving. Or we have the rApp, the non-real-time application, where you can spend a little bit more time considering, given the complexity of the number of aerials that are out there, how you'll deploy and which ones you would turn on and off.

And that's somewhere, again, a number of easy programs written in Python that if you wanted to get up and running and in doing so, you could have an impact on a future well known operators network in your area. Because the whole system is becoming far more open and the app that's developed on the open source projects I've just mentioned, you could then put them on things like the VMware RIC.

We offer one that's very much telco grade and gets deployed in the network, but the AI app that you've developed, the model that you've developed, can be deployed in the same way. You don't have to wrap it up much differently to do so. There are some relatively easy touch points to get involved here.
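As a rough illustration of the kind of decision an energy-saving xApp might make, here is a minimal sketch in Python. To be clear, none of these names or thresholds come from the O-RAN SC or ONF codebases; they're hypothetical stand-ins for the real RIC APIs Miguel mentions.

```python
# Hypothetical sketch of energy-saving xApp logic: sleep lightly loaded
# cells whose traffic a neighbouring cell can absorb. All names and
# thresholds are illustrative, not taken from any real RIC project.
from dataclasses import dataclass, field

@dataclass
class Cell:
    cell_id: str
    load: float                       # current utilisation, 0.0-1.0
    neighbours: list = field(default_factory=list)  # cells that can absorb traffic

def cells_to_sleep(cells, sleep_threshold=0.15, capacity_threshold=0.7):
    """Pick lightly loaded cells whose traffic fits on a neighbour."""
    by_id = {c.cell_id: c for c in cells}
    sleeping = []
    for cell in cells:
        if cell.load >= sleep_threshold:
            continue                  # cell is busy enough to stay awake
        for n_id in cell.neighbours:
            neighbour = by_id.get(n_id)
            if neighbour and neighbour.load + cell.load <= capacity_threshold:
                neighbour.load += cell.load   # hand over the traffic
                sleeping.append(cell.cell_id)
                break
    return sleeping

cells = [
    Cell("cell-a", load=0.05, neighbours=["cell-b"]),
    Cell("cell-b", load=0.40, neighbours=["cell-a"]),
    Cell("cell-c", load=0.60, neighbours=["cell-b"]),
]
print(cells_to_sleep(cells))  # → ['cell-a']: its traffic fits on cell-b
```

A real xApp would of course be driven by live telemetry and RIC control messages rather than a static list, but the core trade-off, consolidating traffic so head ends can power down, is the same.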

Chris Adams: Okay. And Colleen, we spoke before about junkyard data centers, and I think last time you came on you were talking about some of the research that you folks are doing at VMware to start tracking and measuring the carbon savings. So are there any kind of data center projects or orchestration projects you might point people to?

Because I think we've spoken about things like ecovisors and stuff before, but I'd love to know what else is going on here, actually.

Colleen Josephson: Yeah, I share some overlap with what Miguel was talking about, to be honest, and I think containerization and Kepler are very important. So Kepler is this energy monitoring tool for Kubernetes containers, and you can hook it into some visualization systems. That's one kind of open source project for monitoring data center energy consumption that I'm aware of.

Very important in that area. Not AI specific, but those two areas are a frequent topic of conversation among people who work in cloud and data centers.

Chris Adams: Yes, Kepler, the Kubernetes-based Efficient Power Level Exporter. I found the link for it, actually; that's what it stands for. There's an ongoing conversation in one of the Green Software Foundation repos, I can't remember which one specifically, where Adrian Cockcroft has been mentioning this idea of Kepler as one of the mechanisms to allow kind of minute-by-minute reporting at a cloud level, so that you can actually get some of the numbers to optimize for carbon or optimize for energy usage.

Cuz this is one thing that you don't always have from all your providers, and that's one of the underlying pieces of technology used to expose these kinds of resource usage figures for each of these pods or clusters of computing and things like that. So we've got about five or six minutes left, and I did want to leave some space to talk about some of the wackier stuff that we didn't get a chance to talk about last time.
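As an aside, the arithmetic behind "optimize for carbon" from energy figures like the ones Kepler exposes is straightforward. Here's a minimal sketch; the pod names, joule values, and grid intensity are all made up for illustration, not real Kepler output.

```python
# Minimal sketch: turning per-pod energy readings (the kind of per-container
# joule counters Kepler exports) into a carbon figure. The sample data and
# the grid-intensity value below are illustrative, not real measurements.
def carbon_grams(joules: float, grid_intensity_g_per_kwh: float) -> float:
    """Convert energy in joules to grams of CO2e at a given grid intensity."""
    kwh = joules / 3_600_000          # 1 kWh = 3.6 million joules
    return kwh * grid_intensity_g_per_kwh

# Pretend these are one minute's worth of joule deltas per pod.
pod_energy_joules = {"web-frontend": 5_400.0, "batch-worker": 21_600.0}
grid_intensity = 350.0                # gCO2e/kWh; varies by region and hour

for pod, joules in pod_energy_joules.items():
    print(pod, round(carbon_grams(joules, grid_intensity), 3), "gCO2e")
```

The interesting part in practice is not the formula but feeding it a grid intensity that changes minute by minute, which is exactly why per-pod, per-minute energy reporting matters.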

Towards the end of the last episode, we spoke a little bit about things like junkyard data centers, which were like data centers made of various end-of-life computers and things like that. And Colleen, you mentioned different kinds of either zero-power or low-power things, like soil-powered batteries and stuff like that.

And seeing as we've gone into all this kind of industrial-level stuff, I figured it might be interesting to look at the other end of the scale, like the really ultra-low-power stuff. Is this what you're going to study, or is this some of the work you've done before? Because I think it's going to be interesting to some of the crowd here, realizing that things happen at the bottom end of the scale as well.

Colleen Josephson: Yeah, yeah. This is the bridging of the two worlds. The data center and these big monolithic, or not so monolithic anymore, systems are really important to consider, but we use these telecommunication systems to hook into much smaller devices: tablets, smartphones, and ever increasingly IoT. And what is really interesting about these IoT and smartphone, smaller user devices, is they are special because they've been designed to be power efficient.

So the carbon footprint for them is significantly larger in the manufacturing phase compared to the device use phase. They have a much higher embodied carbon footprint, proportionally, than the energy they consume in use. There's some really interesting work going on here on how we can lower the embodied carbon footprint of some of these miniature systems that we anticipate being massively deployed.

And one of the out-of-the-box things that I've worked on is batteries. So we have some ultra-low-power communication devices that we can begin to use to do something called simultaneous sensing and communication. And one of the bigger footprint aspects of some of these systems, traditionally speaking, is the batteries.

So if you can minimize or eliminate the need to have a battery, then you can significantly reduce the embodied carbon footprint. So one of the things that I've looked at is: can we actually harvest energy from the soil itself? This is really early-stage research that we're starting to look at at UC Santa Cruz, and it hooks into something called intermittent computing, if you've ever heard of it.

And it's this idea of computing systems that don't constantly have power available, and the paradigms the system operates under. When we design data centers, we assume that power is always going to be available, or we did. So now, if we have to be much more dynamic and on our feet about when power is available, we have to be able to very rapidly save progress,

go into power save mode, and then rapidly spin back up again when power is available. So the intermittent computing community has been really active at connecting the ultra-low-power, ultra-far edge and hooking it into our core networking and traditional communication systems. I can add, if we're looking for off-the-wall ideas:

HotCarbon. The inaugural HotCarbon workshop was last year, and I'm pleased to share that this year there will be a second iteration of the workshop. One exciting development is that it's tentatively going to be sponsored by ACM SIGENERGY this year, in addition to VMware's continued support. I'm actually working as the publication chair for that workshop.

The submission deadline was yesterday, May 21st. The workshop itself will be on July 9th, so just a shout out to those of you listening: tune in for what's sure to be very interesting, cutting-edge work in the sustainable software space. It's going to be a hybrid workshop. The physical location is in Boston, Massachusetts, but if you go to the website, hotcarbon.org, a registration link should be up very soon so that you can sign up to attend virtually, or in person if you happen to be in the Boston area.

Chris Adams: Cool. Thank you for sharing that, Colleen. For anyone who is on the fence: I virtually attended HotCarbon last year and I basically plundered that list of people for guests for this podcast, cause there's loads of really interesting projects going on there. Colleen, I just wanna ask about this idea of intermittent computing, cuz it sounds like it's almost super serverless, almost battery-less, basically: the idea that things scale right down to zero and you just basically don't really work until you've got energy coming back in again.

Is that the idea behind it?

Colleen Josephson: I think there are definite connections to battery-less. An intermittent computing paradigm can work whether there's a battery or not: trying to work around how charged that battery is, and turning off when power is running low. But yeah, it's a very prominent area of work when you consider battery-less computing. And just to connect everything end to end:

You might have these very low power, sub-microwatt in some cases, devices at the ultra-far edge, but you need to have something that brings that data back to the cloud, and this is where you have more traditional edge computing, like maybe a server that's at a farm. And some things that people are starting to think of are edge data centers that are potentially even mounted to drones, or edge 5G that's mounted to drones.

So lots of really near edge and far edge paradigms.
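The checkpoint-and-resume pattern Colleen describes can be sketched, very loosely, like this. The "power trace" here is just a scripted list of power-available flags standing in for a real energy harvester, and all the names are illustrative.

```python
# Illustrative sketch of the intermittent-computing pattern: persist
# progress after every unit of work so the device can go dark at any
# moment and resume from the checkpoint when energy returns. The power
# trace is a scripted stand-in for a real harvester, not real hardware.
def run_intermittently(work_items, power_trace):
    checkpoint = 0          # index of the next unprocessed item (persisted)
    results = []
    for power_on in power_trace:
        if not power_on:
            continue        # device is dark; the checkpoint survives
        if checkpoint < len(work_items):
            results.append(work_items[checkpoint] * 2)  # one unit of work
            checkpoint += 1 # save progress before power can vanish again
    return results

trace = [True, True, False, False, True, True, True]
print(run_intermittently([1, 2, 3, 4], trace))  # → [2, 4, 6, 8]
```

Real intermittent systems have to persist the checkpoint to non-volatile memory and make each step atomic, which is where most of the research effort goes; this sketch only shows the control flow.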

Chris Adams: Okay, so it sounds like as we have moved from monolithic, gigantic computers to things becoming smaller and more distributed, there is a coordination cost, which is why people often say "this is so complicated, I'm just gonna hope that AI solves it for me," which is where some of these ideas came from: that of course you can use AI to automatically work out and reduce the environmental impact of computing.

I hope that one thing we're taking away from this conversation is that, no, it's a bit more complicated than that, but there are lots of exciting rabbit holes to dive down. Folks, I've really enjoyed this conversation, and I think we're coming up to the end of the time that we have. So in the last few minutes, might I ask: are there any projects or things you might point people to that we haven't spoken about, that you'd like to give a shout out to before we wind up?

Miguel Ponce de Leon: Okay, because I'm based in Europe, right? There is a coalition working together with a number of working groups, really looking at all the things that we just talked about, but with more specifics, cuz I know you wanna use the word fact-checking around what we're trying to achieve here.

So there are a number of working groups in Europe where companies are coming together and really looking to do this now. One of them is the European Green Digital Coalition, the EGDC; I'll send you on a link for it as well. But this is a space we're having to look at, cuz at the end of the day, we will standardize around the way you're going to measure these changes.

The way it's gonna have an impact on business. When you do offer this fantastic green telecommunication service that you're saying is energy aware, there's gonna have to be standardization around whether or not that is actually the case. And so there is a good bit of work in and around this.

Again, I almost hesitate to use the word research, but it's happening. And the thing is, we're right in this maelstrom, this tornado of activity that's just got underway, and seeing how the pieces fit together, it's not a perfect fit, I would say. I couldn't give you "this is exactly the time horizon and this is how it's gonna happen," but I can tell, with the level of funding from governmental agencies, from companies themselves, from research institutes, to lots of public bodies and developers in their own time: it's a great time to be in and around this space of developing software, specifically for the delivery of green technologies as we see it.

For me, that would be the big pointer, and I'm hoping that one or two of the topics that we've mentioned here would give any developer an opportunity to, like you were saying earlier, Chris, have a look at a GitHub project, have a look at developing a small model, some code, and have somewhere to actually apply it where it will have an effect on your own services in the future.

So certainly that's what I'm excited about, and why I'm working hand in hand with Colleen on this particular topic.

Chris Adams: Okay, thank you, Miguel. And Colleen, just as we wind up, what would you draw people's attention to, apart from HotCarbon, of course, which is freaking awesome?

Colleen Josephson: Yeah, I've got two things. For those of you listening who might be interested in doing a bit of a deeper dive on telco sustainability, I just shared a link to our VMware telco sustainability white paper. That goes into more detail on some of the challenges of the radio access network, the RAN, and the data center, and then also coming back up the stack to this idea of data centers being good energy citizens.

I want to name our collaborator Andrew Chien at the University of Chicago. He's been active with us in this area and in some of these collaborations, and this paper, "Evaluating coupling models for cloud datacenters and power grids," is really great for showing some of the ways data centers can disrupt the grid.

So I encourage people to go check that out if they're interested in that topic as well.

Chris Adams: Cool, folks. We began talking a little bit about a fact check, and I think we've realized that you can't automatically assume that AI will automatically reduce the environmental impact of everything, and that there's quite a lot to it. But I've really enjoyed diving down all these rabbit holes with you folks, and yeah, thank you so much.

We'd love to have you folks come on again. Yeah, folks, have a lovely morning or afternoon wherever you are in the world, and yeah, see you around. Take care, folks. Ta-ra.

Colleen Josephson: Thanks again, Chris.

Miguel Ponce de Leon: Thank you, Chris.

Chris Adams: Hey everyone, thanks for listening. Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, Google Podcasts, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show, and of course, we'd love to have more listeners.

To find out more about the Green Software Foundation, please visit greensoftware.foundation. That's greensoftware.foundation in any browser. Thanks again, and see you in the next episode.