Environment Variables
The Week in Green Software: Data Centers in Space
May 3, 2023
Host Chris Adams is joined by Anne Currie on this episode of The Week in Green Software. They discuss the potential for data centers in space and how the use of potential death rays might be the way forward in powering these! Not only this, but sweeping changes in Reporting Law, and making Kubernetes clusters into Low Carbonetes clusters are covered too. Anne has a special report on her upcoming book and Chris finds his own variation of Boaty McBoatface!

Learn more about our people:

Find out more about the GSF:



If you enjoyed this episode then please either:

Transcript below:
Anne Currie: Data centers in Greenland are an obvious thing because there's tons of free energy, green energy from ice melt runoff.

Chris Adams: Yeah.

Anne Currie: But one of the issues there is that nobody lives in Greenland to man the data center, and even fewer people live in space to man the data centers. So I would hope that you would solve the Greenland issue first.

Use all that enormous amount of energy before you,

Chris Adams: Okay, so before we reach for the sky, let's sort out things down here on Earth. Yeah?

Hello, and welcome to Environment Variables, brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show, you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software.

I'm your host, Chris Adams.

Hello, and welcome to another episode of The Week in Green Software, where we bring you the latest news and updates from the world of sustainable software development. I'm your host, Chris Adams. This week we'll be talking about data centers in space, sweeping changes in climate reporting laws affecting the digital sector, making Kubernetes clusters into low Carbonetes clusters, and a set of interesting-looking upcoming events. But before we dive in, let me introduce my esteemed guest for this episode of This Week in Green Software. This week we have Anne Currie. Hi Anne.

Anne Currie: Hello.

Chris Adams: Anne, would you just introduce yourself?

Anne Currie: My name is Anne Currie and I am currently one of the community co-chairs for the Green Software Foundation. I'm a software engineer; I was a software engineer for many years, 30 years I've been in the industry. And so I date from the time in the nineties when we used to build software using the same kind of techniques that we are thinking about using today for green software, because machines were very weak then, and we had to handle that. So that's my perspective on this.

Chris Adams: Cool. Thank you, Anne. So if you are new to this podcast, my name is Chris Adams. I am the executive director of the Green Web Foundation, a nonprofit focused on us reaching an entirely fossil-free internet by 2030. And I'm also the chair of the policy working group in the Green Software Foundation.

Each week we do a run-through of stories that caught our eyes, or that might be fun to discuss. And for everything we do discuss, we share all the links that we can find, for you to dive down your own little Wikipedia-style rabbit holes after this session. Alright, so Anne, should we look at the first story that came up here?

It's data centers move into space to mitigate power consumption and pollution. So this is a story from El País, a Spanish newspaper. It's about this program called Ascend, which is the Advanced Space Cloud for European Net zero emissions and Data sovereignty program.

Anne Currie: That's not, that's not a little bit contrived at all.

Chris Adams: Yeah, exactly. And the European Union has selected Thales Alenia Space, a joint venture between the Thales Group in France, or Thales, possibly, I'm not quite sure if I'm pronouncing that correctly, and the Italian defense conglomerate Leonardo. And the plan is to see if you can create space data centers here, and I think the plan is to try to address some of the energy issues related to data centers on the ground. I found the initial press release for this, but Anne, as a science fiction writer, I figured you might have some thoughts here before we dive into this a little bit more, actually.

Anne Currie: Yes I do, and I read the piece, and it is an interesting piece from my perspective. I am also a science fiction writer, and I have written a series called the Panopticon series, and three of them are set in space and address technology in space. The technology you mentioned is all very Arthur C. Clarke. And interestingly, cuz Arthur C. Clarke was a physicist, when he did, for example, his lunar-based novels, he put realistic technology in them, and he often had patents. So he had the first patent on an electromagnetic cannon in space, which uses electromagnetic fields to fire stuff around, as an idea for a delivery mechanism for getting stuff from the surface of the moon into orbit.

Chris Adams: Whoa, whoa. Did you say that Arthur C. Clarke patented the rail gun? Is that right?

Anne Currie: did? Yes,

Chris Adams: Oh my God, my mind is blown. Go on, please do go ahead. I'm never gonna think of Quake the same way; that's changed how I think of Quake.

Anne Currie: Thinking about it, I don't think he was thinking about it in the form of a rail gun, but patents are often quite specific. It might be specifically for electromagnetic cannons for the delivery of stuff from the surface of the moon to lunar orbit. But anyway, he did have the patent on that, which has now expired, cuz it was quite a long time ago.

But anyway, all that stuff does work. And in fact, there are loads of interesting things you can do with rail guns in space, as a transport mechanism or a power transport mechanism as well. But that's an aside. In terms of data centers in space, obviously you've got a lot of power that can potentially be generated in space from solar, because you've got nothing in the way between you and the sun, and the panels can be in a hundred percent light.

But it is a very interesting idea. And along those lines, China and India have been cooperating for a while, I dunno if they still are, but they were cooperating a few years ago, on a space-based solar power system. So space-based solar power is the idea that you have a giant laser in space: you capture the energy from solar panels and you beam it down

Chris Adams: down to a panel on the ground, right?

Anne Currie: Yeah. Which has some kind of giant death ray connotations around it. It depends how you do it, whether you use light lasers or microwaves to get it onto the ground. But it is perfectly doable, and I think that's a very plausible idea for getting power: using solar in space to get power down to

Chris Adams: Okay.

Anne Currie: re-usable space.

But yes, so the idea of data centers in space is that you build them out there in orbit somewhere, probably quite a long way out, cause they don't necessarily need to be in near-Earth orbit, and that's quite busy. And there you could just be powering them directly. The difficulty with that is always that it's gonna be very hard to maintain that data center. But it did remind me slightly of a story that came up a few years ago, and it's definitely true, which is that Azure have been experimenting with undersea data centers, effectively the size of

Chris Adams: Yeah. Around the Orkneys, underwater. Yeah.

Anne Currie: Yeah, and those have similar issues, in that you put them in and then you can't maintain them; that's it, they're there. So the idea of having a self-contained, smallish data center that no one can subsequently touch is not a new one. So it's not utterly, utterly impossible. And of course, SpaceX has got the cost of getting stuff into orbit down quite low.

Yeah. So it's not impossible. It's

Chris Adams: It doesn't seem impossible. But I think I struggle with some of the numbers on this. Because in this press release, we see something saying, okay, we want to install data centers in orbit, powered by solar plants generating several hundred megawatts of power. Now, several hundred megawatts is a very large data center; hyperscale data centers are typically between 20 and 50 megawatts of power.

So you're looking at something like that. And then, also, let's just look at, say, the International Space Station. They've got solar arrays; right now they have maybe 120 kilowatts of power coming through, but they're old, and that's two and a half thousand square meters of panels.

So more or less, you're looking at maybe 2,000 square meters per hundred kilowatts. That means for a single megawatt of power, you're looking at 20,000 square meters, and if you're looking for hundreds of megawatts, that's gonna be 20,000 square meters multiplied by hundreds. That's a lot of solar to have in the sky.

This is the thing I was struggling to get my head around. Things might have got more efficient in the last, say, 20 to 30 years, but surely that is gonna be a heavy thing to get into the sky under any circumstances, will it not?
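
Chris's back-of-envelope sums can be sketched out like this. The ISS figures are the ones he cites above; the 300 MW target is just an illustrative stand-in for "several hundred megawatts":

```python
# Back-of-envelope solar array sizing, using the ISS figures cited above.
ISS_POWER_KW = 120    # the ISS arrays deliver roughly 120 kW
ISS_AREA_M2 = 2_500   # from roughly 2,500 square meters of panels

area_per_mw_m2 = ISS_AREA_M2 / ISS_POWER_KW * 1_000  # ~20,800 m2 per megawatt

# "Several hundred megawatts" -- 300 MW is an illustrative stand-in.
target_mw = 300
total_area_km2 = area_per_mw_m2 * target_mw / 1e6

print(f"~{area_per_mw_m2:,.0f} m2/MW, so ~{total_area_km2:.1f} km2 for {target_mw} MW")
```

That works out to roughly six square kilometers of panels at ISS-era efficiency, which is why the launch mass question matters so much.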

Anne Currie: Presumably that is a massively heavy thing to get into the sky, but launch has really come down in price a lot. And of course it doesn't have to be particularly co-located with the data center, cuz you can use those space-based solar power death rays, the subject of my last science fiction novel, which was called Death Ray.

So you don't have to be right by the thing; you could have those arrays literally millions of miles away in space and beam it back. You do get divergence on the beams if they're too far, but you can keep relaying them.

Chris Adams: Okay, so we could have our data centers in some orbit, and then the solar panels further out, so they're far away from there. Okay, so that's one thing. Then you mentioned that there are different kinds of orbits, right? So as I understand it, there's low earth orbit, LEO, which is Starlink, and that's maybe up to 2,000 kilometers above the ground.

And then would that mean you're hidden from the sun so that it's dark for your satellite sometimes?

Anne Currie: Yes, I think it does. I think you have to be a reasonable distance out.

Chris Adams: Yeah. So geostationary, I think, is maybe a bit further out, where it looks like you're not moving because you're that much further out. Right?

Anne Currie: And you've asked me a question I dunno the answer to here. I do not know how far you have to go out to be constantly in the sun, but to be honest, it's just less busy further out. So if you can be further out, there are loads of reasons why you might prefer to do that. And then, yeah, it's just a matter of beaming the power back.

Chris Adams: Now I'm with you on this, and then this feels like latency's gonna come up at some point, right? Because I'm curious, cause in LEO, like low earth orbit, up to 2,000 kilometers, we already use CDNs to have things close. So if it's 2,000 kilometers, that's one thing, but I think geostationary is something like either 20,000 kilometers or 40,000 kilometers into the sky.

So, I don't know, if the speed of light is what, 186,000 miles per second, that's gonna be a significant chunk of latency no matter what you do. And that's even if it's just you going straight up and down; if you're going around the world, that's gonna be even harder, surely.

Anne Currie: Yeah, low orbit latency isn't too much of an issue. It depends, but the further you go out, the more there is. If you had your data center on the moon, latency is about a second each way? No, it's more than a second each way, so the total round-trip latency is a couple of seconds, which obviously wouldn't make for a very good podcast or a Zoom call. But it depends on your use.

And it just depends whether the latency is an issue or not, because sometimes bandwidth is more of an issue than latency, so.
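
The propagation delays being discussed work out roughly as follows. Distances are approximate, and this counts only speed-of-light travel time, not routing or processing:

```python
C_KM_S = 299_792  # speed of light in vacuum, km/s

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds: request up, response back."""
    return 2 * distance_km / C_KM_S * 1_000

for name, km in [("LEO, Starlink-ish (~550 km)", 550),
                 ("LEO upper bound (~2,000 km)", 2_000),
                 ("Geostationary (~35,786 km)", 35_786),
                 ("Moon (~384,400 km)", 384_400)]:
    print(f"{name}: {round_trip_ms(km):,.0f} ms")
```

LEO adds only a few milliseconds, geostationary roughly a quarter of a second, and the moon over two and a half seconds round trip.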

Chris Adams: Oh, okay. Yeah.

Anne Currie: Yeah. It all depends what you're doing with it, and where it's going and how much. I mean, I would guess that the whole point, for a lot of the things they're talking about in that article, was data that they gathered in space being analyzed in space, using a big array of CPUs, and then beamed back in a more compressed form to earth.

In that case, latency is not an issue in any way, but if you wanted to move all data centers into space, then latency would be a giant issue, as you say, for CDNs and stuff on the edge.

Chris Adams: Yeah. And the final thing, then we'll stop on the space part, cause there's other things we're gonna talk about. But the final thing I scratched my head about: last week we spoke all about using different kinds of ways to keep computers cool, right? Now when you're in space, one of the arguments seems to be that because it's so cold anyway, you don't need to worry about cooling.

I don't think that's how physics works, as I understand it. There are three ways to cool things down: radiation, convection, and conduction. And I'm not familiar with that many cool breezes in space, so I can't rely on convection. Maybe conduction, not very much. So that just leaves radiation.

And those pictures of the space shuttle with its doors open: it's open to radiate out heat, because it's got so much heat still. So I feel like if you've got this issue where data centers generate lots of heat and there's no way to get rid of it, this feels like a problem that people haven't really taken on board yet, and I don't see how it's gonna be solved by putting things into space.

And Anne, I'm struggling with this; maybe you've got some pointers, or maybe it does just sound bonkers.
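
The radiation-only constraint Chris describes can be put into rough numbers with the Stefan-Boltzmann law. The emissivity and radiator temperature below are illustrative assumptions, and the model is idealized, ignoring sunlight falling on the radiator:

```python
# Radiating waste heat in a vacuum: Stefan-Boltzmann law, P = e * sigma * A * T^4.
# Idealized sketch: assumes a cold background and no sunlight on the radiator.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 * K^4)
EMISSIVITY = 0.9  # assumed emissivity of the radiator surface

def radiator_area_m2(heat_watts: float, temp_kelvin: float) -> float:
    """Radiator area needed to shed heat_watts at the given surface temperature."""
    return heat_watts / (EMISSIVITY * SIGMA * temp_kelvin**4)

# Shedding 1 MW of waste heat with radiators at 300 K (about 27 degrees C):
print(f"{radiator_area_m2(1e6, 300):,.0f} m2 of radiator per MW")
```

At these assumptions you need on the order of a couple of thousand square meters of radiator per megawatt of IT load, on top of the solar array area.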

Anne Currie: It's not just bonkers, but you are completely reliant on radiation. And they have quite good things, little radiating shapes and stuff that can radiate off heat more quickly, but it's not easy. It feels to me like you'd be able to do better on the moon, because at least you're in contact with something that can conduct heat away.

That depends on how conductive moon dust or moon rock is, and I dunno that.

Chris Adams: Heat up the moon until it glows red. Okay.

Anne Currie: But yeah, you're right, it's not a no-brainer that we could just go in. It's not like the undersea ones, where people go, oh, that's great, cuz you don't have to worry about cooling if you've got a data center under the sea.

And that's true, because it can just conduct into the sea and that's fine. But space is not the sea; you can't do that. So yeah, it's not trivial.

Chris Adams: So we have latency, death rays, and heating: some of the challenges that may face us if we try to put data centers into the sky. But this is one of the proposed solutions to the energy supply or sustainability issues related to data centers, by the sounds of things.

And thank you for sharing all this and providing the science fiction pointers on this, cuz yeah, this blew my mind when I first saw it, and I think that you've actually shared a lot of useful things on this.

Anne Currie: I think it'll happen. I think it'll happen, but I think there are other things: data centers in Greenland are an obvious thing, because there's tons of free green energy from ice melt run-off.

Chris Adams: Ah, yeah.

Anne Currie: But one of the issues there is that nobody lives in Greenland to man the data center, and even fewer people live in space to man the data center.

So I would hope that you would solve the Greenland issue first. Use all that enormous amount of energy before you,

Chris Adams: Okay, so before we reach for the sky, let's sort out things down here on Earth, yeah? All right. Okay, cool. Thank you Anne. Alright, the next story is a sneak peek at a new book coming out. Anne, I think this is your thing. It's an upcoming O'Reilly book called Building Green Software. We were quite excited about this, cuz there are a couple of co-authors who have also been on here before.

On this podcast, that is. Anne, I'm gonna hand over to you to talk a little bit about this, cuz you are far more familiar with it than I am, and you know better than I do. So please do tell us more.

Anne Currie: Yes, so this is an O'Reilly book that we're working on, called Building Green Software. It's not the first green software book they've done, but it's the first kind of full picture, as opposed to the quite a few good niche ones out there for things like web development; this one is all the things.

And it will be me and Sara Bergman, who is a key part of the Green Software Foundation, and Sarah Hsu, who is also a key part of the Green Software Foundation. So we are writing it all together, and the idea is to nail down the thinking that we've all, as a community, come to agree on about what's the right way to do things.

So it's all based around the idea that there are three things that we need to be good at: carbon efficiency, hardware efficiency, and carbon awareness. So that's what we'll be talking about in the book. We'll be talking about carbon efficiency in terms of code efficiency and operational efficiency, plus design efficiency through carbon awareness: designs that allow you to shift around what you're doing.

We'll be talking a little bit more about that later in this podcast, I think. And hardware efficiency: don't cause everybody to throw away their phones every time you produce a new version of your software. Cuz, I dunno, you might know the answer to this, Chris, but what has the most embodied carbon per gram of anything in the world?

And my guess would be a chip. And in terms of consumer devices, my guess would be hands down a mobile phone.

Chris Adams: Do you know what, I've never actually thought about that in terms of, okay, a single kind of consumer good, in terms of the embodied energy inside it. It's true that there's a significant amount of power that goes into turning sand into silicon, and all the other kinds of materials there.

Anne Currie: But also the operation, cuz silicon fabs are unbelievably difficult to make.

Chris Adams: Yeah. And of course, this is actually one thing we should probably talk about in a future episode: when you look at where lots of the really high-end chips are currently made, a lot of them are in Taiwan, which has a very fossil fuel heavy grid. So even if the stuff is really efficient, and even if they're just using electricity, that's gonna be one of the problems.

But even then, when you are making these, because most of the ways that you achieve the high levels of heat don't rely on electric kinds of power, they rely on literally heat from combusting fossil fuels, you've got an issue there. This is actually something that's changing; there's a really fascinating paper by Doctor Sylvia Medadu, who's talking about some of the advances in heat pumps.

You can now get heat pumps up to the high hundreds of degrees Celsius, basically. So there are lots and lots of things that can be decarbonized now, but for you to reap those benefits, you actually need to have decarbonized electricity in the first place. And Taiwan is struggling a bit there, because it's not a really big place with lots and lots of land, and it doesn't actually have much in the way of surrounding shallow water for creating, say, offshore wind, or things like that, for the time being.

So that's gonna be an interesting one ahead of us. But yes, you're right. I guess,

Anne Currie: They need the power beamed back in from space on giant lasers.

Chris Adams: yeah, maybe what they need is a death ray. Yes. Uh,

Anne Currie: It would come in handy in all kinds of ways if you were Taiwan as well, I would imagine.

Chris Adams: Yeah, let's leave that one there, before we get taken off the internet by an advanced persistent threat. Alright. Okay,

Anne Currie: But anyway, that's an aside. The book, the book. So we are beavering away at the moment writing the book. We've submitted quite a few chapters already, so it's all going well. And the idea with an O'Reilly book, the way they do it, is you as writers submit the chapters, and once the chapters are at least vaguely polished,

vaguely okay, they'll go live for people to read in a kind of advance read on Safari. So that will be,

Chris Adams: The shortcuts thing?

Anne Currie: Yeah, the shortcuts, so people will be allowed to read these things. The introduction has already gone out; it's not live yet, but we are expecting it to go live quite soon. So we will let everybody know through this when it's live, and also the code efficiency chapter.

And after that we've got the various other chapters, but they'll be available in Safari really quite soon. Then the book actually gets physically published, and we get an animal. We'll have an animal, but we dunno what the animal's gonna be. So at that point we'll find out what the animal is, and the book gets published.

And then it'll be available to buy in physical form, if you so choose. And one of the things that we agreed with O'Reilly is that it will also be available under a Creative Commons license at that point.

Chris Adams: Boom.

Anne Currie: So you won't even need to buy it or have an O'Reilly subscription.

Because this is all stuff that, hopefully, by the time we get through this and everybody's reviewed it, is just what we want everybody to be doing. This should hopefully be a baseline.

Chris Adams: That's super cool. I did not realize about the actual Creative Commons licensing for that. That's really helpful; that brings the barrier right down.

Anne Currie: Yeah, but it'll be a while, cuz it takes quite a while for the book to actually come out. So I'm imagining the first quarter of 2024 will be when that's available, unless we really get our skates on and get it done much more quickly than that.

Chris Adams: I have one question, if I may, before we move on from this one. I haven't used Safari and I have never written a book, but I have heard horror stories about working with publishers and emailing Word documents back and forth. Is it still that process, or is there something like GitHub? What does it look like to write a technical book these days for a technical publisher?

Anne Currie: For O'Reilly, you've got quite a lot of different options, and the one that we are using is just Google Docs. And that's super easy, because Sara is in Norway, Shira, our editor, is on the west coast of the US, and I'm in the southeast of England. Sarah isn't too far away from me.

She's in London, so that's quite easy to do. But fundamentally, Google Docs is pretty good for that kind of thing.

Chris Adams: Wow. So Sarah, yourself and Shira, it sounds like it'll be

Anne Currie: Sara and Sarah, yeah. It's really quite hard.

Chris Adams: Yeah, and this is interesting, cause there are a number of existing green software books. There's one called Designing for Sustainability by Tim Frick, who's

Anne Currie: Oh, yeah, yeah. Which is very good actually.

Chris Adams: Yeah. And then Tom Greenwood from Whole Grain Digital.

Anne Currie: yeah, yeah, that's also

Chris Adams: He had his, yeah. And I think there are a couple of other books that I've seen come out, a number of other ones. But this is the first time I've heard of one of these books which is actually written by authors who aren't just men, basically. I think this one book may have actually righted the gender balance in the sustainability book canon.


Anne Currie: hopefully, and there is method in our madness on this in that we wanted to make sure that we got on stage to talk about it as well. And we are three women who are very good public speakers, so really we should be able to make a little noise about this.

Chris Adams: Good. I wish you the best and I'm looking forward to some of the shortcuts for some of this. In that case, should we look at the next story?

Anne Currie: Okay. Absolutely.

Chris Adams: All right. Okay. This is Microsoft scales workloads with carbon awareness. Now the actual story is linked from SDX Central. As far as I can tell, this is basically a kind of press release talking about Microsoft and cloud network stuff.

But the thing that was really more specific is actually some of the GitHub repos that we've linked to inside the show notes here. Basically, there is a carbon-aware operator for Kubernetes, to add a bit of carbon awareness into it, by the looks of things. So if you go to github.com/Azure and then the Carbon Aware KEDA Operator repo, there's an open source operator that you can plug into Kubernetes to do this.

And I think this is something that we've both discussed before, but I suspect you might have some thoughts on this, because a mutual friend of ours, Ross Fairbanks, did some work in this field a while ago as well, actually.

Anne Currie: Yeah. Ross and I used to work on a startup called Microscaling Systems, which was all about cluster scheduling. And one of the things that we always had in mind was adding carbon awareness to cluster scheduling: moving jobs so that they wait until there's green electricity available on the grid.

Now, Google have been talking about this as well. They aren't offering it as a service like this Kubernetes scheduler, but they've been trialing it internally as a way of shifting workloads in time, so that they consume green electricity more assiduously than they would otherwise have done.

And this is the same idea. Now, as far as I'm aware, this is mostly about being able to compress what's on your machines so you can turn machines off. It's all a kind of bin packing: you've got loads of differently shaped containers running on those machines, and you squeeze them onto a smaller number of machines at times when there's no green power available, so that fewer machines are running.

And some machines then get turned off. In order to do that, you have to have jobs that can wait. So this is not merely a matter of scheduling; there are two sides of information here. You need to know what the current mix is on the grid and what it's likely to be, and a lot of that goes around weather and grid load.

So it doesn't matter if you've got great weather; if you've got high grid load, then maybe you're still not gonna have any green power. But if you've got low grid load, maybe you've actually just got too much power and you want to be using it. So it's not just about following the sun or following the wind.

You need this information about what the grid is like, what the weather conditions are on the grid, so to speak. And you also need to know which jobs running in your data centers are non-time-sensitive, so they can be moved forwards and backwards in time. The same kind of things that might be running on a spot instance, for example.

Now, Google pointed out that with their stuff, they're pretty good at labeling their jobs internally, saying this is low priority, it's fine if this has to wait 12 hours. Things like video transcoding for YouTube: you might notice as a user, sometimes it happens very quickly, and sometimes it takes quite a long time, and that's because Google just go, it's not a high priority thing, so if something needs to wait, it will be video transcoding. So you need jobs that are non-time-sensitive and are labeled as non-time-sensitive.

As I say, one of the things that Google pointed out that they struggled with a little bit on this is that they can do it internally, where things are very well labeled, but they find it very difficult on the public cloud, where VMs are just black boxes and they have absolutely no idea whether the contents could wait until there's green power available.

So for a scheduler, you need both the information on which to schedule and the information about the jobs, to know which ones are schedulable. There's work to do as a user, as well as just installing the scheduler: you will need to start labeling your jobs.
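
The time-shifting pattern Anne describes, holding a labeled, non-time-sensitive job until the grid is greener, can be sketched like this. The hourly forecast numbers and the job parameters here are made up for illustration:

```python
# Hypothetical hourly carbon-intensity forecast for the local grid, in gCO2eq/kWh.
FORECAST = {0: 420, 1: 380, 2: 310, 3: 250, 4: 180, 5: 210, 6: 340}

def best_start_hour(duration_h: int, deadline_h: int) -> int:
    """Pick the start hour, finishing by the deadline, with the lowest
    average forecast carbon intensity over the job's run."""
    def avg_intensity(start: int) -> float:
        return sum(FORECAST[h] for h in range(start, start + duration_h)) / duration_h

    candidates = range(0, deadline_h - duration_h + 1)
    return min(candidates, key=avg_intensity)

# A 2-hour, non-time-sensitive job (labeled as such) that must finish within
# 7 hours gets deferred to hour 4, where the forecast dips to its lowest.
print(best_start_hour(duration_h=2, deadline_h=7))  # 4
```

A real operator would pull the forecast from a grid-data API and act on it by scaling pods, but the core decision is this kind of deferral within a deadline.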

Chris Adams: And this is presumably something that gets touched on in both books and patterns: the idea of decomposing a particular monolith or a single big program into a number of smaller programs, where some bits have to be really low latency, responding quickly, and other parts can be moved around, so you can make use of either carbon or cost savings, presumably.

Anne Currie: Yes, absolutely. Yeah, cuz you won't be moving these jobs from data center to data center, I would imagine, cuz you know, data gravity and all that kind of stuff. But you will be moving them in time. There's really no downside to moving things in time, and so there's no data gravity downside.

That's where the win tends to be.

Chris Adams: So just, can I check, you mentioned an interesting concept here: data gravity. So data gravity is the idea that once data is in one place, you are not able to move it; it's difficult or expensive to move to another provider. Is there like a technical reason for that, or what's the thinking behind that?

Anne Currie: Yeah, it's network, it's bandwidth, it takes time, and blah, blah, blah. But there's an awful lot of, data gravity is one of the ways that

Chris Adams: So we're referring to egress fees here, yeah. So, paying money to get things out of your cloud storage. Oh, and just by the way, there's a whole FTC inquiry right now into oligopoly and competition in the cloud sector. So if you feel like you would like to be able to do more stuff in terms of green computing, maybe this is a thing that you might want to respond to, the ongoing FTC inquiry into this stuff.

Cuz I feel that maybe it'd be better for us to actually be able to move things to more than just two or three clouds, cuz data gravity seems primarily to be a business constraint rather than a technical constraint.

Anne Currie: Yeah, it probably is. Really? Yeah. Technically it's diff, it's difficult, but it's doable. You can get, you could have copies of your data in multiple places. And yeah, you could move it at night. You could copy it on, you could do the whole snowmobile thing, copy, copy it all out once to a bunch of disks and drive them across the country.

It's not an insurmountable problem, no matter how big the data is, but it is unbelievably costly. So yeah, that part is fairly insurmountable, because there's nothing much you can do about that.
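To get a feel for why "unbelievably costly" is the right phrase, here is a rough back-of-envelope for egress pricing. The per-gigabyte rate is an illustrative round number, not any provider's actual price sheet.

```python
def egress_cost_usd(data_tb, usd_per_gb=0.05):
    """Rough bill for pulling data out of a cloud region.
    $0.05/GB is an illustrative round number for egress pricing,
    not a quote from any real provider."""
    return data_tb * 1_000 * usd_per_gb

# Moving a single petabyte (1,000 TB) at that rate:
petabyte_bill = egress_cost_usd(1_000)  # $50,000 at $0.05/GB
```

Even at a hypothetical nickel per gigabyte, a petabyte of data costs tens of thousands of dollars to move once, which is why moving jobs in time rather than in space is so attractive.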

Chris Adams: Okay, this is true, just like the cost of transmission in some places actually. Alright, so we spoke about the Azure carbon aware KEDA operator. So I think maybe we should actually explain what KEDA stands for, cause it's in the briefing here: Kubernetes Event Driven Autoscaler. The idea being that this would automatically scale Kubernetes up, so you have more computers or more pods, and then scale it down again in response to various activities. That's what it would be, right?

Anne Currie: That sounds plausible. I dunno, but that sounds plausible. And actually then you just use your normal scheduler and, presumably, however you normally label your pods for how many of these do I need to keep alive at any point. So the ones that are less important to you can just all get shut off.

Chris Adams: Yeah. There's also another related project called Kube Green, which is a project by some folks in Italy. It's early on, so it doesn't do quite the clever kind of carbon-aware scheduling stuff, but if you want to dip your toes into this, it literally turns off your pods when you go home. So basically all your staging environments and your developer machines switch off at 6:00 PM and go to sleep, just like you might choose to go home and go to sleep.

Also, in the show notes, we've got a link from the Green Web Foundation, where we talk a little bit about this using both Kubernetes and Nomad a while ago. But the stuff from Azure looks really complete and it looks really quite exciting actually.
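The core decision behind carbon-aware scaling of the kind discussed here can be sketched in a few lines: the dirtier the grid, the fewer replicas a low-priority workload is allowed. The real Azure carbon aware KEDA operator expresses this as Kubernetes configuration rather than code, and its actual schema differs; this is just the shape of the idea, with made-up band values.

```python
def allowed_replicas(intensity, ceilings):
    """Pick a replica cap for a low-priority workload from a
    carbon-intensity ladder. `ceilings` is a list of
    (max_intensity_gco2_per_kwh, max_replicas) pairs sorted by
    rising intensity: cleaner grid, more pods allowed."""
    for max_intensity, max_replicas in ceilings:
        if intensity <= max_intensity:
            return max_replicas
    # Dirtier than every band: fall back to the strictest cap.
    return ceilings[-1][1]

# Example ladder: generous when the grid is clean, one pod when very dirty.
ladder = [(150, 10), (300, 5), (1000, 1)]
```

An autoscaler would evaluate this on each reconcile loop and hand the result to the normal scheduler, exactly as Anne suggests: nothing exotic, just a changing ceiling on replica counts.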

Anne Currie: So, we ran a sustainability track at QCon London a couple of weeks ago. It's a big conference, and it was very successful. I think our top rated speaker was a woman called Holly Cummins, who is a really excellent speaker from Red Hat. I don't know if you know Holly, but she spoke about that.

Her dream was that we'd have effectively light switch operations, so you should be as confident turning off machines in the cloud, in private cloud or public cloud or wherever, as you are turning the light off with the light switch. Because when you turn the light off with the light switch, no one thinks, oh, I won't turn the light off just in case it doesn't turn back on again. The aim is that you feel that confidence about all of your systems: that you could just turn them off because you don't need them overnight, knowing that you could turn them back on safely in the morning. Her light switch analogy was excellent. You don't leave all your lights on all night just in case you can't turn them back on again in the morning.

That would be madness, but that's what we do with computers. One of the best things that you can do with your systems is invest in making sure that you can turn them off and then on again.

Chris Adams: I think that's a useful piece of advice, to make sure you can turn your things off and on again. It sounds like a low stakes thing, but I think you and me, we've been on various projects where we've been afraid to do that. So I'm glad that someone is spelling out that it really needs to be this basic.

Alright, next story. This is one that really caught my eye, cause this is the SDIA, who are a Green Software Foundation member, the Sustainable Digital Infrastructure Alliance. The headline is that the SDIA welcomes the deal of the European Council and Parliament on the Energy Efficiency Directive. This is super dry legal stuff, but basically there is some really, in my view, quite far-reaching stuff inside this.

Essentially, there has been a whole bunch of law being thrashed out about transparency around energy usage for data centers, and this seems to have snuck through in the first quarter of 2023. And there are some headlines which, in my view, go much further than where we are right now. So I'll just read some of this stuff out.

So owners and operators of data centers above 500 kilowatts will need to make their environmental performance public at least once a year. This includes annual energy consumption, power utilization, temperature, heat utilization, and use of renewable energy, as in how much renewable energy you're using and where it's coming from.

And we haven't mentioned it here, but it's also water usage as well. Now, these are figures which... Anne, you've tried to get these, I'm sure, so you can talk about how easy it is to get access to these figures.

Anne Currie: Almost impossible. Yeah.

Chris Adams: And now it's law. Basically this is coming in. People need to do their first reporting in March 2024.

So this is something which large providers have been pushing back against, saying, no, we can't possibly share this stuff, and now it's basically gonna be part of the law all around Europe. So if you're outside of Europe, you may still be okay, but this is quite a precedent to be setting, in my view. Cuz yeah, this is something that lots of us have been asking for and really pushing for, and now you've essentially got one bloc saying, no, this is a condition of doing business in this part of the world. Cuz how on earth are we gonna know if we're on target or not in halving our emissions by 2030, for example?

Anne Currie: Yeah, and in fact, this was discussed on stage at the QCon conference as well, this time by Adrian Cockcroft, who is the retired VP of Sustainability. I can never remember what anybody's titles are at AWS, but anyway, he was the big cheese of sustainable architecture at AWS. And he was saying, if you're American, you might think this won't affect you, cause it's just an EU directive and your data centers are in the US, so who cares?

But it is basically the GDPR of green. The EU is such a big bloc, they have so much clout, and they exert their will, in the same way that everybody ended up having to do GDPR. This will be the same. Everyone will have to comply with these things, even if you think that you're in the US and they won't touch you.

The reality of the situation is that this will all spread out in a GPL style until everybody is forced to comply in the same way that we've had to comply with the GDPR.

Chris Adams: That's a win for transparency by the sound of things, but it's probably gonna be a headache for a bunch of people who have to start reporting in less than 11 months for the first reporting deadline. There's also something that caught my eye here: any data center exceeding one megawatt of power will need to recover the waste heat.

So basically they need to put it to good use, or prove that it's either technically or economically unfeasible for them to do so. This is a really interesting one, because within Europe at least, and I'm gonna speak about Germany where I live, like 40% of the energy demand is from gas heating things up.

So if you have this being put to actually addressing one of the other big demands for energy inside Europe, that's actually quite a far-reaching one. And one megawatt, that's likely to impact pretty much every hyperscaler, cuz hyperscalers tend to be 20 megawatts upwards in size. And I was trying to do some rough figures: 500 kilowatts, if you're assuming maybe 15 to 20 kilowatts per rack...

So that's, I dunno, between 20 and 40 racks of servers, based on how efficient your data center might be. That's not that big. That's like a lot of data centers this is gonna be impacting, basically.
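Chris's rough rack figures can be worked out precisely: at 15 to 20 kilowatts per rack, the 500 kW reporting threshold corresponds to roughly 25 to 33 racks, which sits inside the ballpark range he gives.

```python
def racks_for_power(threshold_kw, kw_per_rack):
    """How many racks of servers a given power threshold
    corresponds to, at an assumed power density per rack."""
    return threshold_kw / kw_per_rack

# The 500 kW reporting threshold at typical rack power densities:
dense = racks_for_power(500, 20)   # 25.0 racks at 20 kW per rack
sparse = racks_for_power(500, 15)  # ~33.3 racks at 15 kW per rack
```

Either way, 500 kW describes a fairly modest facility, which supports the point that the directive will reach well beyond the hyperscalers.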

Anne Currie: Yeah, so we're gonna see an awful lot of public heated pools

Chris Adams: I hope so. Yeah, absolutely. I

Anne Currie: ringing every data center everywhere in the world.

Chris Adams: Maybe this will change how we think about how we build data centers. When you build a data center as a kind of big box out-of-town Walmart-style thing, then it's really difficult to use the heat. But if you're able to integrate the data center into the fabric of the urban environment, then there is...

Anne Currie: really want to have. But that has issues of its own. You don't generally want that, because in the urban environment you want people living there. And also you don't want the draw on the grid, cuz often in those cities the grids are already overloaded. So it would be counterproductive to have a whole load of data centers suddenly located in urban environments just so that they can have a local pool that's heated up using their excess heat.

I would say that's counterproductive, but...

Chris Adams: Unless they're providing or generating any of their own power on site. That's another thing that some of the newer providers are doing. They're basically looking at using batteries on site as a way to act as a kind of anchor customer, but also to be useful to the grid. Cuz if you have this case where you're scaling machines up and down, there will be times where you should be able to be a kind of active participant in the grid.

Just like having a kind of read-write energy grid: just like we have a two-way internet, you could have a two-way grid. But that's another podcast, I suspect.

Anne Currie: It's part of the grid balancing solution, which is absolutely required, particularly since we struggle with grid balancing at the moment, and that's when most of the grid is powered by stuff that is utterly predictable, like gas or coal. When you start adding a whole load of comparatively, massively unpredictable solar and wind into the mix, then grid balancing is a major problem.

Chris Adams: It gets more complicated depending on how much of a grid island you might be. So if you are connected to other things, you can get stuff from neighbors, but if you can't, then it's a bit more complicated. Now, there is some interesting news related to that. I assure you, we're not an energy podcast, but there was an England-to-Netherlands energy interconnector announced just in the last week, and there's a bunch of similar stuff happening around this field, but we probably need to discuss that another time.

And I think we're coming up to the last few minutes of this, and there's a question that's been posed to us that I think Chris, our producer, shared: if we were to launch one data center into space, what would you name it and why? You can have a bit of a think, and I'm gonna go for the obvious answer that people tend to use when faced with this stuff, or what English people tend to use when they get the chance to name things.

Boaty McBoatface, the well-known research vessel, is doing some absolutely fantastic work in the field of climate science. I'll share some links specifically for that, because yeah, Boaty McBoatface, or the Sir David Attenborough, which is its official name, that's a thing. So yeah, that's my example.

That's my answer: Data McDataface. What about you, Anne? What would you call a data center?

Anne Currie: I dunno, but I can immediately say what I would choose as my naming convention. I would give them Culture ship naming conventions. In the Iain M. Banks Culture series, all the AI spaceships name themselves with some slightly tongue-in-cheek name.

Chris Adams: Of Course I Still Love You, and...

Anne Currie: Oh yeah, exactly. Yes. Yeah, so I would, I would give them culture names.

So there's an exercise for the listener: come up with a whole load of them. In fact, I believe there is a Culture ship name generator online that will do it automatically, or, to be perfectly honest, ChatGPT will almost certainly supply you with Culture ship names that it has made up. So I would defer to Iain Banks and...

Chris Adams: And a generative AI large language model for naming our servers. I guess that's circular, if nothing else. Alright, I think that takes us up to the time we have here. Okay, that's all we have for this episode of The Week in Green Software. All the resources for this episode are in the show description below, and you can visit podcast.greensoftware.foundation to listen to more episodes of this particular show.

Thank you very much, Anne, for joining us, and hopefully see you on one of the future ones. So bye for now. See you around, Anne.

Anne Currie: Goodbye.

Chris Adams: Hey everyone. Thanks for listening. Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, Google Podcasts, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show, and of course, we'd love to have more listeners.

To find out more about the Green Software Foundation, please visit greensoftware.foundation. That's greensoftware.foundation In any browser. Thanks again and see you in the next episode.