Environment Variables
The Week in Green Software: FinOps, GreenOps and the Cloud
June 20, 2024
On this episode of TWiGS, host Anne Currie is joined by Navveen Balani of Accenture, a fellow GSF member. This conversation navigates the landscapes of, and intersections between, GreenOps, DevOps, and FinOps, as well as the vital role of Infrastructure as Code in marrying financial and ecological efficiencies in cloud operations. Lastly, they tackle the intersection of cybersecurity and AI development, emphasizing the need for green software principles to fortify AI systems while minimizing energy use.

Connect with us on Twitter, Github and LinkedIn!


Navveen Balani: Definitely, I would say there is some synergy between security and green software, and certain features of green software principles can also be applied to the security domain to make it more energy-efficient.

Chris Adams: Hello, and welcome to Environment Variables, brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show, you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software.

I'm your host, Chris Adams.

Anne Currie: Welcome to another edition of the Week in Green Software, where we bring you the latest news and updates from the world of sustainable software development. Today, I'm your host, Anne Currie. So you're not quite hearing the usual dulcet tones of Chris Adams; you'll have to make do with me instead. But as usual, we'll be talking about the world of green software and what's going on at the moment.

And today we're going to talk a little bit about how being green matches with FinOps, which I think is very true, and a really important part of the story. So we'll be talking a little bit about that. We'll also be talking a little bit about code and code efficiency, which is something you have to be very careful about.

What's the context when we talk about code efficiency? That's the part we need to be really careful about. And finally, we'll talk about the intersection of cybersecurity and AI developments, and we'll be talking a little bit about GreenOps.

And there is a new Green Software Foundation project, a GreenOps project, which is aimed at looking at how we can embrace and use AI without totally throwing all our green principles out of the window. And I think that is absolutely doable, but we need to think about it. We need to go in in a very knowing way.

So as I said, I am your host today, Anne Currie. But first I'm going to introduce our guest, Navveen. Navveen, do you want to introduce yourself?

Navveen Balani: Thank you, Anne. Hi everyone. I'm Navveen Balani. I'm the Managing Director and Chief Technologist for the Technology Sustainability Innovation group at Accenture, working at the intersection of technology and sustainability. I'm also the co-chair of the Standards Working Group and the Impact Engine Framework at the Green Software Foundation.

I'm a Google Cloud Certified Fellow, a LinkedIn Top Voice, and the author of several books. Very glad to be part of this podcast.

Anne Currie: Thank you very much, Navveen. It's very good to have you. So just a bit of context for me. My name is Anne Currie. I am one of the co-chairs of the community group of the Green Software Foundation, and I am also one of the authors of the new O'Reilly book, Building Green Software. So that fills you in a little bit on my background.

So before we dive into the articles this week that we're going to be talking about, just a reminder that everything we talk about will be linked in the show notes at the bottom of the episode. So you can read the articles that we're talking about; you don't just have to rely on us telling you what was in the article.

So, well, let's move first to the first article from today, which was in Computer Weekly, and it was called Green Coding. It was basically a puff piece by a company called CloudBolt, who look at code efficiency and cloud efficiency: taking it beyond FinOps, beyond dollars, pounds, and pennies.

So actually, it was a very good article, I thought. I was very pleased to see it in Computer Weekly. Fundamentally, it was about how GreenOps and FinOps are very aligned, very combined, and I'm in complete agreement on that. It's a good article. It doesn't tell you anything that you probably won't already know: FinOps and GreenOps are quite aligned, and they're aligned through the fact that, in the end, a lot of being green, not all of it, but a lot, is about cutting down on how many machines and how much electricity you are using to run your systems, which generally speaking cuts down on the cost.

So cost is somewhat of a proxy measure. It's not a perfect proxy measure, but it's somewhat of a proxy measure. So the question that Chris Skipper, our excellent editor, has left me and Navveen to discuss is about the role of infrastructure as code in enhancing cloud efficiency. How can developers ensure that their infrastructure as code implementations are aligned with sustainability practices to reduce both costs and environmental impact?

So, Navveen, what are your thoughts on that subject?


Navveen Balani: I think that's a great question. Yeah, agreed: developers need to embed sustainability as part of the infrastructure as code implementation. And the framework that I suggest developers apply is based on the Software Carbon Intensity specification from the Green Software Foundation, which also recently received ISO standard recognition.

So, for those who do not know what SCI is: SCI is a specification to measure the carbon emissions of any software application, and it promotes three key levers: writing energy-efficient code, using less hardware for the same amount of work, and making applications carbon-aware. And if you apply this strategy to infrastructure as code, first you start with writing energy-efficient code.
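For readers who want to see the shape of it, the SCI calculation itself is compact enough to sketch. This follows the published formula, SCI = ((E × I) + M) per R; the numbers in the usage example are invented purely for illustration.

```python
def sci_score(energy_kwh, grid_intensity, embodied_gco2e, functional_units):
    """Software Carbon Intensity: ((E * I) + M) per R.

    energy_kwh       -- E: energy consumed by the software (kWh)
    grid_intensity   -- I: location-based grid carbon intensity (gCO2e/kWh)
    embodied_gco2e   -- M: embodied hardware emissions amortized to this window
    functional_units -- R: the chosen functional unit (users, API calls, ...)
    """
    return (energy_kwh * grid_intensity + embodied_gco2e) / functional_units

# Invented example: 12 kWh at 450 gCO2e/kWh, 800 gCO2e embodied, 10,000 API calls
print(sci_score(12, 450, 800, 10_000))  # 0.62 gCO2e per API call
```

The choice of R, the denominator, is exactly the "per user or per transaction" decision Anne discusses later in the episode.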

So developers can focus on optimizing resource utilization by right-sizing resources and implementing auto-scaling. This means allocating only what's necessary for each workload, and adjusting dynamically based on demand. The second strategy is around using less hardware for the same amount of work.

This basically involves automating resource management, like automating the shutdown of non-essential resources during off hours and starting them during peak times, to conserve energy and cut costs. Also, tagging and monitoring resource usage helps identify optimization opportunities and eliminate waste.

You can also go with serverless architectures in your IaC code; they're particularly effective as they scale with demand and eliminate, let's say, any provisioning requirements. And finally, the third strategy is how you make applications more carbon-aware. That's where, as part of your infrastructure code, you can say that you want to deploy a particular workload in a carbon-free region. You can apply strategies like region shifting and time shifting: selecting cloud regions which are running on renewables, and also deploying workloads or scheduling jobs when the carbon intensity is low.

So all of these strategies can be definitely applied and designed as part of your infrastructure code.
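Of the strategies above, region shifting is the easiest to sketch. Everything here is illustrative: the region names merely echo common cloud naming, and the intensity figures are invented; a real implementation would query a carbon data provider rather than a hard-coded table.

```python
# Invented intensity figures (gCO2e/kWh); a real deployment would fetch
# these from a carbon data provider rather than hard-coding them.
REGION_INTENSITY = {
    "eu-north-1": 30,   # grids with lots of hydro/nuclear score low
    "eu-west-2": 210,
    "us-east-1": 380,
}

def greenest_region(candidate_regions):
    """Pick the deployable region with the lowest grid carbon intensity."""
    return min(candidate_regions, key=REGION_INTENSITY.get)

print(greenest_region(["eu-west-2", "us-east-1", "eu-north-1"]))  # eu-north-1
```

The candidate list matters: latency, data residency, and cost constraints decide which regions are even eligible before the carbon comparison happens.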

Anne Currie: That's a very thorough answer, and there's loads to unpick in there. Lots of different things. Some things, I think, are at the moment very aligned with FinOps and cutting your costs, and some things are not aligned yet but are almost certainly going to become aligned in the future. So, for example, you talked about operational efficiency and automation, which is interesting.

Operational efficiency: if you use fewer machines and less electricity, your bill goes down. So that's all good, and in that respect, your FinOps and GreenOps are really well aligned. You know, fewer machines, less stuff, less carbon goes into the atmosphere, and it's all fantastically good. In that respect, I would say that your hosting bill is a really good proxy metric for your carbon emissions.

But of course, it's almost stupid to say it, it's so obvious, but you can't just use your cloud bill totally blindly as a guide. You know, when I used to do startups in my youth, quite often Azure or AWS would give you loads of free credits. But that doesn't mean it's carbon free.

So there are times when you just need to use your head, don't you? Sometimes, obviously, you've been given a discount, but it's not green. It's just a discount.

Navveen Balani: Yeah, I think that's a good point. Because if you look at cloud, cloud has effectively infinite resources, but that doesn't mean you should use them all; you have to use it responsibly. You can bake in energy efficiency and sustainability, so you definitely have to look at how you can lower the carbon emissions.

And there are also dashboards available from cloud vendors now, which at least give you some approximation of what the carbon footprint of your application is.

Anne Currie: Yeah. I mean, yes, they do provide really good tools, and it's not that cost is really the best possible metric you could use; it's that it's where you've got the tools and where you've got the data. So it's quite good from that perspective. Sometimes you just have to take what's good enough.

And something else you mentioned is automation, and obviously really good operations is all automated these days: it's auto-scaling, and not just in the cloud, but on prem as well. But actually, you can do a lot of stuff manually; you don't have to leap straight to automation if it's too scary or too much of a leap. Just going through and turning off machines at the weekends, even manually, and identifying machines that are over-provisioned can, bizarrely, I suspect, and in fact I've seen it, be the biggest carbon reduction you ever do. It's the simplest thing and the least techie thing.

So what do you think about that, even before automation?

Navveen Balani: Yeah, totally agree. I would say even just turning off machines manually would definitely save cost as well as carbon emissions, especially if you've got GPUs turned on; that would have a real effect. But actually, I think if you look at the infrastructure, if you break it down into two parts, production and non-production environments, you can definitely have a lot of savings on the non-production environment, because it doesn't need to be on always.

Production definitely needs to be on 24/7, but there are definitely a lot of improvements you can make on your non-production environments, dev environments and so on. And I've seen customers having more non-production environments than production ones.
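The non-production shutdown policy Navveen describes is simple enough to encode before any automation is wired in. The tagging scheme and working hours below are invented for illustration; the point is that the decision itself is a few lines of logic.

```python
from datetime import datetime

def should_be_running(env_tag, now):
    """Decide whether a tagged resource should be on right now.

    Invented policy: production stays on 24/7; everything else runs
    only on weekdays between 09:00 and 19:00.
    """
    if env_tag == "production":
        return True
    is_weekday = now.weekday() < 5        # Mon=0 .. Fri=4
    in_hours = 9 <= now.hour < 19
    return is_weekday and in_hours

# A dev box on a Saturday morning should be off; production stays on
print(should_be_running("dev", datetime(2024, 6, 22, 11, 0)))         # False
print(should_be_running("production", datetime(2024, 6, 22, 11, 0)))  # True
```

Once a rule like this is agreed, hooking it up to the cloud provider's start/stop APIs is the easy part; agreeing the tags is the hard part.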

Anne Currie: Indeed. And it's so ungreen. There was another interesting thing that you talked about with the SCI, which is carbon awareness. Code efficiency and operational efficiency are all good for mitigating the harms in the short term, but to actually take full advantage of the soaring production of cheap renewable energy, we need to demand shift to when the sun is shining or the wind's blowing.

But the interesting thing about that is that although demand shifting is by far the most interesting part of being green, I would say, it's the bit that at the moment doesn't really save you any money, because most countries don't yet have dynamic pricing. So what are your thoughts on dynamic pricing and when it's coming?

Dynamic pricing is basically when the price of electricity changes through the day, depending on how expensive it was to produce, which usually means that at times when the sun's shining and the wind's blowing, the power is cheaper than at others. That's now very common in certain countries like Spain, but very uncommon in other countries.

Navveen Balani: I think that's definitely a good concept that will promote more sustainability. Typically, if you look at cloud providers, to give you an example, Google Cloud at least now tells you, if you're deploying something in a region, that it's a low carbon region, and hopefully in future you will have dynamic pricing also: it will also give you the time when you should run the workloads. And there are a lot of workloads that don't need to run 24/7, like batch jobs. We all get millions of promotional emails; all of those can be sent at times when the carbon intensity is lowest.

And definitely, if the cloud provider gives you a cost signal, that this is a good time window and this is less costly, then all the activity that isn't critical can take advantage of the dynamic pricing. So I assume in future, I mean, we can see the trend, right?

Maybe, I mean, it's all about data. If the data from the grids is available to the providers in future, then we can definitely tap into it.
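Time shifting a deferrable job, like the batch email example above, reduces to picking the cleanest window in a forecast. The forecast numbers here are invented; real ones would come from a grid data provider.

```python
# Invented 24-hour carbon intensity forecast (gCO2e/kWh, index = hour of day).
forecast = [320, 310, 300, 290, 280, 260, 240, 220, 180, 150,
            120, 100, 95, 100, 130, 170, 210, 260, 300, 330,
            340, 345, 335, 325]

def best_window(forecast, duration_hours):
    """Start hour of the contiguous window with the lowest total intensity."""
    totals = [sum(forecast[h:h + duration_hours])
              for h in range(len(forecast) - duration_hours + 1)]
    return totals.index(min(totals))

# A 3-hour batch job is cleanest starting at hour 11 in this forecast
print(best_window(forecast, 3))  # 11
```

The same selection logic works for dynamic pricing once prices, rather than grid intensities, fill the forecast list, which is exactly why the two align when dynamic pricing arrives.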

Anne Currie: Yeah, I agree. It's a bit of a shame; it's where there's a hole in the alignment of GreenOps and FinOps at the moment. It is incredibly green to demand shift, but you don't necessarily get money off for doing it. Moving to that green region is incredibly green.

But it doesn't necessarily save you money, although it will do once dynamic pricing comes in.

Navveen Balani: I also feel that regulations will help, regulations around carbon emissions reporting. Especially, I think, the EU AI Act just talked about reporting the carbon emissions, though not about mitigation. But at some point in time, I would say, when you have a reporting mechanism and everybody has to comply with it, I see a lot of these trends coming: new, innovative ways to lower the carbon emissions.

So I think regulations at some point in time will also enable a lot of these innovations to come up.

Anne Currie: Yeah, I agree. Oh, there's something I meant to mention that's aligned with what you were saying earlier about automation. The CNCF, the Cloud Native Computing Foundation, so another of the Linux Foundation's foundations out there, describes GreenOps as FinOps plus GitOps.

So basically they're saying GreenOps is automated FinOps, which is an interesting one, because it feels to me like they're really saying that GreenOps is good FinOps. And oddly enough, FinOps folks often say FinOps is just good ops, and GreenOps is just good ops, which is interesting, and which I think people often don't really appreciate. No, sorry, I'm taking the final word there; I will leave the final word to you, Navveen, on FinOps and GreenOps.

Navveen Balani: I'd like to end with what you said. We started with DevOps, where you decided to automate things; then FinOps came, because cloud resources were getting expensive; and now we have GreenOps. We have to look at it holistically, across DevOps, FinOps, and GreenOps.

And ensure you take care of both the cost and the carbon, and keep them under control.

Anne Currie: Yes, totally agree. Right, so we're going to move on to the next one. The first article we talked about there is, I would say, extremely uncontroversial: operational efficiency is a total win all around. The next article, about why you should switch to green coding for a net zero future, which is a LinkedIn article from the CEO of CSM Technologies, I think is vastly more controversial. Not because it's wrong that you should write more efficient code, but because I think there's a lot of context around it. So he's written a lovely article, links in the show notes as I say, saying we should all be coding more efficiently, which is nice.

But it ends with the line that we should save the world one line of code at a time, which I find massively controversial, because I think that when it comes to code efficiency, it's just not the right thing for a lot of businesses to do. It's too expensive. What they should be doing is putting pressure on their suppliers to write efficient code: code efficiency at scale.

I think people can really waste time going down that rabbit hole, and their bosses were very right to say, "no, I'm not going to do it," because it would put you out of business if you rewrote all your systems in Rust or C. So it's an article that's true, but only true in certain contexts and not in others.

So Navveen, what's your, what's your thinking on it?

Navveen Balani: So I would say we need to look at this holistically, particularly around green software. There are, I would say, three dimensions we have to consider. First is developer training in, and implementation of, green software. Second, we need management buy-in. And third, I would say, is the culture shift that needs to happen.

And if you look at green software, when we started with the foundation three years back, it was a relatively new area. So we need to provide training and certification in this area so that developers are aware of how to embed sustainability in their day-to-day work.

Apart from the training, I would say, a developer needs to have accessible tools. Now we have the SCI specification, the Impact Framework, the Carbon Aware SDK, and there are a lot of other open source tools also available now, which make it more actionable; developers can actually embed them as part of their DevOps process.

And once the measurement is done, apart from the code there's also the optimization piece we talked about earlier with infrastructure as code: how you take it all together and try to optimize not just the code, but also the resources running the hardware, the resources powering those applications.

And as green software practices gain traction, I would say securing management buy-in is also essential for widespread adoption. For instance, highlighting the business benefits is crucial: implementing green coding can lead to significant cost savings by reducing energy consumption and optimizing utilization.

And as you mentioned, it's not just our footprint, but the footprint of our suppliers: ensuring they also follow the same standard methodology. And that's where I think it comes to the culture change that we all need to go through, particularly for green software. We need to look at how we can embed green software

Going forward, in all our work, similar to the way we do it for security. When we talked about security ten years back, it was an afterthought; now we have security by default. We don't say that an application needs to be made secure; we assume the application is secure by default. Similarly, if we embed green software, not just in code but across all the layers, then we can ensure that maybe over the next four or five years, all the new applications that we build

have green software principles baked in. So I would say it's basically a holistic approach that's required: from enabling the development community, the tech community, where a foundation like the Green Software Foundation plays a critical role,

to the management buy-in, and also the culture change that needs to happen. And the culture change also needs to happen, I would say, at the universities and schools, where they can start educating people on green software early on, similar to the way we have learned object-oriented programming by default.

We have learned that over the last few decades. If we treat green software the same as object-oriented programming concepts, then I think whatever applications we build in future will have green software baked in.

Anne Currie: Yeah, I agree up to a point. But I'm a big believer in separating out two types of developers. Obviously there are loads of different types of developers in the world, but two types of backend developers; frontend developers are almost a separate thing.

With backend developers, you've got people who are just working in an enterprise, and the code they're producing is never going to be run by billions of people in their own data centers. And then you've got people who are writing platforms, and the whole purpose of the platform is to get billions of people, or at least millions of people, to run that code.

And those people absolutely need to write efficient code. I think everybody should get used to getting out their performance profiler and just making sure there are no egregious performance problems with their code. Because with performance problems, your code's slower, you're burning a load of carbon, and it's total waste.

So again, all very aligned with the business: you want your systems to run fast, your customers want your systems to run fast, so having a decently performant system is good. But beyond that, you probably don't want to be writing massively efficient code yourself, because that takes a long time; you do want the platforms you're running on to have made that investment.
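Getting the profiler out, as suggested above, costs very little. A minimal sketch using Python's built-in cProfile, with a deliberately wasteful function standing in for real application code:

```python
import cProfile
import io
import pstats

def slow_concat(n):
    """Deliberately wasteful: quadratic-time string building."""
    s = ""
    for i in range(n):
        s += str(i)
    return s

profiler = cProfile.Profile()
profiler.enable()
slow_concat(50_000)
profiler.disable()

# Report the five functions with the highest cumulative time
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
print(out.getvalue())
```

In real code the egregious hotspot is rarely this obvious, but the workflow is the same: profile, sort by cumulative time, and fix the top offender first.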

So there's a lot of context here, isn't there? Are you writing code for mass use, or are you writing code that isn't really for mass use? Which is interesting. I think that's a subtlety that, lovely though this article was, it did not point out.

Navveen Balani: That's a good point. So, yeah, especially packaged applications, which will be used by, let's say, millions of developers or users worldwide, definitely need that investment. For instance, large language models are a good example: all generative AI applications will be used by millions of developers, so how can you make the AI more efficient, both on the user side, creating prompts in an efficient way so that the round trip is reduced, and secondly on the backend side?

How do you have a low-cost, energy-efficient model? That's why you're also seeing a lot of LLM vendors now talking about small language models: more energy-efficient, more compact. There is a trend, I would say, where organizations are now looking at energy efficiency as part of their applications, or whatever work they have been doing.

Anne Currie: Yeah. And of course that's all driven by cost, which takes us back to our previous point: cost and green are very aligned. The bad thing about AI is that it's very costly; the good thing is it's driving quite a lot of efficiency improvements. I mean, I talk about this every time I'm on here: Python has got a lot more efficient because of AI; they've rewritten a lot of the core libraries in Rust. And that's a perfect example: they are the kind of people you want to be writing super efficient code. They can save the world one line of code at a time, because so many people run Python. But you want to be getting that out of your platform and not having to do it yourself as a Python user.

You don't want to have to change to Rust yourself; you want to get the value of Rust whilst still using Python. So that is all very interesting stuff, but very nuanced, at every level of it. To my mind, that's what makes green interesting: it's not simple.

It's not trivial. You have to step back and ask, "where am I? What am I doing? How do I fit into this? Where is my effort best applied?" I mean, you're obviously part of the SCI, which covers all of these things: operational efficiency, code efficiency, and demand shifting and shaping.

What's your interest? What do you like the most out of those things?

Navveen Balani: I would say, from a developer standpoint, the SCI is quite inclusive in terms of roles: whether you are a developer, an architect, or a data scientist, all have parts to play in reducing the carbon emissions and making applications more energy-efficient.

So, depending on your role: for instance, if you are a developer writing code, then you can really focus on energy efficiency. And as you mentioned, it's not just about moving towards C or C++ as a more efficient language; you have to look at the context of the work you are doing and try to optimize it.

So you have to make that trade-off as a developer: what libraries to use to make it more efficient. Second, I would say, is the whole hardware optimization, which I think is where the DevOps people and cloud architects come in. There are various custom chipsets from various vendors; how can you best utilize them from an infrastructure point of view?

And third, I would say, is more strategic in nature: how do you bake in the whole carbon-aware computing concept? Because that's new. You need data providers; you need to tie up with various licenses, which are actually costlier if you look at getting the real-time data from various providers.

So how do you bake that into the application? That's more of a strategy kind of work and thinking. So in that way, I would say, depending on your role and context, whether developer, architect, or data scientist, each can definitely find value in the SCI and try to reduce emissions in their scope of work.

Anne Currie: Yeah, it's interesting. When I first heard about the SCI, I was a bit dubious about its value, but I have completely changed my mind on that as time has gone on, especially because I teach people green software. One of the things that often comes up is people wanting to be able to do like-for-like measurements.

And I think that I originally thought the SCI was about a standard that you could use to compare between applications, and that was where the value would lie, and I was a bit dubious that we could realistically do it. Now I've realized that I like that the SCI has stayed fairly woolly and loose.

It's more conceptual than it is a specific implementation. And I like that, because really it means that companies can choose how to define their SCI score for their applications and choose what's appropriate: what the denominator is going to be.

So it's per user, or per transaction, or per something that's specific to them, and then it's essentially like for like: you can say, well, last year it was this and the next year it's this, and you can average over time so that you're not comparing a sunny day with a non-sunny day, or all those kinds of things.

What I like about the SCI is its very conceptual, high-level nature, which forces people to think, "well, actually, how do these things apply to my system?" You've got to use your head. You can't just follow it blindly, because it doesn't make any sense if you do that.

You have to say, how does this apply? Forces you to think, which I like.

Navveen Balani: Yeah, I think that's a good concept. Particularly if you look at the SCI, it's for an application, so you have better control, rather than being given the carbon emissions of all your applications, which typically is what the various cloud providers give you. Given an application, you have better control, as you mentioned.

You can define your own boundary and architecture and calculate the SCI score. And the intent is that, as you deploy new versions, you should look at how you can reduce the SCI score. We can't achieve zero, but definitely, across releases, how can you make it have a lower SCI score?

And the point you made about the comparison also: you're comparing your application against the previous version of your application that you deployed. It's not about comparing two applications from two different organizations; we're not there yet. It's about using this methodology for your own application and finding ways to reduce it.

Anne Currie: Yeah, which makes all the sense in the world to me. Many years ago, when I was more youthful, I used to work in retail, and in retail, like-for-like comparisons are very important. You want to be able to say, well, this year we've made more money than last year. But you can't just say this year we made more money than last year.

Well, you can, but it's not all that useful. What they actually want is a per-thing measure. In retail, it's often per square foot of retail space, balanced for how expensive that retail space was, so you're not saying, well, this year we made more money on the same amount of floor space, but it was in London versus in the middle of the desert. You've got to come up with your own like-for-like measure, so you can say, well, is our business improving or is our business getting worse? And the SCI is exactly the same. It's the concept of like for like; it's for you to check against yourself, not for you to check against other people.

So yeah, I've been completely won over to the SCI. I was highly dubious to start with.

Navveen Balani: Good example. Yeah, that's a very good example from retail. I'll use that as well.
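[Editor's note: to make the like-for-like idea concrete, here is a minimal sketch of the published SCI formula, SCI = ((E × I) + M) per R, applied to two releases of the same application. The numbers are purely illustrative, not from the episode.]

```python
def sci_score(energy_kwh, grid_intensity, embodied_g, functional_units):
    """Software Carbon Intensity: SCI = ((E * I) + M) / R.

    E: energy consumed (kWh)
    I: grid carbon intensity (gCO2e/kWh)
    M: embodied emissions share (gCO2e)
    R: functional units (e.g. requests served)
    """
    return ((energy_kwh * grid_intensity) + embodied_g) / functional_units

# Compare two releases of the *same* application, per 1,000 requests —
# the like-for-like comparison discussed above.
v1 = sci_score(energy_kwh=1.2, grid_intensity=400, embodied_g=50, functional_units=1000)
v2 = sci_score(energy_kwh=0.9, grid_intensity=400, embodied_g=50, functional_units=1000)
assert v2 < v1  # the new release has a lower SCI score than the old one
```

The point is the trend across your own releases, not an absolute number: the same score computed for someone else's system, with a different boundary, isn't comparable.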

Anne Currie: Right. So now we'll go to the final thing we're going to talk about today, which touches on this again from a slightly more holistic perspective. This was an article in Silicon Republic, a Q&A with the chief security officer of an AI company.

Basically, her premise, and I totally agree with it, is that cybersecurity is very aligned with being green. It's a bit thin as an article, it doesn't give you an awful lot of information, but I think the idea we should be discussing is: are security and green aligned?

And you, Navveen, you've talked a little bit about that in terms of building security in. We've learned to do that, and we should learn to build green things in, in the same way. But separately to that, are there security benefits to being green? What do you think?

Navveen Balani: I would say, yeah, there are definitely synergies between security and green principles. I like to give that example of the SCI again, right, if you break the methodology down into three parts, one of which is making applications more energy-efficient. So, if you look at security algorithms, how can you optimize them to, let's say, use fewer computational resources?

Particularly, if you look at the security stack, it has also evolved through various encryption and cryptography software, and now you have various key ciphers available across different dimensions. So they're already following this best practice, I would say, of making encryption more performant and easier to adopt. In that sense, the algorithms they're using today are more efficient, compared to what security protocols might look like, let's say, five or ten years down the line. And similarly, I would say new strategies can also be applied to security scanning.

For instance, vulnerability scanning is commonly used to identify threats, maybe in the cloud or on desktops and other systems. Those scans can take advantage of running at a time when the carbon intensity is low. In that way, security can apply certain green software principles.

Running all those scans when the carbon intensity is low saves on carbon emissions, too. So, definitely, I would say there is some synergy between security and green software, and certain features of green software principles can also be applied to the security domain to make it more energy-efficient.
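[Editor's note: the carbon-aware scheduling idea described here can be sketched in a few lines. The forecast values and function names below are hypothetical; in practice the hourly grid-intensity figures would come from a carbon-intensity data feed or API.]

```python
# Hypothetical hourly grid-intensity forecast (gCO2e/kWh) for the next 24 hours.
forecast = {hour: intensity for hour, intensity in enumerate(
    [320, 310, 300, 290, 280, 270, 260, 220, 180, 150, 130, 120,
     125, 140, 170, 210, 260, 310, 350, 370, 360, 345, 335, 325])}

def greenest_window(forecast, duration_hours):
    """Pick the start hour whose consecutive window has the lowest
    average carbon intensity -- schedule the vulnerability scan then."""
    hours = sorted(forecast)
    best_start, best_avg = None, float("inf")
    for start in hours[: len(hours) - duration_hours + 1]:
        window = [forecast[h] for h in range(start, start + duration_hours)]
        avg = sum(window) / duration_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start

start = greenest_window(forecast, duration_hours=3)
# With the sample forecast above, a 3-hour scan is greenest starting at hour 10.
```

Because a routine scan is rarely urgent, shifting it into the low-intensity window cuts its carbon footprint without changing the security outcome.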


Anne Currie: And so something you actually mentioned a little earlier is that security is the perfect example of where using the right hardware for the job massively cuts your emissions. Dedicated chips designed for encryption are just so much more efficient than using general-purpose CPUs for that.

So, yeah, we wouldn't be able to do what we do these days if it wasn't for dedicated chips. And oddly enough, this also maps to some of the stuff we said at the beginning about manual ops and manual FinOps: it's amazing how many systems are running on machines that everyone's kind of forgotten about.

They're not keeping them patched. They don't really do anything useful anymore. Those are your back door, the ways that people break in, and they're just wasting electricity. Even in Building Green Software, one of my co-authors brought up the fact that it's interesting that security attacks are very much an example of a waste of electricity.

Take something like a denial-of-service attack: the whole purpose is to burn electricity in your systems and burn your systems up, so they don't have any time to do the thing they're designed to do, the thing that has value for you. Instead, they're just burning up, burning electricity, running up all your bills to do something that's bad for you.

So having a secure system, with things like applying the latest patches so that you're less exposed to denial-of-service attacks, is green, because denial-of-service attacks are very ungreen, they are very dirty. It's quite interesting, isn't it, from that perspective?

Navveen Balani: Yeah, and that's why I think provisioning the right hardware, virtual machines especially, matters. The various cloud providers now provide managed services to detect denial-of-service attacks. And I assume the underlying hardware they're using, which would be shared by millions of applications, would definitely be more sustainable, more energy-efficient, and more scalable.

Anne Currie: Yeah. Security is really interesting in that sense. So FinOps is pretty much, except for, you know, your free Azure credits or whatever, a direct proxy measurement for carbon emissions, and likely to become more so in the future when we get dynamic pricing.

But security is not a direct proxy measurement. It's just that a lot of best practice in ops is also best practice in secure ops is also best practice in green ops. You can't really use the number of hacks, the number of attacks you fall foul of, as a proxy measure. Well, maybe you could, but I think that would be a bit complicated.

How many times your data gets stolen is aligned with carbon rather than a proxy for it. It's interesting. So, we've talked about those three things, but have you run across anything interesting at the moment that you think our listeners should hear about?

Navveen Balani: So, yeah, I would say at the Green Software Foundation specifically, we are working on green AI, so we are trying to look at how we can extend the SCI, how we can do SCI measurements for large language models and generative AI models. This is something we are actively working towards from the Foundation's perspective.

And from an SCI perspective, we want various extensions to the SCI. For instance, how do you do SCI for web applications and backend applications? And how do we make it easier to measure different parts of the code, and make it easily available to developers, so that developers can measure their part of the overall carbon footprint and we can make it more accessible?

So that's one thing from the Foundation's perspective: looking at how we can make the SCI extensible to various other use cases.

Anne Currie: That's very interesting, because obviously AI is the workload on everybody's lips at the moment. I saw some very interesting charts from The Economist, I think, the other week, showing the enormous amount of power currently being used on AI, but still less than the amount of power being used on Bitcoin.

So that's just worth remembering. And of course, Bitcoin is very aligned with our last conversation about security, and people who are attacking you wanting to run up your energy bills, because quite often what they want to do is mine Bitcoin on machines that you're not properly watching.

So security sweeps are a pretty good way of identifying machines that are burning power totally unnecessarily. On the political side, I quite like to keep my eyes not just on the AI news. The really good news last week, from a political rather than a technical perspective, was that the world just got its first climate-science-trained president: the new female president of Mexico is a climate scientist by trade and training.

So I'd be very interested to see what effect that has on the country. Any other interesting political news, any good news, do you think?

Navveen Balani: No, I think I've yet to catch up on that.

Anne Currie: Well, I'm quite nosy, so I keep my, I keep my eyes open on all things. I would say there's, there's loads, actually, there's, there's a lot of good news going on at the moment. Texas is now a massive solar producing state. It's, so, yeah, there, there is, it's, The world is changing in a positive way. I, I like to, to, to keep reminding everybody that we are not doomers at the Green Software Foundation.

We are doing this because we believe it will have an effect.

Navveen Balani: Totally, and I would say, especially with the Foundation, it's been a collective journey. I mean, we started three years back, and we didn't have any specifications or tools. Three years down the line, we have our first specification, the Software Carbon Intensity specification, which is now an ISO standard.

We have various tools now, the Carbon Aware SDK, the Impact Framework, and I know there are many projects already in the pipeline at the Foundation which will make the world, I would say, a better place in terms of sustainability, through all the work that we do. So, yeah, it's basically a shared responsibility.

Climate change is basically a shared responsibility, and from our perspective as developers, all we can do is contribute by using the three SCI principles I talked about, which, I'll repeat, are: write more energy-efficient code, use hardware wisely, and make applications carbon-aware.

Anne Currie: And of course, not just write it, because you might not be the one writing it; it's more important that you use it. Python is such a good example at the moment, because of the big performance revisions they're doing. People who upgrade to the latest versions of Python, which are much more efficient, will be saving a lot compared to people who don't upgrade.

And those are exactly the kinds of things that will immediately be unearthed by running the SCI like-for-like. A really big change in your like-for-like might be that you upgraded to more recent, more efficient versions of a particular library or set of tools that you're using.

The SCI isn't just about what code you write, it's about what code you use. And that is almost certainly where you'll get the biggest value, the biggest return. Anyway, sorry, now I'm trying to take the last word again, so I'm going to leave the last word to you, Navveen.

Navveen Balani: So yeah, very happy to be part of this podcast. I enjoyed this conversation, talking about three different aspects, I would say. And to all the viewers: thank you for listening in, and have a good day.

Anne Currie: Navveen, awesome. Thank you very much for coming on this podcast. And a final reminder that all the resources are in the show description below, and you can visit podcast.greensoftware.foundation to listen to more episodes of Environment Variables. And see you all soon at some point, if they ever let me back in again as a guest host.

Good bye.

Navveen Balani: Goodbye. Thank you, everyone.

Chris Adams: Hey everyone, thanks for listening! Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, Google Podcasts, or wherever you get your podcasts. And please, do leave a rating and review if you like what we're doing. It helps other people discover the show, and of course, we'd love to have more listeners.

To find out more about the Green Software Foundation, please visit greensoftware.foundation. That's greensoftware.foundation in any browser. Thanks again and see you in the next episode.