This provocative statement from Tristan Nitot highlights the pivotal role of software engineers in our journey as an industry toward a sustainable and more frugal digital world. The majority of our old devices, from smartphones to desktops, still work. How come we waste such a massive amount of precious resources, minerals, energy, water and even the human time used to manufacture and maintain them? What should we do to break the trend of electronic waste and the ever-increasing footprint of the IT sector on our physical world? Gaël Duez sat down with Tristan Nitot to start answering these questions in this Green IO episode where we covered:
⚖️ Why Wirth’s law matters more than Moore’s law
🗑️ How the Auvergnat cultural aversion to waste accelerated the birth of Eroom’s law
🔎 How to find (sustainability) weaknesses in your software
🐍 Why Python is not (always) guilty of being resource hungry and how to embrace a wise use of alternative libraries such as Polars
✨ The real ROI of using AI to optimize software
❤️ Subscribe, follow, like, ... stay connected the way you want to never miss an episode, twice a month, on Tuesday!
📧 Once a month, you get carefully curated news on digital sustainability packed with exclusive Green IO contents, subscribe to the Green IO newsletter here.
📣 Green IO next Conference is in Paris on December 4th and 5th. Every Green IO listener can get a free ticket using the voucher GREENIOVIP here. A small gift for your huge support. 🎁
Learn more about our guest and connect
- Tristan’s LinkedIn
- Tristan’s website “L’octet Vert”
- Green IO website
- Gaël's website
Tristan's sources and other references mentioned in this episode
- Niklaus Wirth’s article "A Plea for Lean Software"
- Moore’s law (wikipedia)
- Greenit.fr
- Matt Parker's video "Someone improved my code 408 000 000 times better"
- Optimisation de performance bénéfice ou sacrifice ? (FR),
- WeDoLow (software optimization)
- Polars library
- The Digital Collage workshop
- The Biodiversity Collage workshop
- Tom Fishburn’s cartoon about AI and email
- Tristan Nitot’s “La loi de Moore est morte et c'est une bonne nouvelle” (FR)
Transcript (auto-generated)
Tristan Nitot (00:00)
Basically, if you manage to double the speed of your software every other year, then you don't need to change your hardware. Because every other year, like with Moore's law, you're liberating half of your computing power and resources. And so you have half of it which is available. And half of it means you can invent
new stuff. And I call it Eroom's law.
It's like Moore in reverse, because it's basically the same.
Gaël Duez (00:32)
Hello everyone, welcome to Green IO
I'm Gaël Duez and in this podcast, we empower responsible technologists to build a greener digital world, one byte at a time. Twice a month on a Tuesday, our guests from across the globe share insights, tools and alternative approaches enabling people within the tech sector and beyond to boost digital sustainability. And because accessible and transparent information
is in the DNA of Green IO, all the references mentioned in this episode, as well as the full transcript, will be in the show notes. You can find these notes on your favorite podcast platform and of course on our website greenio.tech.
Gaël Duez (01:19)
It was a late evening in December 2021 in my small Parisian flat. The wind was blowing and a bit of rain was making me feel very comfortable being at home. I was staring at my living room wall, which happened to also be my kitchen wall. I told you this is a Parisian flat. And on this wall were dozens of post-its with voting dots and names on them, such as Take It Green,
Green Coding, Green Technologists, Eco IO, well, as I was in the final stage of finding the name of this podcast. The final battle was between Green IO and Green Bytes. And here comes the surprise: the winner was Green Bytes. But Tristan Nitot had been broadcasting for years a successful podcast in French named
L'Octet Vert, the green byte. And I wanted his permission to use a similar title in English. And he told me, I'd rather not. Back in those days, he had plans to expand the podcast internationally. And after he told me this, we talked for an hour. And the amount of tips and support he provided me was mind blowing.
This call was one of the best investments I made while launching the Green IO adventure. So Green IO won and, as far as I know from your feedback, that's a pretty cool name as well. And it's a real pleasure to have him on the show to celebrate this milestone. But we don't have Tristan today with us only for celebrations. We have him to discuss how we will run weather forecasts in 10 years from now on an Amstrad computer from the 80s. OK, maybe I'm exaggerating a bit here, but it illustrates well the concept that Tristan has been pushing relentlessly for a year now with his Eroom's law and the fact that Moore's law is dead.
This focus on sustainability in Tristan's life started 15 years ago, in parallel to a successful career at Mozilla Europe,
which he co-founded, then at Cozy Cloud as its CPO and at Qwant as its CEO. He now works at Octo Technology on digital sustainability and frugality, but Tristan is also well known for
being one of the strongest voices in Europe for open source software and privacy, a topic he even wrote a book about, surveillance. Welcome back on the show, Tristan. It's always a pleasure to have you here.
Tristan Nitot (04:09)
Bye.
It's a pleasure too and an honor.
Gaël Duez (04:14)
Let's get started right now. And I've got a question for you, Tristan, from podcaster to podcaster, just between you and me and a few thousand listeners. Did you kill it? Did you kill Moore's law?
Tristan Nitot (04:30)
No, it's not my fault. It happened. Well, as they say, trees don't grow up to the sky. They have to stop sometimes. And Moore's law has experienced a fantastic run for more than 50 years. But it's pretty close to being over.
It may be a shock for many people, because we've all been living with Moore's law around us, to the point that it's a fact of life, to the point that we forget about it. It's just like: next year's computers are going to be faster. Except that it's slowing down. They keep improving stuff, but Moore's law basically is almost dead.
Gaël Duez (05:22)
And for the listeners not that familiar with the concept of Moore's law, could you just describe it in one sentence?
Tristan Nitot (05:28)
Sure, of course. It was created by Gordon Moore, who was a co-founder of Intel. He stated it in 1965, in the very early days of semiconductors. Moore realized that, basically, because they were making progress so quickly
in manufacturing microchips and semiconductors, they were able to double the number of transistors they could put into a semiconductor, basically a computer chip, memory, processor, and stuff. And if they could manage to double the number of transistors per chip every two years, like they had been doing,
it would be fantastic. And so they decided, it's not a physical law, it's a programmatic decision of investing in research and development in order to double the number of transistors per chip every other year, so that microchips would double in computing power every two years. And they managed to basically pull it off
for more than 40 years. But now, starting in the mid 2010s, it's really slowing down. Still, they managed to pull it off for 40 years, which is amazing.
Gaël Duez (06:51)
which is amazing for a non-physical law. As you stated, it's almost a research and investment program rather than a law, followed by humans at Intel and then everywhere around the world in the semiconductor industry.
Tristan Nitot (06:59)
Yes.
Gaël Duez (07:06)
Now I'd like to ask you a question about whether it's really dead or not. Because someone could argue that, with the current trend on GPU chips and etching getting smaller and smaller, we're talking about two nanometers now, the computing power intensity of our chips is still going up. So dead or not dead, Moore's law, or just being rewritten somehow?
Tristan Nitot (07:32)
Yeah, that's a really good question. Basically, Moore's law was interesting because you could make a microprocessor core go faster. And the way we write a program, it's executed sequentially, with loops, on one single core. That was the model from the very start, and it lasted
for a very long time. And if you look at the improvements year over year of the speed, of the power, of a single-core microprocessor, you would see that it increases almost linearly. Well,
it's linear if you use a logarithmic scale. So it really doubles every other year. Actually, sometimes it goes faster than that, and sometimes a little slower, depending on the period of time. And then the curve makes a plateau around 2012 in terms of the speed of one single core. And basically,
if you could put it that way, you could add more transistors into a microprocessor, but at some point, it doesn't make it any faster. It's plateauing, and it's over.
So what they managed to do is multiply the number of cores. And this is why you have several cores in a microprocessor. But it completely changes the model. And programming several cores is a lot more complex than programming a single core, because you have to synchronize all the cores together. So the complexity is increasing really fast. It's mind blowing.
And so it's a lot harder to manage. So what we see now, with AI and neural networks, is that we have workloads that can basically be split across cores, with each core being like a neuron.
But this model is very different from programming an application or something. It really is about LLMs and AI and stuff like that. There, you can use thousands of cores and be efficient at using them. But the single-core thing, it's really over. There are slight improvements, like Apple did with Apple Silicon, because they started integrating everything all at once in a single piece of silicon, putting
the processor and the memory and also the GPU and stuff like that, all in the same box, like a tiny box. And so it's faster, mostly because the connections are shorter and it's more efficient. And as you can see now, even Apple Silicon is plateauing.
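To make the single-core versus multi-core shift concrete, here is a minimal sketch in Python (the workload and numbers are made up for the example): the same computation written the historical way, on one core, and then spread across several cores, where the splitting and coordination is exactly the extra complexity Tristan is pointing at.

```python
from concurrent.futures import ProcessPoolExecutor

def heavy(n: int) -> int:
    # Stand-in for a CPU-bound task (made up for the example).
    return sum(i * i for i in range(n))

def single_core(tasks):
    # The historical model: one core, one sequential loop.
    return [heavy(n) for n in tasks]

def multi_core(tasks, workers: int = 4):
    # The post-2012 model: split the work across cores, coordinate the
    # worker processes, then merge the partial results back together.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(heavy, tasks))

if __name__ == "__main__":
    data = [200_000] * 8
    assert single_core(data) == multi_core(data)
    print("Same result, very different amount of coordination.")
```

The results are identical; what changes is all the orchestration around the computation, which is why, as Tristan says, only some workloads, like neural networks, spread naturally over thousands of cores.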
Gaël Duez (10:26)
So dead or plateaued, but in any case, significantly slowing down. This is your main takeaway of what is going on at the hardware level of our IT industry, if I understood you right.
Tristan Nitot (10:41)
Yes, yes.
Gaël Duez (10:42)
And in parallel to Moore's law,
there is a second law that you love to quote a lot and to put into perspective with Moore's law, which is Wirth's law. Can you elaborate a bit on it and why you like to put these two in parallel?
Tristan Nitot (10:57)
So basically there is this famous Swiss computer scientist whose name is Niklaus Wirth.
And he's almost a genius. He has received several prizes for his work. He has invented several innovative languages like Pascal. So he's a really top-notch computer scientist and researcher.
And Niklaus Wirth was actually pretty unhappy about the state of affairs in computing,
because what he saw is that the more powerful a computer someone has, the lazier that person is when it comes to writing software. So his law says: software is becoming slower faster than hardware is accelerating. So the more power you have, the more hardware power you have, the more you
misuse it and the lazier you become when writing software.
Gaël Duez (12:02)
I think it was coined in a different way in the 90s, when it was stated as: what Intel gives you, Microsoft takes back.
Tristan Nitot (12:12)
Indeed, yes. It was a popular way to summarize Wirth's law. And it's exactly that: what Intel gives you, Microsoft takes back. And at the time when computing was Intel making the hardware and Microsoft delivering the software, you could see that, in fact, with each new generation of Windows and Microsoft Office,
they get slower. They get more features, but they get slower. And Intel was super happy about that, because it enabled them to sell new computers and new processors and everything. And they loved it. So it was nice.
Gaël Duez (12:54)
So now Gordon Moore is dead. Maybe his law is dead, or at least plateauing. We are still pretty unhappy with the way we run our software. I mean, people are still changing hardware mostly for software reasons or out of psychological obsolescence, but that's a completely different topic. And a few years ago, you
came up with an idea about a new law, Eroom's law. And that's the reason you're on this show today.
Could you tell us a bit more: how did you get this idea of Eroom's law?
Tristan Nitot (13:28)
I wish I could tell you that one day an apple fell on my head and I had this wonderful idea of Eroom's law. Yeah, but it was a Mac, not an apple. It hurt quite a bit. Just kidding. So no, it was actually pretty painful. I knew about Moore's law and Wirth's law
Gaël Duez (13:33)
I've heard this story already.
Ha ha ha ha ha ha ha
Tristan Nitot (13:51)
for a long time, and then it came back when I had a conversation with Frédéric Bordage, who is one of the godfathers of green IT in France, and he mentioned it in a training that I was participating in. And it was nice having my memory refreshed about it.
And so I used this. I was giving talks here and there, because I thought that we needed to do something about global warming,
biodiversity collapse and stuff. And I was trying to find ways to introduce people to the idea that we should stop wasting computing power. To be honest, I wasn't very successful, because basically my idea at the time was that people should give up on
new things all the time, and maybe we should cut down on the amount of computing resources we use. And nobody was ready to hear that. And I understand that. I mean, being myself a computer scientist and being myself in love with the progress and advancement, the technical advancement, of computing,
it's like everybody in the industry has this feeling that we are going to do more computing in the future, not less.
And so I wanted to explain to people that computing is not free. Of course, it's getting cheaper and all of that. But it has an impact when it comes to digging mines in many places of the world, and pollution and energy consumption and greenhouse gases and stuff.
And they were, I mean, they were not exciting conferences. People would get unhappy. And suddenly I realized that there were people who could do optimization of software in amazing ways. I mean, a friend of mine had written something in Python and realized it was
taking too long for his program to run and decided to optimize it. In just half an hour, he managed to optimize it by a factor of 60. He made his program run 60 times faster.
You could put it in a different way: until then, he was wasting 60 times too much computing power to achieve the results he wanted to get from his computer. And I am from a region in France which is historically a poor region, Auvergne. And so people in Auvergne,
they want to save money. Wasting resources is just unacceptable.
And it happens that my friend is also from Auvergne. And he felt exactly the same way. Like, what? I've been wasting 60 times too much. So he felt ashamed about it. And so I kept researching optimization.
And there was that one thing that I discussed when I was last here, at Green IO Paris 2023, about one guy in the UK named Matt Parker, who made a fantastic video about his code being improved by a factor of 408 million. So he was basically wasting 400 million times
too much power to solve a problem, which actually was not a really interesting problem, by the way.
And he wrote a Python script. And this Python script had been running for 32 days to find one solution. And he made a podcast about it,
and people started making fun of him, because really, 32 days to solve such a basic problem was way too much. And somebody decided to write another version and made it run in 15 minutes instead of 32 days. And so Matt mentioned that in another podcast.
And so other people said, what? 15 minutes is way too slow. I can do better. And so they started first competing and then collaborating, sharing the best tricks: no, you don't want to use Python, you want to use Rust. Or no, you can use C. Or no, you can use Assembler. And all of that in order to go faster.
And one of the latest versions, which is not even the fastest one, now runs in 6.8 milliseconds instead of 32 days. And so that's more than 400 million times faster.
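As a back-of-the-envelope check of the figure Tristan quotes, going from 32 days to 6.8 milliseconds does indeed work out to more than 400 million times faster:

```python
# 32 days expressed in milliseconds, divided by the 6.8 ms of the optimized version.
original_ms = 32 * 24 * 60 * 60 * 1000   # 2,764,800,000 ms
optimized_ms = 6.8

speedup = original_ms / optimized_ms
print(f"{speedup:,.0f}x faster")  # ≈ 406,588,235x, i.e. "more than 400 million times"
```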
So basically, overall, what it made me realize is that in most cases, when someone writes software,
we don't really care about quality and the need for resources. And sometimes it's OK. Maybe we could rewrite it, we could redo the work, and gain maybe 20% performance. And we don't have to do that. I mean, doing the work twice in order to gain 20%, it's not worth it. But if you can
manage to make it 100 times faster, then it makes sense. And basically, the issue is finding where in your code base you behaved like Matt Parker and where you have written a terrible piece of code.
So we need to spot that and fix it.
Basically, if you manage to double the speed of your software every other year, then you don't need to change your hardware. Because every other year, like with Moore's law, you're liberating half of your computing power and resources. And so you have half of it which is available. And half of it means you can invent
new stuff. And I call it Eroom's law. It's like Moore in reverse, because it's basically the same.
It's the same thing as Moore's law, except that you don't need to change the hardware. So does it make a big difference? It does. Why? Because actually,
the manufacturing of the hardware
is the thing that has the biggest impact by far, because it's not just electricity, which can be green electricity, by the way. No, this is basically diesel caterpillars being used in some foreign country, whether it's South Africa or South America,
and digging mines, pushing earth and soil in huge amounts, literally tons and tons of it. And then taking all of this and putting it on a ship sent to China, which is going to go through, well,
all the oceans for months and stuff like that. And then in China, they make the chips and the hardware and the steel that goes around it. So basically, when you see your smartphone arrive in front of you, before you even switch it on, most of the footprint and most of the damage has been done. And so the biggest thing you can do
as a person is to make your hardware last longer. And with Eroom's law, we can make it last longer, because we will remove the need to change the hardware, because we are optimizing the software.
Gaël Duez (21:38)
And that's absolutely true, because what you've described is super well illustrated in the Digital Collage workshop, with all these visuals of big mines and all of this. You really see that, especially for end-user devices, between 70 and 90% of the ecological footprint happens before you start using the device, as you rightfully said. And
what is very interesting is that, most of the time, when people start to become aware of it, they get past the psychological obsolescence, like the need to get the latest and shiniest iPhone, laptop, earplugs, whatever. They willingly start to keep their hardware longer and longer. But the problem, they say most of the time, is that it comes with a cost: more and more software will not be available or will not run properly. So it's kind of a
double-sided push, I would say, that crushes any well-meaning, environmentally aware IT user: on one side, I've got this big marketing push saying, hey, you need the latest stuff, you're not in, you're not cool, you're not whatever, or you're not even a productive enough worker. And on the other side, actually, you've got this
hardware slash software push. And I think we're focusing a lot on the marketing push,
but with your Eroom's law, we open the Pandora's box of this other push, which is software obsolescence and
the very strong link between software and hardware, and how software actually drives most of the hardware obsolescence. So thanks for this clarification. And I remember a talk by Emmanuel at BraceCamp, actually. And that was very interesting because he was using a
Tristan Nitot (23:35)
Hmm
Gaël Duez (23:37)
very widely used piece of software as an example, which was COVID Tracker, if I remember well, and how all the architectural choices that were made enabled the application to be super small and super efficient to run, despite serving millions of calls per hour at the peak of the pandemic. And this leads to my next question, which is:
if I get the concept right, as you described it, and why it is important, as a software engineer trying to lower the environmental footprint of my code, starting with the fact that my code should be made in such a way that it can run on older and older hardware, how can I leverage this concept? How can I make it concrete?
Tristan Nitot (24:29)
Well, it's something I'm working on at Octo, but I certainly hope that other companies are going to embrace it too, with or without Octo,
and that they will implement a methodology that I'm trying to write with the community, which is basically:
how do you find weaknesses in your software? And by weaknesses, I mean the pieces of code that are terribly under-optimized. How do you spot them when you have millions of lines of code in your information systems? How do you know where they are? Maybe you already know.
Maybe your monthly cloud bill is helping you spot them. Maybe you have a FinOps effort underway that could help spot them. And then you put up a task force with probably senior
Gaël Duez (25:23)
Spot the culprits.
Tristan Nitot (25:39)
developers that will come and audit the code and spot where the issues are. You will be using tools like profilers that will inspect the code and say: this place is basically taking 98% of the time and the CPU power and the memory, it's being used right there.
If you use a profiler, you will see flame graphs. You see the heat, if you will, where basically the power is consumed within the code. And so that's where you want to intervene, send the task force and fix the issue. And so we're working on creating a methodology to spot these places and fix
the issues where they are, in order to basically implement Eroom's law at the scale of information systems.
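As an illustration of the profiling step Tristan describes, here is a minimal sketch using Python's built-in cProfile module; the function and its workload are made up for the example.

```python
import cProfile
import pstats

def build_report(n: int) -> str:
    # Hypothetical hot spot: string concatenation in a loop is quadratic.
    report = ""
    for i in range(n):
        report += f"row {i}\n"
    return report

if __name__ == "__main__":
    profiler = cProfile.Profile()
    profiler.enable()
    build_report(20_000)
    profiler.disable()

    # Sort by cumulative time to see which function dominates,
    # i.e. "this place is basically taking 98% of the time".
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)
```

For the flame graphs mentioned above, a sampling profiler can attach to the running process without modifying the code, for example py-spy with `py-spy record -o profile.svg -- python your_app.py`.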
Gaël Duez (26:37)
And what are the top three takeaways that you've found so far? Is it more in the architecture? Is it more in the way we code? Is it more in the language we choose for different tasks? What are the main takeaways that you've found while working with your colleagues so far?
Tristan Nitot (26:56)
There are basically several buckets. For some reason, people think, yeah, you're using Python, so the language is the problem. And it is true that Python is not super fast compared to C++ or C or Rust. It's true. But basically, it's how you use it. Python is amazing
in having a whole ecosystem of libraries that do a lot of things. So for example, I've heard
of a library in Python called Pandas that can be replaced by another library which is written in Rust. So basically you spend most of your time running Rust without even knowing it, because you're programming it from Python. So if you use Polars, which is written in Rust, it's a lot faster than Pandas, which is written in Python. So these are the kind of things you can change. So it's a matter of language, it's a matter of libraries. It can also be a matter of
storage, where you basically have a database. It's an old issue that we all know: you try it on a small subset of data and it works, and that's fine. But if you end up deploying it, and two years later you have three million rows in your database, the problem is completely different by nature.
And maybe you need an index that was not needed in the first place because the dataset was small. But now you have millions of rows, and without the index, you're wasting a lot of time. Maybe just adding an index is going to fix your problem. Maybe it's going to be more complicated than that. So storage is also a problem. And it's something you find in
data science, where you manipulate very significant volumes of data. A lot of times you import some JSON or CSV and it's slow, but now you have fantastic libraries that enable you to do this kind of thing a lot faster, and things like that. It can also be architecture.
You could have very powerful machines that, in the way they have been set up together, keep waiting for each other all the time. So it's basically very inefficient. And then you need to have a systemic approach to find out what the issue is, and that kind of thing. So it's hard to say, because IT is fantastic in its diversity. But we can spot that there are
some families or buckets of problems that we find.
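A minimal sketch of the library swap Tristan mentions, with a made-up CSV file and column names: the same aggregation written with Pandas and with Polars, where the Polars lazy API lets the Rust engine plan and run the whole query.

```python
import pandas as pd
import polars as pl

# Pandas version: eager evaluation, orchestrated from Python.
def total_per_customer_pandas(path: str) -> pd.DataFrame:
    df = pd.read_csv(path)
    return df.groupby("customer_id", as_index=False)["amount"].sum()

# Polars version: same result, but the heavy lifting runs in Rust,
# and the lazy query is optimized before the file is even read.
def total_per_customer_polars(path: str) -> pl.DataFrame:
    return (
        pl.scan_csv(path)
        .group_by("customer_id")
        .agg(pl.col("amount").sum())
        .collect()
    )

if __name__ == "__main__":
    # "orders.csv" is a hypothetical file with customer_id and amount columns.
    print(total_per_customer_pandas("orders.csv"))
    print(total_per_customer_polars("orders.csv"))
```

The database index Tristan mentions next is the same idea one layer down: a statement like `CREATE INDEX idx_orders_customer ON orders (customer_id);` (table and column names hypothetical) turns a full-table scan into a lookup once the data has grown.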
Gaël Duez (29:42)
Okay, Tristan, now I would love you to indulge me and let me play the devil's advocate a bit, and I have two questions for you. The first one is about AI and what you've described: spotting the inefficiencies in the code, rewriting the code, etc. We've seen a lot of publications recently
explaining that with AI, and to be more precise with machine learning techniques, we will be able to rewrite code in a more efficient way, spot anti-patterns, et cetera, et cetera. So the methodology you've mentioned, the buckets
you've described: isn't it ultimately the job of a very good and efficient AI to rewrite all the code of the world and to make us save millions of tons of CO2, and also money?
Tristan Nitot (30:35)
It's a really good question. First, I need to say that I've chatted recently with a young lady called Justine Bono, who happens to be French, and has a company named WeDoLow, L-O-W, like low energy or low consumption, I don't know. She is the co-founder of a startup
that does software optimization, automatic software optimization, especially in the embedded world. So stuff that you find in cars and things like that. Not really on general-purpose computers, on PCs and stuff. And basically, her approach has nothing to do with AI. And she is already really good at doing stuff, like
automated ways to improve your code by 40, with no AI involved. So yes, there are promises there. Now, I must admit, I have been very impressed for several years by what AI is able to do.
It's amazing, it's amazing. But let's be honest: resource-wise, it's crazy. It's consuming an enormous amount of resources. It's like
thousands or millions of times more complicated than running a normal piece of software. It's amazing, but I think the cost is even more amazing. And it's something we don't see. Like earlier on, you mentioned the Digital Collage, where they talk about the mines and stuff like that.
It's something we don't see and we never talk about, and you don't see mines in Apple's advertising for the new iPhone, right? Because that's not something they want you to see. You want to see a pristine white box with all-new shiny hardware. That's what they want you to see. They want to create desire. And so they are not going to show you kids going down into mines. But it still exists. It's just that they don't show it to you.
And it's exactly the same with AI. It's fantastic what it can do, but it's orders of magnitude more energy-consuming than regular computing. And so it's something we must not forget. I understand the enthusiasm
around AI, because it's amazing what it can do. But there is a price associated with it, and nobody really wants us to know about it. So let's not fall for that.
Gaël Duez (33:12)
So if I follow you here, what you would say is: yes, using AI to optimize code is a potential answer, but let's make sure that the cost, and especially the environmental cost, of using this kind of tool is worth the investment.
Am I understanding you right here? Yeah.
Tristan Nitot (33:31)
Yes, yes. I think, first, I need to see whether AI is going to deliver on its promise of being able to optimize software. Because for now, it's just a promise. And maybe it's not going to be true. We never know. That's one thing. Second thing, we need to make sure that we use AI for things that are worth it,
given the cost and the impact. And third, I'm not sure this is the tendency in how it's used. Right now, I
see AI helping students do their work, and that's a problem if you ask me, because the point is not making them write. It's making them learn. And they learn by writing. But if a machine does the writing for you, you don't learn. It's like bringing a forklift to the gym to lift
Gaël Duez (34:24)
You don't know.
Tristan Nitot (34:28)
the weights instead of you. Okay, well, the weights get lifted by the forklift, but well, you're not building any muscle in the process.
Gaël Duez (34:32)
I really love this one. I think I'm gonna keep it for my students.
Tristan Nitot (34:41)
So right now, I see this is how AI is used. There is this cartoon that I really like, where someone says: you know, I have written this bullet point, and I'm using AI to expand it into a long email I can pretend I wrote. Because I'm lazy, I don't want to write the whole email. And on the receiving end,
the person says: there is this long email I don't have time to read, so I'm going to use AI to sum it up into a bullet point. And that's, well, it's exaggerated, but not so much. I think this is really how AI is used these days. And so it's consuming a lot of resources for basically adding no value, other than I pretend I wrote a long email and I pretend I read a long email. If you could only write just the bullet point and send it like that,
it would be a lot more efficient, there would be less noise around the signal anyway, and it would be more efficient in every way. So we should question whether we use AI properly or not.
Gaël Duez (35:40)
That's a fair answer to my first question. But I'm still wearing my devil's advocate hat. And this will actually be my last question before closing the podcast. How much of this Eroom's law is actually new? Isn't it just good old-fashioned software optimization with a sustainability angle? Or what is the catch here? Am I misunderstanding something?
Tristan Nitot (36:04)
I think it really boils down to Wirth's law. Basically, the more power we have, the lazier we become as an industry. And if you think about it, even 20 or 30 years ago, which is basically midway between now and the invention of microcomputing,
people knew that they had to optimize. They knew precisely how the computer was working. They knew because they had been hitting the wall of hardware, the limits of the hardware: you don't have infinite memory, you don't have infinite storage, you don't have infinite computing power. So
they would be careful, and they would learn what to do in order to optimize when they hit a limit of the computer. Of course, it took time and energy, but it forced them to become smarter and do smarter things. And it was cool. Now, basically, if you don't have enough computing power, you fire up another pod. The cloud is going to take care of you. It will scale infinitely,
except that your wallet does not scale infinitely. It's going to cost you a lot, but you don't realize it. And so you don't learn from this. Researching optimization made me realize one thing: we only optimize under constraint. If there is no constraint, we don't optimize.
And I think we do have two constraints that are here to stay. The first one is Moore's law slowing down. It won't keep up forever. That's one thing. And the second thing is basically climate change and biodiversity collapse. We have the moral obligation to do something. I mean, these two things,
climate change and biodiversity collapse, are the biggest challenges for mankind.
Basically, Eroom's law is saying: there are these two new constraints, and there is a solution. You can keep building new innovative stuff. But in order to do that, you need to optimize, which is something that we basically had given up on, because we didn't feel any constraints with computing.
Gaël Duez (38:35)
Okay, so it's not good old-fashioned optimization, because it comes with a twist about the metrics and the goals that you're actually chasing: it's not only about financial optimization but also about having a positive impact regarding all the crises that you've mentioned, and also about the physical limits that we are about to hit.
Got it. Two constraints that are re-incorporated into optimization to make it much more urgent and a top priority. Thanks a lot, Tristan, for answering all my questions. I've got maybe
two last ones just to close the podcast. You've mentioned tons of references. All of these references and resources will be put in the show notes, as usual and as I mentioned in the introduction. However, is there one final resource or content that you'd like to share with us?
Tristan Nitot (39:30)
Listen to Green IO. There are a lot of things there. Go to Green IO Paris, or to the Green IO conference that is taking place near where you live. It's really exciting. And you're going to meet some
new people. So do yourself a favor: listen to more of this information and meet the community. They're very nice people and they're addressing a challenge that deserves to be addressed.
Gaël Duez (40:00)
Well, thanks a lot, Tristan. I'm going to hire you as a salesman. But I'm also going to mention that people should keep on listening to L'Octet Vert or Fugazia, because these resources are in French, but they're also highly valuable. But thanks a lot for mentioning Green IO in such a positive way. And my last question will be...
Tristan Nitot (40:02)
You
Thank you.
Gaël Duez (40:20)
Actually, you've already started a bit by saying something positive about the momentum and the resources we can find, but is there a positive piece of news that you'd like to share about digital sustainability, or maybe sustainability at large, to close the podcast?
Tristan Nitot (40:36)
I think every organization is understanding more and more that they need to do something. In Europe, there is the CSRD, the Corporate Sustainability Reporting Directive, which is basically forcing organizations to publish every year a report on how they are improving their corporate social responsibility,
which is good, because it's not only individuals that need to do something. Of course, individuals need to decide to change, but organizations will change too. So I think everybody will get on board and it will become obvious that we will
be applying the changes that we need to apply.
Gaël Duez (41:24)
Thanks a lot for sharing these positive thoughts and for mentioning that sometimes regulation does have an interesting, positive impact. And actually, we're going to see each other in a matter of days on the Green IO Paris stage to celebrate the podcast's birthday, and also to have dozens of amazing speakers and panelists talk about
green IT and digital sustainability from different angles, from different perspectives, which is how we actually make the world better and how scientific progress is made. So I hope to see a lot of you there, and we will see you there, Tristan. So thanks a lot for joining again and thanks for joining the show. It's always a pleasure to have you here.
Tristan Nitot (42:01)
Yes, with pleasure.
Thank you.
❤️ Never miss an episode! Hit the subscribe button on the player above and follow us the way you like.
📧 Our Green IO monthly newsletter is also a good way to be notified, as well as getting carefully curated news on digital sustainability packed with exclusive Green IO contents.