From the fuel we put in our cars to the source of our homes’ electricity, we are all familiar with the carbon cost of our daily lives – but when was the last time you considered the emissions generated by your code?
Michael Ewald, regional director of Technology and Engineering for digital transformation consultancy Contino, told Information Age that a coding paradigm shift has reignited discussions about building carbon efficient programs.
“There’s a lot of interest right now in exploring more efficient programming languages, and in why this is important as we enter a new architectural phase,” Ewald said.
“This is the era of microservices, where there’s a plethora of small APIs, each of which requires its own compute functions, which translates to CPU cycles, which means electricity use.
“With climate change and its effects being more prevalent in the news, more people are paying attention and are thinking about conscious choices when it comes to the languages they are coding in and the architectures they choose to build.”
Green energy has also become a point of differentiation between the big cloud providers, with Google, Microsoft and Amazon all using renewable energy as a marketing tool to encourage migration away from on-premises hosting.
Ewald said he has seen a move towards languages like Rust, C++ and Go over the likes of Ruby and Python, purely for energy efficiency.
Green coding
The energy efficiency of programming languages was explored in a 2017 research paper from the Portuguese Green Software Lab.
Researchers ran 27 languages through a corpus of benchmark programs, measuring memory use, speed, and total energy consumption.
What they found was that the fastest languages weren’t always the most energy efficient in each of the given tasks.
For example, when running the fasta benchmark – a program that generates DNA sequences – the Fortran language was second-most energy efficient while being only sixth fastest.
Unsurprisingly, interpreted languages like Python and Ruby tended to perform worse than their compiled counterparts.
Ewald said the ongoing conversation about how to make code more green could change the way people look at existing programs.
“Already we advocate that you should be looking to continually refactor your code after it’s written and deployed,” he told Information Age.
“That means every few years checking to see if it’s still worthy, or if parts are redundant and need to be retired.
“Now I think another dimension to add to this is a consideration about whether you replace it with a different, more carbon-efficient language.”
Communities are popping up around how to make sure the technologies of today mitigate rather than exacerbate the climate crisis caused by the technologies of the past.
Paradigm shift
Last year Microsoft, GitHub and the Linux Foundation along with Accenture and ThoughtWorks launched the Green Software Foundation.
The foundation has created eight principles of green software engineering, including directives to build applications that are carbon and energy efficient, and to be aware of carbon intensity, which describes how the energy powering your software is generated.
Another green coding principle is demand shaping, which is described as akin to the eco mode on a car or washing machine.
Similar functions can be built into code, according to the foundation, such that “when the carbon cost of running your application becomes high, shape the demand to match the supply of carbon” which can either happen automatically or the choice can be given to the user.
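In practice, a demand-shaping feature might poll a carbon-intensity signal and degrade gracefully when the grid is dirty. The sketch below is purely illustrative: the threshold value and the idea of feeding in a grid intensity figure are assumptions for the example, not part of any real API.

```python
# Hypothetical sketch of demand shaping: pick a lower service level
# when grid carbon intensity is high. The threshold and the intensity
# input are illustrative assumptions, not a real foundation API.

HIGH_CARBON_THRESHOLD = 300  # gCO2/kWh, an assumed cut-off


def choose_quality(carbon_intensity: float) -> str:
    """Pick a service level based on current grid carbon intensity."""
    if carbon_intensity >= HIGH_CARBON_THRESHOLD:
        return "eco"  # e.g. lower video resolution, defer background sync
    return "standard"


print(choose_quality(450))  # high-carbon grid -> "eco"
print(choose_quality(120))  # cleaner grid -> "standard"
```

As the foundation notes, the switch to eco mode could happen automatically, as above, or be surfaced to the user as a choice.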
A GitHub repository managed by the Green Software Foundation is filled with useful resources for measuring your software’s carbon footprint as well as the emissions produced in cloud environments.
There are even alternative visions for computing that borrow from the world of permaculture.
Early in the pandemic, programmer and artist Ville-Matias Heikkilä published his thoughts on the idea of ‘permacomputing’ as a path away from the current always-on, energy-intensive mode of computing.
“In permacomputing, intense non-urgent computation (such as long machine learning batches) would take place only when a lot of surplus energy is being produced or there is a need for electricity-to-heat conversion,” he wrote.
“At times of low energy, both hardware and software would prefer to scale down: background processes would freeze, user interfaces would become more rudimentary, clock frequencies would decrease, unneeded processors and memory banks would power off.
“At these times, people would prefer to do something else than interact with computers.”
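Heikkilä’s scheduling idea can be sketched in a few lines: non-urgent jobs wait in a queue and only run when a surplus-energy signal is present. The `surplus_energy` flag here is a stand-in for a real grid or solar signal; the whole example is an assumption-laden illustration, not an implementation of any existing system.

```python
# Illustrative sketch of permacomputing-style scheduling: queued
# non-urgent work (e.g. machine learning batches) runs only when
# surplus energy is available. The energy signal is simplified to
# a boolean parameter for the sake of the example.

from collections import deque


def drain_batch_queue(queue: deque, surplus_energy: bool) -> list:
    """Run queued jobs only while surplus energy is available."""
    done = []
    while queue and surplus_energy:
        job = queue.popleft()
        done.append(job())  # jobs are plain callables in this sketch
    return done


jobs = deque([lambda: "train-model", lambda: "reindex"])
print(drain_batch_queue(jobs, surplus_energy=False))  # [] - work deferred
print(drain_batch_queue(jobs, surplus_energy=True))   # both jobs run
```

A real system would re-check the energy signal between jobs and could also scale clock frequencies or freeze background processes, as Heikkilä describes.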