
Greener tech could help cut carbon emissions for data centres

Melissa Hendry


Co-founder and Managing Director of ddroidd

A recent report from The Shift Project highlighted that carbon emissions from the tech infrastructure and data servers that enable cloud computing are now higher than those of pre-Covid air travel.

It also found tech-related emissions are rising by 6% annually. With growing demand for online services, these figures are only going to increase.

And while the figures look worrying, data centres are taking measures to reduce their environmental impact. These include shifting to renewable or carbon-free energy, with some operators targeting 100% carbon-free power by 2030. There are also improvements in cooling technologies, which cool racks at the source while preventing the cold air from dissipating around the rest of the facility.

Despite these advances, progress is being slowed by emerging technologies such as blockchain and AI, which require enormous amounts of energy to function. Among its various uses, blockchain is the software innovation behind the development of cryptocurrency. In 2019, researchers at the University of Cambridge estimated that the Bitcoin network demanded as much energy as Switzerland.

AI produces similarly alarming figures. Training a single neural network model can emit as much carbon as five cars over their lifetimes, and the amount of compute required to train large AI models has been growing with a 3.4-month doubling time.

Other draining factors

While blockchain and AI are pertinent examples, they are not solely responsible for draining power resources. Every day, standard websites also contribute to these emissions. The simple act of requesting digital information from an application triggers a data processing chain reaction that involves information being repeatedly pulled from external servers. Where this happens at scale, the energy needed is vast.

Put simply, when a user accesses a webpage, the videos, text and images it contains are requested from external servers. Each time the page is visited, most of this information is served afresh and discarded once the user leaves. Constant requests mean more servers are needed to store this unnecessary information, and there is a greater chance of overload as 'new' content must be retrieved every time.

However, there are some simple methods that can be deployed to stop this from happening and reduce power being needlessly used.

Eliminate unnecessary energy consumption

Technology has the uncanny ability to provide solutions to the very problems it creates. Using smarter code, combined with an efficient set-up of hosting architecture, some technology providers can establish optimised applications that give businesses full control over their processing power.

Rather than devices needing to continually retrieve the same data, technologies such as the ddroidd A+++ solution recycle and reuse previously processed information, eliminating the unnecessary reprocessing that demands extensive resources. It's a method that can cut information reprocessing by 90%, reduce a website's energy consumption, improve reliability and responsiveness, and cut costs.
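The reuse principle can be illustrated with a simple time-limited cache that serves repeat requests from memory instead of reprocessing them. This is a generic sketch of the general technique, not ddroidd's actual (proprietary) implementation; the class and parameter names are illustrative:

```python
import time

class PageCache:
    """Minimal illustration of content reuse: identical requests are
    served from memory instead of being reprocessed from scratch."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self.store = {}  # url -> (timestamp, content)
        self.hits = 0
        self.misses = 0

    def fetch(self, url, render):
        """Return cached content if still fresh; otherwise re-render and cache."""
        entry = self.store.get(url)
        now = time.monotonic()
        if entry and now - entry[0] < self.ttl:
            self.hits += 1
            return entry[1]
        self.misses += 1
        content = render(url)  # the expensive reprocessing step
        self.store[url] = (now, content)
        return content

cache = PageCache()
render = lambda url: f"<html>rendered {url}</html>"
for _ in range(10):
    cache.fetch("/home", render)
# Only the first request triggers rendering; the other nine hit the cache.
```

In this toy run, ten identical requests cost one render instead of ten, which is the same 90%-style reduction in reprocessing the article describes, scaled down to a single page.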

Indeed, reducing the energy consumption of software can be 100 times more powerful than reducing the energy consumption of hardware.

An eco-friendly approach to data is achievable

Taking a more sustainable approach to software development not only reduces energy usage and can cut costs for businesses, but it can also improve the performance of the digital platform itself. This approach can be achieved in three simple steps.

  1. Articulate a strategy that guides trade-offs and allows for flexibility

IT teams must first determine the right level of tolerance for their software's environmental effects. Inevitably, there will be trade-offs between business and environmental goals, and software engineers must determine where the go/no-go line is. For example, AI software may require huge amounts of additional energy to increase accuracy from, say, 96% to 98%. Whether that two-point gain in accuracy is worth the added energy consumption is a business decision that requires deliberation.
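One way to frame that decision is as a marginal-cost calculation: how much extra energy does each additional point of accuracy buy? The figures below are hypothetical, chosen purely to illustrate the typical shape of the curve, where the last points of accuracy are by far the most expensive:

```python
# Hypothetical (accuracy %, training energy in kWh) pairs for illustration.
runs = [
    (96.0, 1_200),
    (97.0, 4_800),
    (98.0, 19_000),
]

# Marginal energy cost of each additional accuracy point.
marginal = []
for (a0, e0), (a1, e1) in zip(runs, runs[1:]):
    per_point = (e1 - e0) / (a1 - a0)
    marginal.append(per_point)
    print(f"{a0:.0f}% -> {a1:.0f}%: {per_point:,.0f} kWh per accuracy point")
```

With these assumed numbers, the step from 97% to 98% costs roughly four times as much energy per point as the step from 96% to 97%, which is exactly the kind of trade-off the go/no-go line has to settle.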

For such a strategy to be effective, flexibility must be embedded. Engineers need the space to improvise, learn through trial and error, and suggest suitable metrics to measure progress. With software updates, such metrics would be straightforward to set (e.g. by determining how much more energy a newer version would consume). For new software, measures would be more difficult to define but could include memory-use efficiencies and the volume of data used.

  2. Review and refine the software development life cycle

What is the smallest possible environmental footprint we could make with this application? The answer will guide the first stages of the software development cycle. Expectations may shift as further knowledge is gained, but this starting point is an important benchmark for defining the feasibility of objectives. From here, recommendations can be developed for algorithms, programming languages, APIs, and libraries that can be drawn on to minimise carbon emissions.

When it comes to deployment, monitoring real-time power consumption through techniques such as dynamic code analysis is critical for understanding the gaps between design choices and actual energy profiles. 
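Proper power monitoring relies on hardware counters (for example, Intel's RAPL interface), but a rough software-side proxy is CPU time per call, which can be collected without special hardware. A minimal sketch using only Python's standard library; the decorator and function names are illustrative:

```python
import functools
import time

def cpu_profile(fn):
    """Accumulate CPU time per call as a crude proxy for energy use."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.process_time()
        result = fn(*args, **kwargs)
        wrapper.cpu_seconds += time.process_time() - start
        return result
    wrapper.cpu_seconds = 0.0
    return wrapper

@cpu_profile
def hot_path(n):
    # Stand-in for an expensive request-handling routine.
    return sum(i * i for i in range(n))

hot_path(100_000)
print(f"hot_path consumed {hot_path.cpu_seconds:.4f} CPU-seconds")
```

Tracking a number like this per release gives engineers a concrete way to spot when a design change has quietly widened the gap between the intended and actual energy profile.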

  3. Make the cloud green

Today’s applications are invariably deployed over the cloud, and this has led to an exponential growth in cloud-based services and a rapid expansion of power-intensive data centres. Though renewable energy sources and improved cooling systems are helping to address the problem, implementing green software solutions creates new opportunities to save energy.

For example, eliminating duplicate copies of data or compressing data into smaller chunks saves energy. So too does deploying graphics-processing units to manage workloads at the edge, which creates efficiencies by breaking up large tasks into smaller ones and sharing them across multiple processors. 
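The first two of those techniques, content-hash deduplication and compression, can be sketched in a few lines. The savings shown depend entirely on the sample data, which here is deliberately repetitive:

```python
import hashlib
import zlib

def store_deduplicated(blobs):
    """Store each unique blob exactly once (keyed by content hash), compressed."""
    store = {}   # sha256 hex digest -> compressed bytes
    refs = []    # one lightweight reference per original blob
    for blob in blobs:
        digest = hashlib.sha256(blob).hexdigest()
        if digest not in store:
            store[digest] = zlib.compress(blob)
        refs.append(digest)
    return store, refs

# Five identical pages plus one distinct page.
pages = [b"<html>hello world</html>" * 100] * 5 + [b"<html>other</html>" * 100]
store, refs = store_deduplicated(pages)

raw = sum(len(b) for b in pages)
stored = sum(len(v) for v in store.values())
print(f"{len(pages)} blobs -> {len(store)} unique; {raw} bytes -> {stored} bytes")
```

Six blobs collapse to two stored copies, and compression shrinks those further; every byte not stored or shipped is energy not spent.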

Adopting greener server architectures will also likely prove crucial for lowering energy consumption. Using virtual servers, for example, would help companies scale up their servers on demand while conserving energy in enterprise data centres.

Final thoughts

With ambitious targets to reach a zero-carbon future, data centres can play their part by encouraging tenants to adopt energy-efficient servers and hardware, combining these with effective cooling systems and data centre layout schemas, and by educating tenants on e-waste and data optimisation methods. Through the collective efforts of data centres and their tenants in adopting new, efficient methods, real change can be seen and measured.
