Playing our part

Lex Coors, chief data centre technology and engineering officer at Interxion, outlines why sustainability is ‘critical’ to the future of data centre business.

Sustainability is a challenge every industry is grappling with at the moment. It has become not only a business-critical issue but also one that is capturing consumer attention and influencing purchasing behaviour.

Such is the weight of the issue that we have seen major global protests in recent weeks urging individuals, organisations, and governments to do more.

The data centre industry is no different. We have our role to play in ensuring a sustainable future. In fact, some would argue that, given the amount of energy the world’s data centres consume, we have a significant responsibility to address sustainability challenges. I am one of those individuals. It’s not a straightforward task, I admit, but there is a lot more we can be doing.

As the rate of digitalisation continues to increase, so too does the energy needed to power our ever more connected world. Current research estimates global data centre electricity demand at 420 TWh, or around 3% of global final demand for electricity. And while newer infrastructure technology and more efficient data centre configurations are currently able to offset the increase in energy demand, this is not something that can hold indefinitely.

Customer demands

Energy is a fundamental component of the services our business delivers to customers. We’re already using 100% renewable energy to power our data centres, because it’s something customers demand and expect us to deliver on. For many customers, if a data centre isn’t powered by 100% renewable energy, they don’t want to hear anything else about it – they’re not buying.

We are fortunate that, as an industry, we have resources and people to talk to about these issues, learn from, and share best practice with. The Technical Committee and the Advisory Council of The Green Grid, the leading energy efficiency and sustainability association for the data centre industry, as well as the EC Joint Research Centre on Sustainability, are invaluable in that regard. They’re not just talking shops, but collections of like-minded people coming together to tackle a common problem.

But this still isn’t enough. New challenges continue to present themselves as the industry deploys more complex, data-intensive workloads to solve more problems and answer bigger questions. We don’t yet know how much this will impact overall energy use.

Cool running

One of the key reasons for such high levels of energy use in data centres is cooling. Moving data between compute and memory is an energy-intensive and inefficient process, and much of that energy is lost from the system as heat. Because data centres contain hundreds, if not thousands, of server racks, all moving huge amounts of data around, this heat is a major issue.

As we all know, most data centres are therefore carefully designed to regulate temperature, often sited in naturally cool locations – colder climates, or even underground – and equipped with sophisticated systems to ensure temperature doesn’t deviate from an optimum window.

There are several ways to approach the issue of cooling. New and innovative solutions such as liquid-to-chip cooling are becoming increasingly common. While liquid and electronics don’t usually make the best bedfellows, data centre engineers are now beginning to realise the benefits of liquid cooling systems. It is little wonder, considering the efficiency of liquid heat exchange relative to air.
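
To put cooling and other overheads in concrete terms, here is a minimal sketch (illustrative figures only, not Interxion data) of Power Usage Effectiveness (PUE), the efficiency metric popularised by The Green Grid: the ratio of total facility energy to the energy that actually reaches the IT equipment.

# Minimal PUE sketch. PUE = total facility energy / IT equipment energy,
# so a lower value means less overhead (cooling, power distribution,
# lighting) per unit of useful IT work. All figures below are
# illustrative assumptions.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return the Power Usage Effectiveness ratio."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual energy use (kWh) for two facilities.
legacy_site = pue(total_facility_kwh=18_000_000, it_equipment_kwh=9_000_000)
modern_site = pue(total_facility_kwh=11_000_000, it_equipment_kwh=9_000_000)

print(f"Legacy air-cooled site PUE: {legacy_site:.2f}")         # 2.00
print(f"Efficient, liquid-cooled site PUE: {modern_site:.2f}")  # 1.22

The gap between those two ratios is, in essence, the cooling and power-distribution overhead that the techniques described above are trying to squeeze out.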

But improved cooling, and the move away from small, inefficient data centres towards much larger and more efficient cloud and hyperscale facilities, will only get us so far.

Architecting change

There is currently some fascinating work being done on new computing architectures that may reduce energy consumption further still. As the race to exascale computing continues at pace, high-performance computing specialists are developing systems that can perform one quintillion (10¹⁸) calculations per second.

The power required to do this with conventional architectures is equivalent to that of a small power station, so engineers and technologists are rethinking 70-year-old concepts of computing and turning architectures on their head – placing memory, not processors, at the centre – to reduce the need to move data between components, and therefore the energy needed to power these systems. While these architectures are only deployed for niche use cases today, they could become much more pervasive in the future.
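
To see why putting memory at the centre saves energy, here is a rough back-of-envelope sketch; the per-operation energy figures are order-of-magnitude assumptions drawn from commonly cited hardware studies, not numbers from this article.

# Back-of-envelope sketch: why data movement, not arithmetic, dominates energy.
# The per-operation figures are rough, order-of-magnitude assumptions; real
# numbers vary widely by silicon process and design, so treat this as
# illustrative only.

PJ = 1e-12  # one picojoule, in joules

ENERGY_FP_OP_PJ = 4.0        # assumed energy for one 32-bit floating-point op
ENERGY_DRAM_READ_PJ = 640.0  # assumed energy for one 32-bit off-chip DRAM read

ops = 1e15         # a workload performing 10^15 arithmetic operations...
dram_reads = 1e14  # ...with one off-chip memory read per ten operations

compute_joules = ops * ENERGY_FP_OP_PJ * PJ
movement_joules = dram_reads * ENERGY_DRAM_READ_PJ * PJ

print(f"Arithmetic energy:    {compute_joules:,.0f} J")   # ~4,000 J
print(f"Data-movement energy: {movement_joules:,.0f} J")  # ~64,000 J

Even with ten times fewer memory accesses than arithmetic operations, moving the data costs far more energy than computing on it, which is exactly the overhead memory-centric architectures aim to cut.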

All of these potential solutions have one thing in common: they were conceived within our own industry. But data centres are used by so many businesses, sectors, and industries, and those users should also play a part in addressing this issue. Large corporates are already investing heavily in renewables, energy efficiency, and sustainability initiatives, but this is still barely moving the needle.

Finding answers

As the industry continues to grow, evolve, and use ever more power, I believe our institutions need to play their part and contribute to a wider collaborative effort to solve this shared challenge. There are numerous ways this could happen. One option that I believe should certainly be explored is an EU-funded programme of academic research into long-term, energy-efficient storage technology. This doesn’t necessarily have to be battery technology, but it should aim to help tackle the availability of renewable power supply, the oversubscription of which could lead to significant cost increases as demand continues to rise.

Only a forward-looking programme of this size, scale, and innovation is likely to deliver the meaningful results we all wish to see.
