If you’re wondering how long you have to prepare for the AI age, well, let’s just say the boat is fully loaded and ready to go – you just need to hop on. If you don’t believe me, then you only need to look at the rapid adoption of ChatGPT.
When ChatGPT launched in late November 2022, it managed to attract one million users in just five days. Now, with the launch of its latest AI image generation model, it has added an additional one million users in just one hour.
Additionally, the service is quickly becoming the ‘Google killer’ that Bing could only have dreamt of when it launched back in 2009. In fact, there are a myriad of articles explaining how people have already ditched Google Search in favour of ChatGPT’s various AI models.
The only problem with the rapid growth in AI is that it’s seemingly caught some in the data centre industry off guard. In fact, even AI companies like OpenAI have struggled to contend with ChatGPT’s rapid growth, with its most recent surge in users causing issues for those wishing to sign up. The popularity of its latest AI image generation model also led the company’s CEO to complain that its GPUs were ‘melting’ due to the overwhelming demand.
What does the data centre industry need to do to enable AI’s growth?
Melting GPUs and power constraints are all issues associated with the rise of AI in the data centre space, and while these issues may be getting more extreme, they’re building on challenges the industry already knows well.
Firstly, data centres have always consumed a lot of power, and growing rack densities are simply a more extreme version of that existing challenge. Thankfully, data centres have already been looking at ways to use power more efficiently, and are increasingly turning to renewables and energy storage to produce more of their own energy on-site.
The second major issue – cooling – is requiring the industry to change its behaviours. For years, air cooling has been the go-to for data centres, as it was easy to install and offered more than enough cooling capacity for legacy loads. Unfortunately, AI has essentially made traditional air cooling obsolete, and that’s necessitating the move to more novel cooling technologies – with more and more data centres looking at liquid cooling for their latest AI-ready data centres.
Of course, these are just the tip of the iceberg when it comes to dealing with the transition to AI. There are also regulations to contend with, networking concerns and geographic considerations to take into account – with low-latency connections ensuring that chatbots and voice AI can actually be reliably used by businesses and consumers alike. No one wants to wait an age for an answer from ChatGPT or Google’s Gemini.
How to learn more about solving the data centre industry’s AI challenges
With AI establishing itself as the great reformer of the modern data centre industry, it’s easy to get lost in the noise. Thankfully, we’ve got you covered at Data Centre Review.
Firstly, you can read our latest trend report, in association with nVent, which covers the aforementioned issues and outlines some potential solutions for dealing with them in the AI era. You can sign up to that report here.
If you want to go even deeper into cooling, then Data Centre Review recently spoke with nVent during an hour-long webinar where we detailed the constraints of traditional air cooling and the need to upgrade to liquid cooling in order to contend with the needs of the AI age. You can watch that webinar here.
And finally, for those wanting to take a deep dive into all the issues facing the data centre sector in the age of AI, our upcoming Critical Insight event, Data Centres in the AI Era, should have all the answers. You can sign up to that event here.