University of Birmingham to deploy largest IBM POWER9 AI cluster in the UK

Researchers at the University of Birmingham are set to benefit from the largest IBM POWER9 Artificial Intelligence (AI) cluster in the UK, capable of delivering optimised performance for the AI workloads integral to their research.

Working with OCF, the high-performance compute, storage and data analytics integrator, the University will integrate a total of 11 IBM POWER9-based IBM Power Systems servers into its existing high-performance computing (HPC) infrastructure, the Birmingham Environment for Academic Research (BEAR).

Birmingham initially deployed two IBM Power Systems AC922 servers, powered by POWER9 CPUs with the industry’s only CPU-to-GPU NVIDIA NVLink interconnect, in September 2018. However, the Advanced Research Computing (ARC) team soon realised that it needed more computational power tailored to the ever-increasing AI workloads generated by the University’s researchers, who are delivering ground-breaking computer vision analysis and solving life sciences challenges, such as improving cancer diagnosis.

The University will now add a further nine IBM Power Systems AC922 warm water-cooled nodes, each equipped with four NVIDIA Tesla V100 16GB Tensor Core GPUs, 1TB of system memory, dual 18-core POWER9 CPUs and Mellanox 100Gb EDR InfiniBand.

The solution runs IBM PowerAI Enterprise software, unlocking the accelerated-computing potential of the largest IBM POWER9 cluster in the UK. IBM will also support use of the new systems by providing comprehensive training and support to Birmingham’s researchers in partnership with ARC.

“It’s very important to us as a research-led institution that we are at the forefront of data research which means we are always looking at ways to make AI quicker and more accessible for our researchers,” said Simon Thompson, research computing infrastructure architect at the University of Birmingham.

“With the sheer amount of data, the common questions from researchers are how can we analyse it fast enough and how can we make the process even quicker? With our early deployment of the two IBM POWER9 servers we have seen what is possible. By scaling up, we can keep pace with the escalating demand and offer the computational capacity and capability to attract leading researchers to the University.”

This significant enhancement to BEAR will mean an even more powerful and versatile computing environment to serve researchers. For example, fellows from The Alan Turing Institute, who are investigating early diagnosis of and new therapies for heart disease and cancer, will use AI to run faster diagnostics in the future.

Meanwhile, researchers in the physical sciences are using machine learning and data science approaches to quantify the 4D (3D plus time) microstructures of advanced materials collected at national large-scale synchrotron facilities such as the Diamond Light Source. These researchers expect to use the large model support provided by IBM PowerAI software to analyse the terabytes of data generated daily, currently an almost impossible task.

"We are thrilled that the University of Birmingham has decided to invest in building the UK's largest POWER9 AI cluster”, said Simon Robertson, director, IBM Servers, UK & Ireland. “We are proud to see the practical application of IBM technology used by researchers across the University and beyond."

Energy efficiency is also key, and the University has shown its commitment by investing in the reduction of energy consumption. The University boasts the UK’s first purpose-built, water-cooled, research-focused data centre, meaning that 85% of the heat is recovered directly through the water-cooled systems, delivering impressive energy savings by minimising the cooling overheads.

The top-end range of IBM POWER9-based AC922 servers includes warm water-cooled nodes, where water is passed directly across the CPUs and GPUs at temperatures up to 35°C. A unique installation in the UK, the data centre doesn’t use any air-cooling systems and accommodates the IBM systems running alongside ‘direct to node’ water-cooled technology from other vendors.