Lenovo helps University of Birmingham cut data centre energy costs


Lenovo is working with the University of Birmingham on a Water Cooling Technology (WCT) project set to increase the compute power in the university's data centre, reduce its hardware footprint and cut cooling costs.

The University of Birmingham was looking for a successor to its iDataPlex deployment, but the density of modern solutions, combined with the limited air cooling capacity per rack, meant it could install only two chassis per rack.

The university turned to Lenovo for a WCT solution that allows six chassis per rack, saving valuable data centre floor space and reducing the energy required to cool the systems. This improves the University of Birmingham's efficiency, from both an IT and a facilities perspective, by 20-25%.

The university project, the first of its kind in the UK, is projected to reduce cooling energy by up to 83% compared with air cooling alone, while adding only 4.5kW of heat per rack to the data centre.

The water cooling system replaces the typical system fans with an internal and an external manifold. Water is delivered directly into the rear of the server to cool it, via a simple blind dock with a quick-release mechanism for removing or installing servers.

Internally, water is pumped through heat sinks attached to the CPUs, Dual In-Line Memory Modules, onboard components and I/O; heat transfers into the water, which is then pumped away.

Water typically enters the system at up to 45°C; heat transfer from the system components then raises the water temperature by approximately 10°C, cooling the system in the process.
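As a rough illustration of the scale involved, the coolant flow implied by these figures can be estimated from the standard heat-transfer relation Q = ṁ·c·ΔT. This is a back-of-the-envelope sketch, not a Lenovo specification: the per-rack heat load is taken from the 85%-of-30kW recovery figure quoted later in this article, and the specific heat of water is a standard physical constant.

```python
# Back-of-the-envelope estimate of coolant flow per rack, using the
# article's figures. The heat load and water properties are assumptions
# for illustration, not published specifications.

WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K), standard value for liquid water

def required_flow_rate(heat_load_w: float, delta_t_k: float) -> float:
    """Coolant mass flow (kg/s) needed to carry away heat_load_w watts
    with a temperature rise of delta_t_k kelvin, from Q = m_dot * c * dT."""
    return heat_load_w / (WATER_SPECIFIC_HEAT * delta_t_k)

recovered_heat_w = 30_000 * 0.85  # ~25.5 kW absorbed by the water loop
flow = required_flow_rate(recovered_heat_w, delta_t_k=10.0)

print(f"Implied flow: ~{flow:.2f} kg/s (~{flow * 60:.0f} L/min) per rack")
# -> Implied flow: ~0.61 kg/s (~37 L/min) per rack
```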

Lenovo also partnered with Mellanox and OCF to deploy the project, which took nine months to develop, spanning initial demonstration, system design and the testing needed to validate all hardware for use with warm Water Cooling Technology. The system will eventually be connected to the University's central ‘BlueBear’ HPC service, and the technology will be used to power the University's private research cloud deployment.

This project will allow the university to recover 85% of the heat from a single 30kW rack, leaving just 4.5kW of unrecovered heat in its data centre.
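These figures are internally consistent; a minimal sketch of the arithmetic, assuming the 30kW rack load and 85% recovery rate quoted above:

```python
# Quick check of the article's per-rack figures: recovering 85% of a
# 30kW rack leaves 15%, i.e. the 4.5kW quoted as unrecovered heat.
rack_load_kw = 30.0
recovery_fraction = 0.85

recovered_kw = rack_load_kw * recovery_fraction  # 25.5 kW into the water loop
unrecovered_kw = rack_load_kw - recovered_kw     # 4.5 kW still cooled by air

print(f"Recovered: {recovered_kw:.1f} kW, unrecovered: {unrecovered_kw:.1f} kW")
# -> Recovered: 25.5 kW, unrecovered: 4.5 kW
```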

Next, the university is looking to add a rear-door heat exchanger to the system to capture the remaining heat, meaning almost no air cooling will be required for its HPC research equipment.
