Nabeel Mahmood believes that chemistry, controls, and utility-grade planning now matter more than incremental server upgrades in deciding who will win the AI race.
The AI revolution is here, it’s expanding rapidly, and it’s pushing our infrastructure to its limits. As organisations race to harness the potential of generative AI, large language models, and machine learning at scale, one foundational truth becomes clear: AI isn’t just changing what we compute; it’s transforming how we power, cool, and sustain the environments that house these workloads.
At a recent event, I had the opportunity to join a forward-looking panel discussion, AI’s Hidden Challenge: Powering High-Density Environments, alongside Brandon Smith (ZincFive), Shawn Dyer (Vantage Data Centers) and our host Stephen Worn (DCD). What emerged was a picture of an industry at an inflection point. The questions we wrestled with weren’t just about power, cooling or cables; they were about rethinking the very DNA of data centre design.
This isn’t just a technology story. It’s about chemistry, electrical engineering, grid economics, and systems-level thinking. It’s about collaboration across disciplines and breaking down silos. And it’s about preparing for a world where power density isn’t the exception; it’s the baseline.
Why AI demands a new infrastructure mindset
Let’s start with a simple truth: AI workloads scale differently from traditional IT. Where once we worried about server density in kilowatts per rack, we’re now talking about megawatts. That kind of thermal and electrical load exposes the inadequacies of legacy architectures built for virtualisation, not for vector processing or massively parallel training.
As Stephen Worn put it, “AI isn’t just another workload; it’s a demanding tenant.” It’s a tenant with unpredictable consumption, heat spikes, and sub-millisecond tolerance for power fluctuation. And it’s not just moving in – it’s taking over.
This calls for a redesign that begins not at the rack or the room, but at the grid level. From power distribution and UPS systems to backup strategies and thermal design, everything must be reevaluated for a future defined by extreme density, real-time responsiveness, and sustainable scalability.
Powering the future: the core challenges
So, what’s really at stake when we talk about high-density AI infrastructure?
Power distribution under pressure
Delivering 500kW+ to a single rack requires much more than bigger cables. It demands rethinking transformer capacity, feeder design, electrical room layout, and current management. Conventional electrical infrastructure starts to break down at these levels. Efficiency, safety, and space utilisation all come under pressure.
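To make the scale concrete, here is a back-of-envelope calculation in Python. The 500kW figure comes from the paragraph above; the 415V three-phase feed and 0.95 power factor are illustrative assumptions, not panel-quoted values.

```python
import math

# Back-of-envelope: feeder current for a 500 kW rack.
P_W = 500_000  # rack load in watts (from the scenario above)
V_LL = 415     # assumed line-to-line voltage (typical three-phase feed)
PF = 0.95      # assumed power factor

# Three-phase current: I = P / (sqrt(3) * V_LL * PF)
I_A = P_W / (math.sqrt(3) * V_LL * PF)
print(f"Line current: {I_A:,.0f} A")  # roughly 730 A for a single rack
```

At roughly 730A per rack, busways, breakers, and electrical-room floor space stop being afterthoughts and start driving the building layout.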
Backup systems that scale smarter
Traditional UPS systems don’t scale efficiently for this kind of density. Bulky battery cabinets eat up space, add complexity, and can become bottlenecks. The new paradigm? Backup at the rack level, with chemistry and control systems designed for microsecond response times.
Cooling: A new thermal reality
Heat is the silent killer in high-performance compute. AI environments don’t just run hot; they swing fast. Avoiding thermal throttling or component damage requires cooling systems that react dynamically, evacuate heat aggressively, and adapt to workload shifts in real time.
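As a rough illustration of what "reacting dynamically" means, here is a minimal proportional control loop, assuming a liquid-cooled rack with a readable outlet temperature and a variable-speed pump. All names (read_outlet_temp, set_pump_speed) are hypothetical stand-ins, not a real vendor API.

```python
SETPOINT_C = 45.0  # target coolant outlet temperature (assumed)
KP = 0.08          # proportional gain: pump-range fraction per degree C

def cooling_step(read_outlet_temp, set_pump_speed, current_speed):
    """One control iteration: push pump speed up as temperature overshoots."""
    error = read_outlet_temp() - SETPOINT_C
    # React to the swing itself, not to slow room-level averages.
    new_speed = min(1.0, max(0.2, current_speed + KP * error))
    set_pump_speed(new_speed)
    return new_speed
```

A production system would layer integral and predictive terms, workload signals, and safety interlocks on top, but the principle stands: the loop must run at the speed of the workload, not of the building.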
Resilience that thinks in microseconds
Downtime in AI is more than an outage; it’s a lost training cycle, corrupted model, or missed opportunity. Resilience in this context isn’t just about redundancy; it’s about reaction time. We need systems that operate on the same timelines as the workloads they protect.
Chemistry meets infrastructure
One of the most compelling parts of our discussion came from Brandon Smith. He showcased how nickel-zinc (NiZn) battery technology could offer a promising alternative to both lead-acid and lithium-ion. Why?
- High Power Density – NiZn delivers fast, high-output power bursts – perfect for bridging power disruptions in milliseconds.
- Thermal Stability – Unlike lithium, NiZn carries minimal thermal runaway risk, making it safer for dense deployments.
- Environmental Responsibility – NiZn uses abundant, recyclable materials with a lower toxicity footprint.
As Smith noted, “NiZn cells endure extreme temperatures and charge/discharge cycles, exactly what high-density AI racks need.”
And it’s not just about chemistry. Module design matters too: compact, thermally efficient units deployed at rack level allow for localised resilience without sprawling infrastructure.
This kind of innovation reframes the resilience conversation. It’s no longer about how long a battery can run; it’s about how fast it can respond and how efficiently it can be deployed in a space-constrained, high-performance environment.
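A quick worked example shows why. Assuming a 500kW rack and a 30-second bridge until generators take over (illustrative figures, not numbers from the panel), the energy involved is tiny; the power is not:

```python
P_KW = 500     # rack load (assumed)
BRIDGE_S = 30  # seconds of ride-through before generators pick up

energy_kwh = P_KW * BRIDGE_S / 3600
print(f"Bridge energy: {energy_kwh:.1f} kWh")  # about 4.2 kWh
```

Storing roughly 4kWh is trivial; delivering 500kW of it within milliseconds is the hard part, which is exactly where a high-power chemistry in a compact rack-level module earns its place.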
Grid to rack: The utility-level implications
Shawn Dyer of Vantage brought valuable perspective on the macro side of the equation: how compute demands reshape data centre site planning and utility relationships. He made it clear: AI at scale changes everything.
- CapEx Complexity: High-density AI nodes often require substantial utility upgrades. Think transformer replacements, substation enhancements, and new feeder installations.
- Real Estate Reimagined: As rack density rises, electrical infrastructure – not server count – becomes the spatial constraint.
- Grid Partnership: Building next to a substation or renewable source used to be a nice-to-have. Now it’s a survival strategy.
When your power demand spikes can tip the scales of a local grid, utility coordination becomes a strategic imperative, not a line item.
Systems thinking: The new competitive advantage
One theme kept surfacing during the panel: point solutions aren’t enough. To meet the demands of AI, we need systems thinking across layers, disciplines, and time horizons.
- Chip-to-Kilowatt Mapping: Engineers need to trace compute load from silicon behaviour to facility power draw and thermal dissipation.
- Real-Time Controls: Load shedding, battery dispatch, and cooling adjustments must all be automated and adaptive.
- Integrated Monitoring: Data centres must act as closed-loop systems, with visibility down to the rack and, ideally, to the chip (a minimal sketch follows this list).
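Here is a minimal sketch of how those three layers could meet in one control loop. Every interface in it (Telemetry, dispatch_battery, boost_cooling, shed_load) is a hypothetical stand-in used for illustration, not a real product API:

```python
from dataclasses import dataclass

@dataclass
class Telemetry:
    rack_id: str
    power_kw: float      # instantaneous draw
    inlet_temp_c: float  # coolant or air inlet temperature

POWER_CAP_KW = 500.0  # assumed rack power budget
TEMP_LIMIT_C = 50.0   # assumed thermal ceiling

def control_tick(t: Telemetry, dispatch_battery, boost_cooling, shed_load):
    """One pass of the closed loop: observe rack telemetry, then act."""
    if t.power_kw > POWER_CAP_KW:
        # Cover the transient from local storage before touching workloads.
        dispatch_battery(t.rack_id, t.power_kw - POWER_CAP_KW)
    if t.inlet_temp_c > TEMP_LIMIT_C:
        boost_cooling(t.rack_id)
    if t.power_kw > POWER_CAP_KW * 1.2:
        # Last resort: shed load rather than risk the hardware.
        shed_load(t.rack_id)
```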
In a sense, the infrastructure must become intelligent, just like the workloads it supports. Data centres are evolving into living ecosystems, where compute behaviour and physical response are tightly intertwined.
The sustainability imperative
We’d be remiss not to talk about the environmental angle. AI is energy-hungry by nature. But it doesn’t have to be wasteful. Thoughtful integration of chemistry, controls, and modularity can dramatically improve sustainability:
- Eco-Friendly Battery Chemistries: NiZn offers a safer, greener profile than traditional lithium systems.
- Smaller Footprints: Rack-level power and cooling reduce the need for sprawling mechanical rooms.
- Grid Harmony: AI clusters can actually support the grid, acting as virtual power plants during demand spikes.
From my perspective, the advantages of nickel-zinc batteries go beyond technical specs: they represent a smarter approach to infrastructure. NiZn batteries are recyclable and don’t require active fire suppression systems. That translates to less supporting infrastructure, reduced emissions, and fewer downstream complexities when scaling high-density environments. It’s a more elegant and sustainable path forward.
I also emphasised during the discussion that renewable integration can no longer be viewed as just a sustainability checkbox; it’s a core part of economic planning for AI data centres. With the right architecture, AI clusters can shift loads intelligently or even push energy back to the grid during peak demand. That’s not just good for the environment; it’s a strategic advantage in today’s volatile energy markets.
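To make that concrete, here is a hedged sketch of price-aware load shifting. It assumes the scheduler sees a grid price signal and can defer checkpointed training work; Job and the callback names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    deferrable: bool  # e.g. checkpointed training yes, live inference no

PEAK_PRICE = 0.30     # assumed $/kWh threshold for "peak demand"

def schedule_tick(current_price, jobs, start_job, defer_job):
    """Run flexible jobs when energy is cheap; hold them during peaks."""
    for job in jobs:
        if job.deferrable and current_price > PEAK_PRICE:
            defer_job(job)   # shift load away from the demand spike
        else:
            start_job(job)   # inflexible or cheap-power work proceeds
```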
From vision to deployment: Bridging the gap
Of course, vision is one thing. Deployment is another. Making this future real means confronting a series of practical hurdles:
- Pilot Programs: We need more proof-of-concept deployments to test new chemistries and topologies under real-world AI loads.
- Standardisation: Industry-wide standards for rack-level power modules and density zones will help accelerate adoption.
- Regulatory Engagement: Partnerships with utilities and regulators are key to streamlining upgrades and incentives.
- Education & Evangelism: End users, developers, and even investors need to understand why AI resilience goes beyond SLAs.
Without this kind of groundwork, the risk is that innovative solutions remain trapped in the lab or only accessible to hyperscalers with massive budgets.
Envisioning 2030: The AI-ready data centre
So what does this all point to? Here’s a realistic, aspirational view of what AI-ready infrastructure could look like by the end of the decade:
- Hybrid Power Architectures: Combining traditional grid feeds, on-site renewables, and modular battery systems.
- Resilience by Design: Low-toxicity chemistries, automated failover, and microsecond response baked into every rack.
- AI-Managed AI Infrastructure: Neural networks monitoring and adjusting the environments they run in.
- Standardised Density Classes: Defined rack zones (25kW, 50kW, 100kW+) with standardised interfaces (a sketch follows this list).
- Ecosystem-Centric Ops: Facility teams evolving into systems managers – balancing electrons, BTUs, and AI cycles simultaneously.
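As a sketch of what standardised density classes might look like in practice, here is a hypothetical encoding. The kW tiers come from the list above; the class names and interface fields are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DensityClass:
    name: str
    max_kw: int
    cooling: str  # expected cooling interface for the zone
    backup: str   # expected backup topology for the zone

DENSITY_CLASSES = [
    DensityClass("D25", 25, "air / rear-door", "room-level UPS"),
    DensityClass("D50", 50, "rear-door liquid", "row-level battery"),
    DensityClass("D100", 100, "direct-to-chip liquid", "rack-level battery"),
]

def classify(rack_kw: float) -> DensityClass:
    """Map a rack's design load onto the smallest class that fits."""
    for dc in DENSITY_CLASSES:
        if rack_kw <= dc.max_kw:
            return dc
    raise ValueError(f"{rack_kw} kW exceeds the defined classes (100kW+)")
```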
This isn’t fantasy. The tools are here. What’s needed is alignment, collaboration, and the will to architect beyond the legacy playbook.
AI is rewriting the rules of nearly every industry, from medicine and media to manufacturing and logistics. But behind every breakthrough lies infrastructure. Behind every model sits a rack. Behind every rack, a grid. Behind the grid, chemistry and control.
We can’t let the conversation about AI be dominated by algorithms alone. The physical layer matters. Those who understand and invest in it will be the ones who shape the future, not just of information technology but of society.
As we said during the panel, “The one who controls the electrons controls the intelligence.”
The message is clear. AI transformation isn’t about plugging GPUs into old frameworks. It’s about rethinking the entire system, from electrons to enterprise. We’ve only scratched the surface. But the direction is set. The infrastructure of the future isn’t just bigger or faster; it’s smarter, safer, and more sustainable.

