White Paper

Rear Door Cooling vs. Air Cooling in Data Centres

Executive Summary

Artificial intelligence (AI), hyperscale cloud platforms and high-performance computing (HPC) are transforming the global data centre industry. Workloads that once operated comfortably at 5–10 kW per rack are now demanding 40–100 kW or more. Traditional air cooling is no longer capable of meeting these requirements without excessive energy use, space consumption and cost.

Rear Door Heat Exchangers (RDHx) offer a proven, scalable and sustainable solution. By cooling servers directly at the rack, RDHx enables high-density deployments today while bridging the industry towards advanced technologies such as immersion and direct-to-chip cooling.

At Arlofin, we see RDHx as a critical part of our design philosophy. Our data centres are built to handle 10–120 kW per rack, leveraging RDHx alongside liquid-to-liquid cooling and modular engineering to meet client demand in London, the UK and beyond.

Background: The Limits of Air Cooling

For decades, air-based cooling systems — typically Computer Room Air Conditioners (CRACs) or Computer Room Air Handlers (CRAHs) with raised floors — have been the default for enterprise and early cloud facilities. While effective up to around 15–20 kW per rack, air cooling quickly becomes inefficient and costly beyond that threshold.

To support higher loads, operators must increase airflow dramatically, which requires oversized fans, complex containment and additional mechanical infrastructure. This raises operational costs, consumes valuable floorspace and often fails to keep up with the heat output of modern GPU clusters.

In dense metropolitan markets such as London, where land is scarce and expensive, these inefficiencies make air cooling economically unviable for the next generation of compute.

Rear Door Cooling Explained

Rear Door Heat Exchangers solve this problem by addressing heat at its source. Instead of relying on room-scale airflow, an RDHx attaches directly to the back of the rack and integrates a chilled-water coil that absorbs server exhaust heat. Air leaves the door at close to room temperature and is returned to the white space, dramatically reducing the strain on facility-level cooling systems.

  • Passive RDHx can support around 30–40 kW per rack using only server fans.

  • Active RDHx adds dedicated fans, scaling capacity to 50–100 kW or more.
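
To put these capacities in context, the short sketch below estimates the chilled-water flow a rear door coil would need to absorb a given rack load, using a steady-state energy balance (heat removed = mass flow x specific heat x temperature rise). The 6 K water-side temperature rise and the example loads are assumptions for illustration only, not Arlofin design figures.

  # Illustrative sizing sketch (not an Arlofin design calculation): chilled-water
  # flow needed to absorb a given rack load, from the steady-state energy
  # balance Q = m_dot * c_p * dT. The 6 K coil water-side rise is an assumption.
  CP_WATER = 4186.0    # specific heat of water, J/(kg*K)
  RHO_WATER = 998.0    # density of water, kg/m^3

  def chilled_water_flow_lpm(rack_load_kw: float, delta_t_k: float = 6.0) -> float:
      """Litres per minute of chilled water needed to carry rack_load_kw."""
      mass_flow_kg_s = (rack_load_kw * 1000.0) / (CP_WATER * delta_t_k)
      return mass_flow_kg_s / RHO_WATER * 1000.0 * 60.0

  for load_kw in (30, 50, 100):
      print(f"{load_kw} kW rack -> ~{chilled_water_flow_lpm(load_kw):.0f} L/min at a 6 K rise")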

The beauty of RDHx lies in its flexibility. It can be installed rack by rack, enabling operators to scale cooling infrastructure as demand grows. It also works seamlessly within both retrofits of legacy London facilities and purpose-built new data centres.

At Arlofin, our sites are engineered with RDHx as a core design feature, ensuring clients can run dense AI and HPC workloads without compromise.

Air Cooling in Detail

Air cooling remains the most widespread approach today. CRAC and CRAH systems push chilled air into the room, where it is channelled through racks with the help of containment strategies. While relatively low in capital cost, this approach has a practical ceiling of roughly 15–20 kW per rack.

Pushing air systems beyond this limit introduces rapidly diminishing returns. Energy costs soar because fan power rises steeply with airflow. Ducting, raised floors and aisle containment consume space that could otherwise be monetised. Most importantly, air’s physical limitations mean hotspots and inefficiencies persist, putting uptime at risk for mission-critical workloads.
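
To see why this ceiling exists, the sketch below estimates the airflow a rack must ingest to carry a given load, using the sensible-heat relation (heat removed = air density x airflow x specific heat x temperature rise); the 12 K server temperature rise is an assumed typical value rather than a measured one. At tens of kilowatts per rack the required airflow reaches several cubic metres per second, which is why fan energy and containment effort escalate so sharply.

  # Back-of-envelope sketch of why air cooling hits a density ceiling, using the
  # sensible-heat relation Q = rho * V_dot * c_p * dT. The 12 K server air-side
  # temperature rise is an assumed typical value, not a measured figure.
  RHO_AIR = 1.2     # kg/m^3 at roughly 20 C
  CP_AIR = 1005.0   # J/(kg*K)

  def required_airflow_m3_s(rack_load_kw: float, delta_t_k: float = 12.0) -> float:
      """Airflow (m^3/s) a rack must ingest to carry rack_load_kw at delta_t_k."""
      return (rack_load_kw * 1000.0) / (RHO_AIR * CP_AIR * delta_t_k)

  for load_kw in (10, 20, 50, 80):
      flow = required_airflow_m3_s(load_kw)
      print(f"{load_kw} kW rack -> {flow:.1f} m^3/s (~{flow * 2119:.0f} CFM)")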

For operators seeking to support AI and hyperscale clients, air cooling alone is now a bottleneck to growth.

Comparative Analysis

When evaluating air cooling versus RDHx, the differences are stark.

  • Capacity: Air cooling effectively caps out around 15–20 kW per rack, while RDHx supports densities of 50–100 kW+, aligning with AI-era demand.

  • Efficiency: Air requires massive fan energy and airflow management; RDHx targets the heat directly, improving efficiency and lowering OPEX.

  • Space: Air-based containment and ducting consume valuable real estate; RDHx is compact and rack-integrated, maximising usable white space.

  • Economics: While air has a lower upfront cost, RDHx provides superior long-term ROI through operational savings and premium pricing for high-density clients.

  • Sustainability: Air’s inefficiencies increase carbon impact; RDHx lowers PUE, reduces wasted energy and enables heat recovery for district heat networks.

In short, air is a short-term solution. RDHx is future-ready.

Challenges and Considerations

RDHx does require chilled-water infrastructure, which may not exist in all legacy facilities. Retrofitting can be complex and carries a higher upfront cost than retaining air cooling, though the long-term operational benefits are significant. Maintenance and redundancy must also be factored in to ensure resilience.

At Arlofin, we design sites around these considerations from day one. Our prefabricated modular approach integrates chilled-water distribution, redundant RDHx systems and hydrogen-ready on-site power generation. This ensures our facilities are not just technically capable, but also compliant, resilient and aligned with investor and regulatory expectations.

Future Outlook

Rear door cooling should be seen as a bridge technology, essential today while paving the way to more advanced solutions. Direct-to-chip and immersion cooling will likely dominate at the very highest densities in the future. But these approaches require new server hardware and significant capital investment, making them less deployable in the short term.

RDHx offers operators the ability to:

  • Support AI and HPC workloads today
  • Optimise energy use and sustainability performance
  • Maximise revenue per square metre in constrained urban markets
  • Transition smoothly to liquid-native cooling architectures in the future

For Arlofin, this means building facilities that can deliver immediate value to AI and hyperscale clients, while ensuring flexibility to adopt next-generation cooling when the time is right.

Case Studies and Adoption

The shift to rear door cooling is already underway, with hyperscalers and research institutions leading adoption for AI and HPC environments. These deployments highlight the practical advantages of RDHx and validate its role as the most effective bridge technology available today.

Hyperscale AI Clusters
A leading hyperscaler deployed RDHx across multiple racks to support GPU-intensive AI training workloads. Each rack ran consistently at 40–80 kW, levels impossible to manage with traditional air-based systems. By capturing heat directly at the source, the operator avoided costly overhauls to facility-level cooling, achieving both scalability and resilience without downtime.

Colocation Energy Savings
A European colocation provider integrated RDHx into one of its core campuses to accommodate tenants requiring high-density AI clusters. By moving from air-based CRAC systems to rack-level rear door cooling, the provider recorded 20–30% energy savings in cooling operations. The efficiency gains improved site-wide PUE and allowed the operator to market the site as “AI-ready,” attracting premium tenants.
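
As a purely hypothetical illustration of how savings of that order feed into PUE (none of the figures below come from the provider’s site), consider an assumed facility with 1,000 kW of IT load, 400 kW of cooling overhead and 100 kW of other overheads:

  # Hypothetical illustration only: how a cut in cooling energy feeds into PUE
  # (total facility power divided by IT power). The baseline split below is an
  # assumed example, not data from the colocation provider described above.
  def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
      """Power Usage Effectiveness: total facility power over IT power."""
      return (it_kw + cooling_kw + other_kw) / it_kw

  it_kw, cooling_kw, other_kw = 1000.0, 400.0, 100.0   # assumed baseline, kW
  baseline = pue(it_kw, cooling_kw, other_kw)           # -> 1.50
  improved = pue(it_kw, cooling_kw * 0.75, other_kw)    # 25% cooling saving -> 1.40
  print(f"PUE before: {baseline:.2f}, after: {improved:.2f}")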

Research Institution High Performance Computing (HPC)
A German research facility upgraded its legacy HPC hall with RDHx to manage the transition from CPU-driven to GPU-accelerated workloads. The retrofit enabled racks of up to 80 kW while maintaining a facility PUE of 1.2. Beyond performance, the ability to recapture waste heat supported the institution’s sustainability targets, aligning with EU funding requirements for energy-efficient research infrastructure.
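
The scale of that recoverable heat can be estimated with a rough, assumed-figure sketch; the rack count and recovery fraction below are illustrative and are not reported by the facility.

  # Rough, assumed-figure sketch of heat-recovery potential. Treats the IT load
  # as fully converted to heat and applies an assumed recovery fraction; the
  # rack count and fraction are illustrative, not reported by the facility.
  HOURS_PER_YEAR = 8760

  def recoverable_heat_mwh(rack_kw: float, racks: int, recovery_fraction: float = 0.7) -> float:
      """Annual heat (MWh) recoverable from `racks` racks running at rack_kw each."""
      return rack_kw * racks * HOURS_PER_YEAR * recovery_fraction / 1000.0

  # e.g. 20 racks at 80 kW with 70% of the heat captured at the rear door
  print(f"~{recoverable_heat_mwh(80, 20):.0f} MWh/year available for reuse")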

Together, these case studies demonstrate that RDHx is not a theoretical solution: it is proven, operational and already delivering measurable benefits in performance, efficiency and sustainability.

At Arlofin, we are embedding RDHx into every new build, ensuring our clients can immediately access these benefits without the risks and delays of unproven cooling technologies.

Conclusion

Air cooling has played its part in the growth of cloud and enterprise computing, but it cannot scale into the AI era. The workloads driving global demand today require solutions that are efficient, compact and sustainable.

Rear Door Heat Exchangers provide exactly that: the ability to run 40–100 kW racks at scale, reduce energy waste and future-proof facilities against regulatory and client expectations.

Arlofin is already integrating RDHx into our UK and European sites, positioning us as a partner of choice for AI-ready, hydrogen-ready and sustainability-led infrastructure. For operators, investors, and clients alike, RDHx is no longer optional — it is the critical enabler of high-density compute.

Engineering the Future of Digital Infrastructure
