CRC: A cash windfall

Michael Cook of USystems looks at how to meet your CRC Energy Efficiency requirements with cutting-edge technology
Financial directors can earn a cash windfall or, quite possibly, an income stream by utilising energy‐saving datacentre design options – the energy saving can be turned into carbon credits and sold on the worldwide carbon trading system.
Datacentres are potentially in a lose-lose situation, as growing computer use around the world means ever more data to store. In keeping up with this growth, and the consequent rise in energy consumption, their owners will also be faced with a carbon tax as governments implement national green strategies.

But by measuring the environmental impact of your datacentre in order to establish its energy consumption and, by extension, its carbon emissions, you can turn this into a win-win situation.

From PUE to CRC
PUE (power usage effectiveness) has been widely adopted to measure the ratio of the total power consumed by a datacentre to the power consumed specifically by the computing equipment populating the facility.

This evaluates how efficiently energy is used, showing how much of the power consumed goes to support functions, e.g. cooling and power distribution, rather than to the IT load itself.

The ideal PUE is 1.0, where 100 per cent of the power consumed by a datacentre goes to the IT equipment and none to lighting and cooling. Most datacentres run with a PUE of over 2.0, indicating that for every watt of IT power an additional watt is consumed by ancillary functions – largely the cooling system.

This is an obvious waste of energy and, therefore, a source of avoidable carbon emissions – which are at the heart of the Government's Carbon Reduction Commitment (CRC) Energy Efficiency Scheme.
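As a worked illustration of the PUE arithmetic, here is a minimal sketch; the kW figures are hypothetical, chosen only to land near the "over 2.0" benchmark quoted above:

    # Minimal sketch of the PUE calculation; the kW figures are hypothetical.
    # PUE = total facility power / IT equipment power.
    it_power_kw = 100.0            # power drawn by the computing equipment
    ancillary_power_kw = 105.0     # cooling, power distribution, lighting, etc.
    total_power_kw = it_power_kw + ancillary_power_kw

    pue = total_power_kw / it_power_kw
    print(f"PUE = {pue:.2f}")      # 2.05 -- in line with the 'over 2.0' figure above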

The ColdLogik Connection
Whether it is a new build or a legacy site, one simple fact holds for datacentres: traditional CRAC and containment cooling is expensive to run and outdated. Its place can be taken by state-of-the-art, UK-designed and manufactured ColdLogik technology, which delivers a PUE as close to 1.0 as is practical, with the twin benefits of far lower energy costs and CO2 emissions – plus other savings.

In addition, this cutting-edge technology addresses all the key drivers against which datacentres and communication rooms are measured: energy consumption, carbon footprint, modularity, packaging density, capital cost and redundancy – factors that apply to any cluster, whether low or high density.

Energy Consumption
Take a traditionally designed datacentre, perhaps deploying a conventional hot-aisle/cold-aisle configuration on a raised floor. It will typically use 40 kW of cooling power for every 100 kW of IT load – even aisle containment will only reduce this ratio by a small percentage.

In comparison, ColdLogik will use less than 4 kW of power for the same 100 kW load – a year-on-year energy saving in excess of 90 per cent and, therefore, a commensurate reduction in carbon emissions.
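That claimed saving can be sanity-checked with the article's own figures – a sketch of the arithmetic, not vendor data:

    # Cooling-energy comparison using the figures quoted above.
    traditional_cooling_kw = 40.0   # conventional hot-aisle/cold-aisle, per 100 kW of IT load
    coldlogik_cooling_kw = 4.0      # claimed ColdLogik figure for the same 100 kW load

    saving = 1 - coldlogik_cooling_kw / traditional_cooling_kw
    print(f"Cooling energy saving: {saving:.0%}")   # 90%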

Modularity
An ability to grow a datacentre or communications room in a logical and cost-effective manner makes obvious business sense – deploying only what is needed today while being prepared for rapid future expansion. ColdLogik is modular, allowing you to upgrade with great flexibility and future-proofing the installation.

Over time, as datacentres and comms rooms expand – or sometimes contract – and third-party racks have to be accommodated, the cooling solution has to meet these ever-changing demands. Again, the ColdLogik system, which can be retrofitted to any populated OEM rack, comes into its own.

Packaging Density
Despite improvements made by the majority of manufacturers to the efficiency of their hardware, and the promise of more to come, the fact remains that electronic packaging densities will continue to rise. Hot spots in datacentres and comms rooms are commonplace, and given that 55 per cent of electronic failure is attributed to temperature, it is no surprise that a great deal of effort has gone into resolving these issues.

By design, the ColdLogik solution eliminates individual rack and localised hot spots, maintaining a constant room-ambient temperature. In fact, the enhanced efficiency of the ColdLogik system allows up to an unrivalled 45 kW of heat to be removed from an industry-standard 600 mm-wide rack, enabling electronic packaging densities to increase substantially.

Such an increase in rack density enables a reduction in the number of racks required within a single room, negating the need for additional expensive floor space.
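To put the density claim in floor-space terms, here is a sketch in which only the 45 kW per-rack figure comes from the text – the 450 kW total load and the 5 kW legacy rack limit are hypothetical assumptions:

    import math

    total_it_load_kw = 450.0    # hypothetical site load
    legacy_rack_kw = 5.0        # hypothetical conventional air-cooled rack limit
    coldlogik_rack_kw = 45.0    # per-rack heat removal quoted above

    legacy_racks = math.ceil(total_it_load_kw / legacy_rack_kw)        # 90 racks
    coldlogik_racks = math.ceil(total_it_load_kw / coldlogik_rack_kw)  # 10 racks
    print(f"{legacy_racks} racks shrink to {coldlogik_racks}")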

Capital Cost
Despite all of the benefits that a ColdLogik system brings, it does not follow that this high-tech solution costs more than existing, outdated cooling designs.

Not having to deploy a raised floor or a ceiling plenum is a considerable saving in itself – and the combination of modular design and increased packaging density means that the ColdLogik system will actually cost less, not more, than the standard enclosed-aisle/in-row cooling design.

Redundancy
Given the critical nature of what datacentres and communication rooms provide, it is crucial that affordable redundancy is built in – even in its basic form, the ColdLogik solution incorporates high levels of redundancy.

I have briefly, but specifically, covered the major benefits of the USystems ColdLogik system, which is potentially set to become the new standard by which state‐of‐the‐art efficient datacentre design is measured.

To many, the ColdLogik system may appear to be relatively new technology – it is, though, a tried and tested system, deployed not just in the UK but across Europe, the USA, India and Africa. Want to find out more? A white paper recently written by a consultant about a ColdLogik project details the capital cost savings when compared to CRAC cooling and in-row containment cooling; it also highlights the energy savings and the commensurate reduction in CO2 output from the local power station. It is available from www.usystems.co.uk/coldlogik/whitepaper

ColdLogik Cooling Ratio
Which is the best metric for identifying the performance of cooling systems in datacentres?
A simple and uncluttered approach is to focus purely on a direct comparison between computer power draw and the power draw of the entire cooling solution at part or full load – measured throughout the year.

1 MW computer power = 15 kW cooling
As the above equation demonstrates, ColdLogik rear coolers achieve a ratio of 15 parts cooling to 1,000 parts computer power.
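Expressed as a percentage, and as an implied PUE if cooling were the only ancillary load (an assumption made purely for illustration – real sites also have distribution losses and lighting):

    # ColdLogik cooling ratio as a percentage and as an implied PUE.
    computer_power_kw = 1000.0   # 1 MW of computer power
    cooling_power_kw = 15.0      # cooling power draw for that load

    ratio = cooling_power_kw / computer_power_kw
    print(f"Cooling overhead: {ratio:.1%}")   # 1.5%
    print(f"Implied PUE: {1 + ratio:.3f}")    # 1.015, cooling-only assumption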

Why such impressive energy figures? Because ColdLogik rear coolers use higher-than-normal water inlet temperatures to remove both high and low heat loads with much lower energy use.

And when conforming to ASHRAE guidelines, inlet water temperatures can be as high as 24°C – so free cooling from natural water sources, such as boreholes, rivers, lakes and sea water, can be used to provide a consistent cooling temperature.

Provided a holistic approach is adopted and the datacentre is designed correctly, it is easy to see why as little as 1.5% cooling power is required to meet 100% computer power draw – regardless of whether it is running on part or full load.

Even when using adiabatic coolers, dry-air coolers or cooling towers, the cooling energy usage figures are still impressive: typically averaging 4% at part load and 2.25% at full load, with a peak of only 6%.
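For a sense of scale, converting those percentages into cooling power for the same 1 MW load – a sketch using only the figures just quoted:

    # Cooling power implied by the quoted percentages for a 1 MW IT load.
    it_load_kw = 1000.0
    for label, pct in [("part-load average", 0.04),
                       ("full-load average", 0.0225),
                       ("peak", 0.06)]:
        print(f"{label}: {it_load_kw * pct:.1f} kW")   # 40.0, 22.5, 60.0 kW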

ColdLogik benefits include:

  • Unique leak-free operation with patented Leak Prevention System
  • Low- to high-density cooling, 0.5 kW to 58 kW
  • ASHRAE Tier 1‐4 compliant
  • Energy cost savings
  • Fast return on investment
  • Savings on capital costs
  • Offset carbon emissions
  • Floor space/real estate reductions
  • Optional overhead pipework eliminates need for a raised floor
  • 100-plus prestigious projects successfully deployed worldwide
  • The same solution for low- and high-density scenarios
  • Retrofit to existing datacentres