When UNUM, the market-leading provider of income protection and critical illness policies that help provide the financial support needed to protect workers and businesses from the impact of illness and injury, wanted to refresh its data centre, it found that it could do a better job by improving its own facility. With new cooling technologies, UNUM discovered it could meet tough corporate green standards.
John Considine, Senior Infrastructure Engineer at UNUM, was heavily involved in the design of the new data centre. “Our existing data centre was 10 years old and designed for a different era of IT.
“We do a lot of batch processing so much of our hardware runs 24x7 which requires a high level of service and reliability. Our data is highly sensitive medical data - which is why we wanted to keep data here in the UK and couldn’t use the facilities of our parent company in the US. A final issue for us was green energy requirements. We needed a solution that would meet our corporate objectives around energy saving.”
With all of this in mind, it might appear that co-location or even cloud would have been cheaper solutions for UNUM. “No,” said Considine, “we’ve had several experiences of using co-location facilities in the past, but the service has been variable and we need consistency and reliability. The cost would also have been higher.”
“Outsourcing and hosting were options but we had serious concerns over security and service levels. While cloud services are gaining momentum in a lot of areas, there were none that matched our requirements. In addition, our need for 24x7 batch processing meant that we wouldn’t gain from being able to turn off cloud services when we were not using them.”
Meeting tough corporate green standards was a big concern for UNUM. This meant that traditional refrigeration-based cooling technologies such as CRACs (Computer Room Air Conditioners) were not going to be suitable as the primary cooling approach. After looking at a number of options, UNUM decided that a fresh air Computer Room Evaporative Cooler (CREC) solution from EcoCooling with a standby CRAC was the most efficient and resilient option.
Having two cooling options means that Considine can be sure that there will be no thermal runaway should there be any failure. “The EcoCooling CREC solution will provide us with adequate cooling even when the outside air exceeds 25C, and we could even stretch that to 30C. On the few days of the year when the air temperature may be consistently above that threshold, we have the facility to use CRAC cooling.” This is a smart approach by UNUM, as few companies have a truly dual redundant system for cooling.
The new design, with its low energy cooling solution, met the green and operational targets but required a suitable location. The head office in Dorking, UK, is in a Grade II listed building with limited available footprint.
This doesn’t mean that Dorking was unsuitable for data centre operations, but as UNUM’s current production facility is based in Dorking, the option to build at Basingstoke reduced the risk and cost. Considine says, “We have high speed links between all our sites and use caching technology to improve performance. While we couldn’t do a new build at Dorking, we are able to use it as a Disaster Recovery site by replicating all key data to hardware we manage there.”
Eventually, UNUM decided to refurbish an existing data centre at their offices in Basingstoke. With a number of large companies already based in Basingstoke there was access to multiple carriers and power feeds - key requirements for any data centre build.
The UNUM data centre is typical of many mid-size, end user operated facilities: a single pod with 26 racks and cold-aisle containment in a conventional layout. The room is designed to support 100kW of IT load, and UNUM has ensured that all 100kW has been provisioned from day one. That has proven to be a very smart decision given recent stories around future power risks in the UK National Grid.
As of July 2013, UNUM has already deployed over 70% of its data and systems in Basingstoke and has achieved this by only using 30% of the power budget. Unsurprisingly, Considine is happy with this because it means that there is plenty of extra capacity should it ever be required. It also means that the data centre is not only using less power than expected but is meeting one of its key targets, lower energy consumption.
Inside the data centre is where the real work has been done. Advanced cooling, aisle containment and low wattage LED lights have all delivered a Power Usage Effectiveness (PUE) of less than 1.3. Considine believes that over time, that PUE figure will drop even lower. His confidence is based on the commitment of the purchasing team to always buy the most efficient computer equipment and the ability of the EcoCooling CREC solution to deliver high levels of power savings.
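To make the PUE figure concrete, the sketch below shows how it is calculated: total facility power divided by IT power. The breakdown of non-IT loads here is purely illustrative, not UNUM’s measured figures.

```python
def pue(it_load_kw: float, cooling_kw: float, lighting_kw: float, other_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power.

    A PUE of 1.0 would mean every watt delivered goes to IT equipment;
    real facilities add cooling, lighting and distribution losses on top.
    """
    total_kw = it_load_kw + cooling_kw + lighting_kw + other_kw
    return total_kw / it_load_kw

# Illustrative figures only: a 100kW IT load with ~5kW of CREC cooling,
# 2kW of LED lighting and 15kW of distribution and other losses would
# land comfortably under the 1.3 mark reported in the article.
print(round(pue(100, 5, 2, 15), 2))  # 1.22
```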
Part of the UNUM approach to getting a low PUE was to take a very close look at where power was being consumed across the whole of the data centre. Once the power usage figures were gathered, they were handed to the procurement team, who used them to drive the purchase of the latest hardware.
Having exceeded IT equipment power reduction targets, the next green target on the list was cooling. Ineffective cooling is one of the areas where a lot of power is wasted: too many data centres have large hot spots and cold spots that cannot be properly addressed.
UNUM started by looking at both hot and cold aisle containment options. They decided to deploy cold aisle containment because this ensured that cooling was being delivered directly to the hardware and not lost to the room. Any form of containment requires discipline, and UNUM has put in place strict policies to ensure that there is no leakage of air between hot and cold aisles.
The next challenge was input temperature and, according to Considine, “We are still experimenting to find the best input temperature. At present, we have set it to 20C and are achieving that without any trouble.”
Going higher could be considered a risk, especially after a recent ASHRAE report looking at the risk of increased hardware failure as input temperatures rise. Considine doesn’t see that as an issue for UNUM. “We are working with up to date hardware that is capable of running at higher temperatures. Our hardware refresh cycle means that we replace equipment before it gets into the risk category outlined by ASHRAE. The only reason we are not at higher than 20C at the moment is because we are still in the build out and experimental phase.”
It is not just the cooling of air that adds to the costs in a data centre. Getting sufficient air to a server can take large amounts of power. The EcoCooling CREC solution uses Electrically Commutated (EC) axial fan technology - which moves high volumes of air using very low amounts of power. For example, a forced air solution would use in excess of 40kW of power to deliver the same volume of cold air as a CREC solution which uses less than 4kW of power.
CREC units reduce the power required to cool a data centre by as much as 90 per cent compared to traditional data centre cooling. With UNUM very focused on its power budget and green targets, this means that to cool their 100kW data centre, they need to allow only around 5kW for the cooling system.
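The scale of that saving follows from simple arithmetic. The 40kW versus 4kW fan figures and the ~5kW cooling allowance for a 100kW room come from the article; the calculation below just makes the percentages explicit.

```python
# Fan power comparison quoted in the article: a forced air solution
# needing over 40kW versus a CREC solution using under 4kW for the
# same volume of cold air.
forced_air_kw = 40.0
crec_kw = 4.0
saving = 1 - crec_kw / forced_air_kw
print(f"Fan power saving: {saving:.0%}")  # Fan power saving: 90%

# For a 100kW IT load, allowing ~5kW for cooling means the cooling
# overhead is only 5 per cent of the IT load.
it_load_kw = 100.0
cooling_allowance_kw = 5.0
print(f"Cooling overhead: {cooling_allowance_kw / it_load_kw:.0%}")  # Cooling overhead: 5%
```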
The use of CRAC units as a standby is something that should always be considered when using a fresh air system, for a number of reasons. UNUM has a gas fire suppression system that cannot be deployed while the fresh air system is operating. And although an F7 filtration system is incorporated, a prolonged external fire is another situation in which the data centre would revert to conventional cooling using the standby CRAC.
This raises the risk of thermal runaway - which can quickly overwhelm equipment - causing failure. With CRAC units as a failover option, UNUM are able to quickly deliver cooling to ensure that their systems will not be compromised in the case of a cooling failure.
As with power, the cooling has been deployed for maximum load right from the start. This means that UNUM can load their data centre as required without having to worry about hitting a limit on the cooling and having to wait for additional units to be provisioned. The CREC control system automatically compensates for changing cooling loads. This ensures the minimum energy use at low loads whilst accommodating new equipment installation or peak loads without complex reconfiguration.
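The automatic compensation described above can be illustrated with a minimal sketch. This is not EcoCooling’s actual control logic, just a hypothetical proportional controller showing how EC fan speed might track the cold aisle temperature so that energy use stays minimal at low loads and ramps up for peaks.

```python
def fan_speed_pct(cold_aisle_temp_c: float,
                  setpoint_c: float = 20.0,
                  gain_pct_per_degree: float = 25.0,
                  min_speed_pct: float = 10.0) -> float:
    """Hypothetical proportional control: raise fan speed as the cold
    aisle drifts above the setpoint, idle near minimum when below it.

    EC fan power scales roughly with the cube of speed, so running the
    fans slowly at low load is where the large energy savings come from.
    """
    error = cold_aisle_temp_c - setpoint_c
    speed = min_speed_pct + max(0.0, error) * gain_pct_per_degree
    return min(speed, 100.0)

# A lightly loaded room barely moves the fans; a warm one ramps them up.
print(fan_speed_pct(20.5))  # 22.5
print(fan_speed_pct(24.0))  # 100.0
```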
With the project on time and due to go live in July, UNUM are extremely happy with their decision to build their own facility. It is already lowering the company’s IT costs, providing highly secure data storage and delivering a PUE that is lower than expected.
While many might think that going it alone and building a private data centre makes no sense today, UNUM can point to a very successful project that has exceeded all their targets.