THE UNIVERSITY OF EAST ANGLIA (UEA) is one of the leading green technology institutions in the UK. Students and staff are encouraged to think about what they do and its environmental impact. As part of this, the UEA runs its own Green Impact Awards every year to encourage environmental innovation within the university and beyond. It should come as no surprise, then, that when UEA looked at its existing data centre, it realised there was much to be done to bring it in line with corporate goals. To ensure it got the best value from greening its data centre refurbishment, UEA approached Salix, a specialist in energy efficiency loans.
The result was a partnership between UEA, Salix, Future-Tech and DataRacks to deliver energy efficiency, full pay-back within six years and a facility that meets very stringent targets.
Existing facility
As James Willman, Head of Pre-Sales Consultancy with Future-Tech, explained, “One of the major challenges for this project was the existing facility. The UEA data centre is located in a Grade II listed building. This meant that any solution that relied on making holes in walls, moving walls or changing the construction of floors and ceilings was not acceptable. Instead, the solution had to work within the existing space in order to deliver improvements”.
Another restriction was time. As with any university, UEA has research programmes that run all year round, which meant that any large works would have to be planned and executed while the majority of the student body was away.
As with most projects, the trouble came in threes. The third restriction was that the data centre is located very close to student accommodation and teaching facilities, which meant that any solution would have to be not just energy efficient but also meet extremely tight noise emission standards.
Data centre requirements
As a major research hub, UEA runs a lot of computer equipment. While it is not running a mega data centre, it is representative of a lot of mid-sized company facilities. The existing data centre consumed around 60kW with a fairly typical PUE of 2.08. The existing cooling system was capable of supporting a load of 138kW in an N+1 configuration. Looking to the future, UEA wants to increase the capacity of the data centre in order to take on new research projects and support staff and students.
This means that the cooling load will need to increase by over 60% to 220kW. At the same time, the efficiency of the cooling has to support a dramatic lowering of the PUE to as low as 1.19 when the data centre is fully loaded.
For most corporate data centre new-builds, where a PUE of around 1.4 is more common, a 1.19 PUE would be seen as an extremely optimistic target! What is important here is that the funding for this project was tied to the ability of Future-Tech and its subcontractors to achieve such a low PUE. This is an approach that enterprise customers should seriously consider.
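As a rough, back-of-the-envelope sketch of what those figures imply (treating the quoted kW numbers as IT load, which the article does not state explicitly), PUE is total facility power divided by IT power, so the non-IT overhead a given PUE allows is simply (PUE − 1) multiplied by the IT load:

```python
# Back-of-the-envelope illustration of the PUE figures quoted above.
# PUE = total facility power / IT power, so the non-IT overhead
# (cooling, power distribution, lighting) is (PUE - 1) * IT load.

def overhead_kw(it_load_kw: float, pue: float) -> float:
    """Non-IT power, in kW, implied by a given PUE at a given IT load."""
    return (pue - 1.0) * it_load_kw

print(overhead_kw(60, 2.08))    # existing facility: ~64.8 kW of overhead
print(overhead_kw(220, 1.19))   # target at full load: ~41.8 kW of overhead
```

In other words, even with the IT load more than tripled, the target PUE implies less absolute overhead power than the facility carries today.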
Equipment updates
Virtually none of the existing infrastructure escaped the refresh. Decisions on IT equipment were left to UEA, though it was keenly aware that it would have to make smart purchasing decisions to support the infrastructure changes. Some things, such as the existing racks, would not change, which meant innovative solutions would be needed when installing aisle containment.
Power
The entire electrical subsystem was overhauled, with new supplies installed to support the change in cooling equipment and to ensure there was sufficient power per square foot for the latest IT equipment. As well as replacing virtually all the existing cabling, new distribution boards were added and all the lighting, both inside the data centre and in the plant facilities, was updated to use low power LEDs. A new monitoring system was deployed that makes it easy for operations to see exactly where power is being used and to quickly identify any potential leakage or power loss.
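The article does not describe how the monitoring works internally, but one common approach, sketched below with hypothetical readings and a hypothetical threshold, is to compare the incoming feed against the sum of the metered branch circuits and flag anything unaccounted for:

```python
# Illustrative sketch only: flag "leakage" or unexplained loss by comparing
# the incoming feed against the sum of metered branch circuits.
# All names, readings and the 2% threshold are hypothetical, not UEA's.

def unaccounted_kw(feed_kw: float, branch_kw: list[float]) -> float:
    """Power drawn from the feed that no metered branch accounts for."""
    return feed_kw - sum(branch_kw)

feed = 262.0                        # main incoming feed reading
branches = [108.0, 95.0, 40.0, 9.0] # per-distribution-board meter readings
loss = unaccounted_kw(feed, branches)
if loss / feed > 0.02:              # alert above ~2% unaccounted power
    print(f"Investigate: {loss:.1f} kW unaccounted for")
```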
Key temperatures
Perhaps the most important part of this project was the change to how cooling is delivered. While there were savings from changing the electrical system, cabling and distribution boards, the cooling would need to manage a greater amount of heat, be more flexible and use far less power. The target temperatures were set at a 22C input temperature with a maximum hot air return of 32C, and there were good reasons for these figures. Given historic external temperatures around the university, they would allow free air cooling to be relied upon for 60 per cent of the year. For the rest of the year some degree of mechanical cooling would be needed, with the amount depending on the temperature outside the data centre and the loading on the IT equipment.
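That "60 per cent of the year" figure comes from matching the set points against the site's temperature history. A minimal sketch of the calculation, using stand-in numbers rather than real weather data, would simply count the hours at or below whatever ambient threshold allows free cooling at the prevailing load:

```python
# Minimal sketch: estimate how much of the year free air cooling alone
# would suffice, given hourly ambient temperatures and a threshold.
# The sample temperatures below are stand-in values, not real site data.

def free_cooling_fraction(hourly_ambient_c: list[float], threshold_c: float) -> float:
    """Fraction of hours in which the ambient is at or below the threshold."""
    hours_ok = sum(1 for t in hourly_ambient_c if t <= threshold_c)
    return hours_ok / len(hourly_ambient_c)

sample_year = [4.0, 8.0, 11.5, 14.0, 18.0, 21.0, 25.0, 12.0]  # stand-in data
print(f"{free_cooling_fraction(sample_year, threshold_c=13.0):.0%}")
```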
Aisle containment
The first major change was aisle containment. After a lot of consideration, the decision was made to go with hot aisle containment, which would be provided by DataRacks; with hot aisle containment, the room itself is flooded with cold air. However, creating the aisle containment was more complex than expected. The racks in use are Rittal and there was no off-the-shelf Rittal containment solution that would work.
A further complication was introduced by the IT team’s requirements. They wanted the ability to remove an entire rack to work on it, and the aisle containment had to facilitate this. This meant the containment could not be secured to the physical racks and had to be flexible enough to allow additional elements to be added while racks were away being worked upon. DataRacks had to create a custom solution very quickly.
In many data centres, the ‘Manhattan Skyline’ of different-sized racks is a challenge that DataRacks is used to contending with. According to its managing director, Jeremy Hartley, having the legacy racks in the university’s data centre all of one height made a welcome change. The next issue was that all of the Ethernet switches are located in the rear of the racks. Hartley explained: “With the hot aisle now running at 32C this would lead to possible overheating and reduced reliability of the switches. This is a frequent problem with aisle containment and one that we at DataRacks are used to solving. Fortunately, all the switches in use are cooled from side to side. So in double-quick time we designed and manufactured bespoke air-ducts that feed the cold air from the front of the rack to the input side of the switch.”
Another air-duct then takes the hot exhaust away from the switch and vents it into the hot aisle. As well as providing highly effective switch cooling, this also prevents the risk of hot air being vented into the centre of the rack and causing hot spots that would be impossible to effectively cool. This problem is not unique to UEA but is something that many data centres struggle to resolve easily.
Cooling
The Future-Tech cooling solution combines direct fresh air delivery with DX (direct expansion) coolers. As the temperature rises, the number of coolers online is increased under software control, keeping power requirements to a minimum while ensuring cooling stays within the target temperatures. At the full IT load of 220kW, only free air cooling will be used while ambient temperatures are below 13C. Between 13C and 22C, cooling will be a combination of free air with increasing amounts of DX cooling, and above 23C the primary cooling will come from the DX units. This last stage, which takes the most power, is only expected to occur for around 300 hours each year (just 3.4 per cent of the year). While the system is operating at less than full load, the 13C free cooling threshold is raised; at the current IT load, full free cooling can be provided up to an external ambient of 21C.

This mix of cooling is slowly becoming a popular solution as companies embrace free air cooling techniques, but it does require a commitment to raising both input and output air temperatures in the data centre. Over the next six years at UEA, it is likely that the amount of time spent on DX cooling will reduce as data centre standards and IT equipment evolve to work at ever higher input and exhaust temperatures.
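The staged behaviour described above amounts to a simple mode selection based on ambient temperature and the prevailing free-cooling threshold. The sketch below uses the thresholds quoted in the article, but the real control logic is Future-Tech's software, so treat it purely as an illustration:

```python
# Illustrative mode selector for the staged cooling described above.
# Thresholds are those quoted in the article; the real controller is
# Future-Tech's software and will be considerably more sophisticated.

def cooling_mode(ambient_c: float, free_threshold_c: float = 13.0) -> str:
    """Pick a cooling stage for a given external ambient temperature.

    free_threshold_c is 13C at the full 220kW design load; the article
    notes it rises to about 21C at the current, lower IT load.
    """
    if ambient_c <= free_threshold_c:
        return "free air only"
    if ambient_c <= 23.0:
        return "free air plus DX trim"   # DX coolers staged in as needed
    return "DX primary"                  # expected for only ~300 hours a year

print(cooling_mode(10.0))                         # free air only
print(cooling_mode(18.0))                         # free air plus DX trim
print(cooling_mode(18.0, free_threshold_c=21.0))  # free air only at current load
print(cooling_mode(26.0))                         # DX primary
```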
To keep the mixed and full DX modes as efficient as possible, Future-Tech has deployed EC (electronically commutated) fans, which are far more efficient at part-speed than conventional fans. Fan speed is adjusted under software control so that only the required amount of air is delivered, avoiding overcooling and the risk of condensation. The ability to drop into mixed mode or full DX mode immediately is also important: should there be a sudden surge in server load, UEA will be able to counter any risk of thermal runaway because the systems are already interlinked, whereas other free air vendors require separate management systems for the free and forced air systems. As a secondary backup, should the extract fans fail, the DX system will automatically drop into a full recirculation, stand-alone mode.
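As a final sketch, the two behaviours above, matching fan speed to the air actually required and falling back to recirculation if the extract fans fail, might look something like the following. The function names, the linear speed law and the minimum turndown are all assumptions for illustration (EC fans at part-speed draw disproportionately less power, since fan power scales roughly with the cube of speed):

```python
# Illustration only: scale EC fan speed to the prevailing IT load and fall
# back to stand-alone DX recirculation if the extract fans fail. The linear
# speed law and 20% minimum turndown are assumptions, not UEA's settings.

def ec_fan_speed(it_load_kw: float, design_load_kw: float = 220.0,
                 min_speed: float = 0.2) -> float:
    """Fan speed as a fraction of maximum, floored at a minimum turndown."""
    demand = min(it_load_kw / design_load_kw, 1.0)
    return max(demand, min_speed)

def cooling_path(extract_fans_ok: bool, ambient_c: float,
                 free_threshold_c: float = 13.0) -> str:
    """Choose the air path, dropping to recirculation DX if extract fails."""
    if not extract_fans_ok:
        return "full recirculation, stand-alone DX"
    return "free air" if ambient_c <= free_threshold_c else "free air plus DX"

print(ec_fan_speed(110))                                # 0.5 of full speed at half load
print(cooling_path(extract_fans_ok=False, ambient_c=10.0))
```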
Challenge met!
Retrofitting data centre facilities in listed buildings where noise, power, cooling and energy efficiency targets are tough is a challenge that many would avoid. It can often be easier to look at a new facility, move to the cloud or use a drop-in data centre.
For the University of East Anglia, retrofitting and meeting its very strict corporate environmental targets was critical. With Salix providing the funding and ensuring that the targets were contractually binding, Future-Tech as the lead contractor and DataRacks as the containment designer and manufacturer had to be very innovative and deliver in a short space of time. Confirming that the containment solution had met all of its design objectives, and specifically the 1.19 PUE and temperature requirements, Paul Carter, Future-Tech Project Manager, said: “DataRacks were very flexible with both the detailed design and delivery of their aisle containment solution. From initial survey to installation took only three weeks. Given the time pressures on the project this was very helpful and exceeded my expectations. The aisle containment system is built to a high standard and installed very well. I look forward to working with them again in the future.”