Year of change lays the foundation for a year of development within the data centre

2016 was a rollercoaster of a year, with new technologies and upsets causing stirs in the market. The year saw virtual and augmented reality (VR and AR) move into the mainstream consumer and business markets, Brexit and its knock-on impact on business confidence, and the ever-growing range of technologies underpinning the data centre. But what does 2017 hold, and how can data centre managers and technology businesses capitalise on these changes? Greg McCulloch, CEO of colocation provider Aegis Data, takes a look.

Virtual and augmented reality came into the mainstream in 2016 with the global success of Pokémon GO. Its rise was not a smooth journey: launch delays in the UK, caused by server overloads, left consumers frustrated. The hype surrounding the app opened a whole new market for data centre providers, who realised the vast opportunities behind the latest mobile craze. Operators with sufficient server capacity, adequate cooling and high power capabilities are ideally positioned to enable users and developers to deliver increasingly sophisticated, data-hungry games. Next year is likely to see a slew of new VR and AR games that will be just as data hungry, or more so, as the technology develops. Data centre operators able to offer substantial power capabilities stand to benefit the most, thanks to the reliability that ensures consumers are not left with slow games.
 
“We saw VR and AR hit the mainstream in 2016 after years of being a fringe technology. 2017 will see the consolidation of this technology and the growth in its availability and detail. The data centres backing these up must be able to take high loads or they will fail, causing game developers to seek support elsewhere,” says McCulloch.
 
Brexit caused uncertainty the world over: politically, economically and socially. While the markets have now levelled, uncertainty around the negotiating positions of the UK and the EU is causing hold-ups in several areas. For the data centre industry, however, the impact to date has been limited. Large technology companies have continued to invest in the UK data centre market: IBM recently confirmed it would build four new data centres in the UK, Amazon has reaffirmed its commitment to the sector, and Equinix has announced a £26 million expansion of existing sites. Smaller providers are more likely to feel the pinch when Article 50 is finally triggered in the new year and the real negotiations begin, but until the politicians play their hands much remains unknown; for now it is very much business as usual.
 
Data centre 2.0 has been a talking point in 2016, with topics such as high performance computing (HPC), the Open Compute Project (OCP), hyperconverged infrastructure and the software defined data centre (SDDC) featuring more heavily in providers’ offerings. The drive to deliver faster, more efficient facilities to power the data-hungry technologies coming through is prompting changes to the traditional structure and design of sites. By offering HPC, data centre operators can provide power levels typically reserved for research facilities and government organisations, giving industries such as the gaming sector more confidence in a site’s ability to provide secure power during high demand.
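As a back-of-the-envelope illustration of what those power levels mean in practice, the short sketch below compares a row of standard colocation racks with a row of HPC-density racks. The kW figures are rough industry ballparks assumed for the example, not Aegis Data’s specifications.

```python
# Illustrative comparison of rack power densities. The kW figures are
# assumed industry ballparks for this sketch, not vendor specifications.

STANDARD_RACK_KW = 5    # typical enterprise colocation rack
HPC_RACK_KW = 25        # high-density HPC rack

racks = 40  # a hypothetical row of racks

standard_row = racks * STANDARD_RACK_KW
hpc_row = racks * HPC_RACK_KW

print(f"Standard row: {standard_row} kW")  # 200 kW
print(f"HPC row:      {hpc_row} kW")       # 1000 kW
print(f"An HPC row draws {hpc_row / standard_row:.0f}x the power, "
      "with matching demands on cooling and resilient supply.")
```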
 
The availability of open source hardware designs courtesy of the OCP, free of the legacy constraints of typical data centre hardware, is enabling data centres to be designed more efficiently. Access to customised server racks that can support the higher densities typically seen in HPC helps data centre operators attract the latest technologies, such as VR and AR developers.
 
Hyperconverged infrastructure is also continuing its rise in the data centre. By combining the storage, network and compute infrastructures that typically take up vast amounts of room in the server rack, data centres can condense their footprints into more manageable, efficient systems. This affects not only the space within the data centre, but also the cooling requirements and power levels being offered.
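To make that consolidation concrete, here is a minimal sketch comparing the rack footprint and power draw of separate storage, network and compute appliances against a single hyperconverged node. All rack-unit and wattage figures are hypothetical assumptions for illustration only.

```python
# Illustrative only: separate appliances vs one hyperconverged node.
# Each entry is (rack units, watts); all figures are assumptions.

traditional = {"storage": (4, 800), "network": (2, 400), "compute": (4, 900)}
hyperconverged = {"hci_node": (2, 1200)}  # collapses all three roles

def footprint(appliances):
    """Total rack units and watts across a set of appliances."""
    units = sum(u for u, _ in appliances.values())
    watts = sum(w for _, w in appliances.values())
    return units, watts

trad_u, trad_w = footprint(traditional)
hci_u, hci_w = footprint(hyperconverged)

print(f"Traditional stack: {trad_u}U, {trad_w}W")
print(f"Hyperconverged:    {hci_u}U, {hci_w}W")
print(f"Saved: {trad_u - hci_u}U of rack space and {trad_w - hci_w}W of power")
```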
 
Software defined data centres are placing more emphasis on a site’s capacity to develop data centre infrastructure management (DCIM) solutions to manage data centre 2.0. A site offering some of the new technologies discussed above needs a robust DCIM offering in place to effectively manage and control every element of the facility. As more technologies, whether VR, AR, cloud computing or social media, come to rely more heavily on the data centre, effective infrastructure management is vital.
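The toy sketch below shows the kind of threshold check that sits at the heart of such monitoring: polling per-rack readings and flagging anything outside its limits. The readings and limits are invented for illustration; real DCIM platforms poll live sensors and manage far more than power and temperature.

```python
# Minimal sketch of a DCIM-style health check. Readings and thresholds
# are hypothetical; real DCIM suites expose far richer telemetry.

READINGS = {
    "rack-01": {"power_kw": 8.2, "inlet_temp_c": 24.1},
    "rack-02": {"power_kw": 11.7, "inlet_temp_c": 27.9},  # near its limits
}

LIMITS = {"power_kw": 12.0, "inlet_temp_c": 27.0}

def breaches(reading):
    """Return the metrics in a reading that exceed their limits."""
    return [metric for metric, value in reading.items()
            if value > LIMITS[metric]]

for rack, reading in READINGS.items():
    over = breaches(reading)
    status = f"ALERT: {', '.join(over)}" if over else "ok"
    print(f"{rack}: {status}")
```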
 
“Existing technologies like cloud computing and social media are going to continue as they do every year, but will become more reliant on the data centre to support them in their growth. Over 200 million people joined social media platforms between 2015 and 2016; there is no reason this is going to stop in 2017, and data centres must be able to provide for this,” said McCulloch. “2017 will see the consolidation of the external technologies deployed in 2016, but also the growth of the internal technologies that are resulting in a smarter, more versatile data centre, encapsulated in 2.0.”