This often makes in-house data centres expensive, and they are frequently housed in buildings not designed to support IT infrastructure.
An independent study commissioned by Volta Data Centres found that colocation has become an increasingly important component of IT models, with budgets for colocation increasing for 75% of organisations over the past five years.
However, the report also revealed a worrying amount of downtime among businesses relying on their data centres, with more than half (56%) of the IT decision makers surveyed saying they had experienced some form of IT outage within the past 12 months.
So, with more companies planning to outsource their IT assets, how can they be confident that their business won’t be affected by downtime?
Data centre downtime damaging business
Data loss knows no boundaries – it can impact anyone at any time and have a ripple effect on other parts of the business.
Businesses that provide critical services, such as those involved in healthcare, banking and managed service providers, cannot afford a single second of downtime – anything more could be catastrophic.
But lost files can take hours or even days to recover, leading to a severe drop in productivity. Then there’s the damage to reputation as the negative impact of downtime starts to affect customers. And, last but not least, there’s the financial cost as it starts to affect sales.
So, it’s concerning to see that so many businesses have experienced downtime from their data centre over the past year.
In the ‘Data loss and downtime are putting hybrid and edge computing at risk’ report, the average loss of time for each respondent was around three hours and 45 minutes – in the past six months alone. And of those that had experienced downtime, 8% had suffered more than 10 hours’ worth of IT outages.
More worryingly, the effect of data centre downtime on security was a common theme in the report. Just under half (46%) had suffered data loss in the last 12 months as a result of their data centre letting them down, with 4% experiencing data loss more than five times during the same period.
In the aftermath of GDPR and a heightened awareness around data loss and data breaches, downtime is simply a risk businesses cannot afford to take.
Hybrid to the rescue
In the Uptime Institute’s eighth annual data centre survey, 61% of respondents said adopting a hybrid IT consumption model – spreading workloads across a variety of environments such as on-premises, colocation facilities or the cloud – had made their IT more resilient overall.
Hybrid solutions are not only key to eliminating the risk of a ‘single point of failure’; they also ensure uptime for physical backups while providing low-cost, high-performance access to the clouds, networks and other providers that businesses may need to leverage.
The hybrid model is especially important given that data centre infrastructure will only become more complex in the near future as operators begin to offer services such as edge computing, driven by the growing need for businesses to process data closer to where it is generated and closer to the end user.
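To make the failover idea concrete, here is a minimal sketch – my own illustration, not something from either report – of how a hybrid deployment avoids a single point of failure: health-check each environment in order of preference and route traffic to the first one that responds. The endpoint URLs, ordering and health-check path are all hypothetical.

```python
# Minimal hybrid failover sketch (illustrative only; endpoints are hypothetical).
import urllib.request
import urllib.error

# Environments in order of preference: on-premises, then colocation, then cloud.
ENDPOINTS = [
    "https://onprem.example.internal/health",
    "https://colo.example.net/health",
    "https://cloud.example.com/health",
]

def first_healthy_endpoint(timeout_seconds: float = 2.0):
    """Return the first endpoint that answers its health check, or None."""
    for url in ENDPOINTS:
        try:
            with urllib.request.urlopen(url, timeout=timeout_seconds) as response:
                if response.status == 200:
                    return url
        except (urllib.error.URLError, OSError):
            continue  # this environment is down; fail over to the next one
    return None  # every environment is unreachable

if __name__ == "__main__":
    target = first_healthy_endpoint()
    print(f"Routing traffic to: {target or 'no healthy environment available'}")
```

In practice this logic usually lives in load balancers, DNS failover or orchestration tooling rather than in application code, but the principle is the same: no single environment’s outage takes the whole service down.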
Questions to ask
Clearly, both outages and data loss can stem from a variety of factors, from network glitches to human error, failed UPSs to inadequate maintenance. But whatever the reason, organisations need to take a far more robust approach to data centre due diligence.
Where is the guarantee of 100% uptime? What power resilience is in place? How many different connectivity options are available – and do they run across different networks for greater contingency? These are the questions to ask to make sure data centre downtime doesn’t affect your business.
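As a rough, back-of-the-envelope illustration of why that first question matters, the arithmetic below (my own, not taken from either report) shows how much annual downtime common SLA levels actually permit – even a ‘99.9% uptime’ guarantee allows for nearly nine hours of outages a year.

```python
# Illustrative arithmetic: annual downtime permitted by common uptime SLAs.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

for sla_percent in (99.0, 99.9, 99.99, 99.999):
    allowed_minutes = MINUTES_PER_YEAR * (1 - sla_percent / 100)
    print(f"{sla_percent:>7}% uptime permits {allowed_minutes:8.1f} minutes of downtime per year")
```

Anything short of a genuine 100% guarantee leaves a measurable window, which is why power resilience and diverse connectivity matter as backstops.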