Future-proofing for big data

The UK IT market is already worth an exceptional €80 billion and, according to forecasts by Forrester Research, it will grow by 4.6 percent next year. The research attributes this growth in part to the importance of data mining and the increased investment in back-end technology needed to support it. By James Taylor, Level 3.

“Big data” has become the term du jour, but it is much more than a passing fad. All over the world, and in multiple sectors, companies are generating, collecting and analysing vast data sets which legacy infrastructure may struggle to process. Everything from shopping patterns to police arrests now creates a plethora of data and metadata that can be analysed to spot patterns and, most importantly, improve efficiency.
One sector which has recently discovered the benefits of big data is gaming. Today, every decision a gamer makes in an online game can be tracked and used to personalise the experience, improving in-game ads and content.


As a result, capturing, storing and securely transferring information between sites can become difficult: the data sets involved are massive, and the information often needs to be accessed in real time.


There is little doubt there will be increased pressure on infrastructure, and it is here that the data centre can take the strain. As the workhorse of the value chain, resilient, high-capacity data centres need to be at the heart of any organisation’s big data strategy. Having networks and servers that can scale over the long term is absolutely crucial.


Future-proofing
So what steps should organisations take to ensure their data centre strategy is keeping up with this trend?


The first stage is evaluation and auditing. It is crucial for a business to look at its existing infrastructure and ascertain whether it can meet the organisation’s goals for the use of big data. Forecasts for data growth should also be mapped against available budgets to help answer the crucial question of whether to take a self-build or an outsourced approach.
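As a rough illustration of that mapping exercise, the sketch below projects storage demand at an assumed compound growth rate and tallies the running cost of a self-build estate against an outsourced one. Every figure here (growth rate, capital outlay, per-terabyte prices) is a hypothetical assumption for illustration only and would need to be replaced with an organisation’s own numbers.

    # Hypothetical sketch: map projected data growth against the cost of
    # self-build versus outsourced capacity. All figures are illustrative
    # assumptions, not real market prices.

    current_tb = 500                 # storage in use today (TB) - assumed
    annual_growth = 0.40             # assumed 40% compound data growth per year
    years = 5

    selfbuild_capex = 2_000_000      # assumed up-front build cost
    selfbuild_cost_per_tb = 150      # assumed yearly running cost per TB
    outsourced_cost_per_tb = 400     # assumed all-in yearly price per TB

    selfbuild_total = selfbuild_capex
    outsourced_total = 0
    tb = current_tb
    for year in range(1, years + 1):
        tb *= 1 + annual_growth
        selfbuild_total += tb * selfbuild_cost_per_tb
        outsourced_total += tb * outsourced_cost_per_tb
        print(f"Year {year}: {tb:,.0f} TB | "
              f"self-build {selfbuild_total:,.0f} | "
              f"outsourced {outsourced_total:,.0f}")

Even this crude model makes the point: the answer depends on scale and time horizon, because outsourcing avoids the capital outlay while its per-terabyte cost compounds with growth.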


The case for outsourcing is attractive in the face of growing data volumes. Quite simply, outsourced providers have the specialist focus needed to manage the vast resources required. This yields not only technical benefits but financial ones, as it avoids the need for capital expenditure.
An outsourced provider can also offer the flexibility needed to address the innate requirement for scalability in space, network and power capacity.


Big data is still developing as a business tool and, as with any relatively early-stage technology trend, nothing is certain, so it is crucial to ensure there is no element of infrastructure “lock-in”.


Data centre with global connectivity
Another important consideration is location. If big data is to be central to the way a company operates, then having an element of real-time access is vital to maximising the value of the information. This is especially true in industries such as finance, where split seconds truly make a difference to the bottom line.


For this reason, many organisations choose a data centre connected to a global network that reduces latency by decreasing the number of “hops” across different fibre routes. A direct global network connection into a data centre also reduces the amount of third-party infrastructure used, decreasing risk and providing a single point of contact in the event of an outage.
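To see why hop count matters, the back-of-the-envelope sketch below models round-trip latency as propagation delay over fibre plus a fixed forwarding overhead per hop. The route distance and per-hop figure are illustrative assumptions, not measurements of any real network.

    # Hypothetical sketch: estimate round-trip latency over a fibre route.
    # Light travels at roughly 200 km per millisecond in fibre; the per-hop
    # forwarding overhead is an illustrative assumption.

    FIBRE_KM_PER_MS = 200    # approximate speed of light in fibre
    PER_HOP_MS = 0.5         # assumed forwarding delay per router hop

    def round_trip_ms(distance_km: float, hops: int) -> float:
        one_way = distance_km / FIBRE_KM_PER_MS + hops * PER_HOP_MS
        return 2 * one_way

    # Same assumed ~640 km route, two different paths:
    print(round_trip_ms(640, hops=4))    # direct network: ~10.4 ms
    print(round_trip_ms(640, hops=12))   # many third-party hops: ~18.4 ms

Distance sets the floor, but on the same route the extra hops nearly double the figure, and that is precisely the margin that matters in latency-sensitive sectors such as finance.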


Lastly, businesses must consider the sensitivity of data being stored and shared by employees. There is a simple equation here – more data equals more risk.


The traditional concept of a business asset is gradually changing, becoming more closely aligned with the data a company owns, and the potential cost of a data breach is rising accordingly. For this reason, increased security at every stage of the big data collation, transfer and storage process is vital. This is another situation where consolidating IT suppliers reduces the number of risk points.


Any company looking to address the infrastructure issues presented by this brave new world of growing volumes of data would do well to ensure that all of these elements are in place.


The importance organisations are placing on big data has put the data centre in the spotlight. With the support of a flexible data centre provider that can deliver scalability without sacrificing quality of service, technical teams can rise to this challenge and truly assert their value to the wider business.
James Taylor is Director of Cloud Services, EMEA, Level 3 Communications.


Source: http://www.forrester.com/Continued+Gloom+For+European+ICT+Markets/fulltext/-/E-RES98001?isTurnHighlighting=false&highlightTerm=80%20billion
-ENDS-
