The story behind Software Defined Storage

By David Ellis, CTO and Senior Director Services – EMEA at Arrow ECS EMEA.

Posted 9 years ago.

It’s no secret that both structured and unstructured data are growing at an immense rate. Data, in one form or another, is part of our everyday lives. Although we’re aware that it’s growing exponentially, many aren’t aware of just how vast the volumes are. Add the rise of social media, mobile, and now Machine to Machine (M2M) communication and the Internet of Things (IoT), and the volume of data available is only set to keep climbing.

But where’s it all going to go? After all, data needs to be stored somewhere.

This data tsunami continues to put pressure on the data centre and shows no sign of abating any time soon, so organisations need to look at the best options available for keeping their valuable information housed, managed, maintained and protected. It’s vital, therefore, that data centres are built to scale and grow in line with this anticipated usage, particularly if they’re to provide the highest quality of service through an ‘always on’ environment.

As data evolves, so must the solutions. One of these is the Software Defined Data Centre (SDDC), which can provide service providers and enterprises with improved agility, automation and flexibility for their users. This, in turn, drives cost reductions – a win-win for all involved.

The SDDC has, in fact, been discussed for several years, albeit largely in theory, primarily because implementing it is considered a complex and potentially onerous task. Making the transition from a physical data centre to an SDDC means rethinking the underlying physical infrastructure. However, it needn’t be a drastic or radical transition; rather, it can be implemented in a step-by-step approach to address the changing needs of the business or organisation as they occur.

The first step for data centre managers in making the transition to a SDDC will be to consider whether their networks are able to handle the large increase in bandwidth requirements caused by this data explosion. They’ll also need to think about how their server technologies will deal with the analytics of such vast amounts of data, which will surely demand high compute power. As more data is created, storage capacity will need to be increased alongside backup and archiving provisions. To help with this, technologies like flash and hybrid storage are becoming far more widely deployed to ensure performance requirements are met.

As the demand for compute power fluctuates, in response to changes in the business environment, it can become impossible or uneconomical to manage it all in a central location. Co-location is an option to consider in order to aggregate and process the data, but operating a distributed model (central and remote aggregation data centres) can create management challenges for staff working in the data centre. This only strengthens the case for an automated, software-defined infrastructure. The beauty of the SDDC is that it allows many functions to be automated, therefore reducing the data centre management overheads.

SDDCs will also help companies drive more value from their data through business analytics and intelligence. This is particularly important given the rise in unstructured data from social applications and networks such as Facebook, LinkedIn and Twitter. The use of predictive analytics, in particular, will help companies derive real ROI from these new technologies.

Another important aspect to bear in mind with the SDDC is security. Many organisations are moving to all-wireless workplaces, delivering tailored content dependent upon location and user. With factors such as the increase in ‘east-west’ traffic in the virtualised data centre and the sharp rise in the number of connected devices, security risks will only continue to grow.

The changes may also lead more organisations to adopt a hybrid cloud strategy and place less critical applications into the public cloud.

With an explosive amount of data expected to be created in the 3rd Platform era and even greater demands being put on IT managers and data centres, now is the time to review current infrastructures and take the first steps in tackling the inevitable future of unstructured data.
