Big data needs software-defined storage

The proliferation of mobile devices and instrumented enterprise assets is igniting a new data explosion, one that can yield new analytic insights and, in turn, open new business opportunities. At the same time, big data is placing greater demands on existing infrastructures, driving a need for instant access to resources — compute, storage, and network — and creating a new imperative to adopt cloud technologies. The required flexibility simply can’t be achieved with a traditional hardware-centric approach.

Storage is a constant pain point for cloud deployments, yet it has been largely ignored by IT organizations, which have focused their attention primarily on server and network virtualization. With capacity growth, application performance, and cloud-related issues challenging organizations, IT managers must improve storage efficiency by virtualizing not just their server infrastructure but also their storage environments.

According to a 2013 study conducted by EMEA research, storage provisioning and management is a significant bottleneck for 58 percent of enterprise cloud deployments. As a result, storage automation was identified as the top integration requirement for the initial release of cloud projects, as cited by 32 percent of organizations.

Among respondents who had already attempted to deploy a private cloud without an SDS (software-defined storage) infrastructure, an overwhelming 84 percent were planning some sort of hardware-independent storage virtualization system to support their cloud.
