Big data needs software-defined storage

With demands for agility and capacity, storage systems can't be islands. IBM's Ronald Riffe explains how software-defined storage provides a broad, hardware-independent solution


The value of SDS in the real world

Consider this cloud and virtual infrastructure example: A financial services company is beginning to build a private cloud with software-defined infrastructure. It has virtualized its compute infrastructure and is enabling network and storage virtualization in the next phase. Its core business applications for credit card processing and virtual desktops run in a Dallas data center, while a big data credit-risk application and an application development cloud run across the Beijing and São Paulo data centers.

The credit-risk application for such an organization is a big data application that must process live data from various sources: credit card transaction data, credit bureau data, personal customer data, Twitter feeds, and other publicly available social media data. Suppose this data-intensive application, running in the Beijing cloud, starts experiencing an I/O bottleneck. A good SDS infrastructure detects the issue and, through a policy shift at the storage layer, automatically provisions new flash storage. The SDS system then automatically moves the relevant "hot" data to the new flash tier, improving I/O throughput and maintaining the service-level agreement.
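The detect-and-respond flow described above can be sketched as a simple policy loop. This is a minimal illustration only: the names here (`Volume`, `enforce_policy`, the SLA threshold) are hypothetical, and a real SDS controller would drive vendor storage APIs rather than mutate in-memory objects.

```python
# Minimal sketch of a policy-driven storage tiering loop.
# All names are hypothetical; a real SDS product exposes this
# logic through its own policy engine and migration services.

from dataclasses import dataclass

LATENCY_SLA_MS = 5.0  # policy threshold: max acceptable average I/O latency


@dataclass
class Volume:
    name: str
    tier: str          # e.g. "hdd" or "flash"
    latency_ms: float  # observed average I/O latency
    io_density: float  # IOPS per GB -- a common "hot data" signal


def enforce_policy(volumes):
    """Move the hottest SLA-violating volumes to a flash tier."""
    actions = []
    # Consider the hottest data first, so scarce flash goes where it helps most.
    for vol in sorted(volumes, key=lambda v: v.io_density, reverse=True):
        if vol.tier != "flash" and vol.latency_ms > LATENCY_SLA_MS:
            vol.tier = "flash"  # stand-in for a live, transparent data migration
            actions.append(f"migrated {vol.name} to flash")
    return actions


volumes = [
    Volume("credit-risk-db", tier="hdd", latency_ms=12.0, io_density=40.0),
    Volume("dev-cloud-scratch", tier="hdd", latency_ms=2.0, io_density=1.5),
]
print(enforce_policy(volumes))  # only the SLA-violating hot volume moves
```

The key design point the example illustrates is that the administrator expresses intent (a latency SLA), and the system decides which data to move and where, with no manual re-provisioning.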

With SDS, data can be dynamically moved and seamlessly shared, storage capacity can be elastically scaled, and new performance tiers can be transparently introduced. In our example, the credit-risk application overran the physical infrastructure it had been assigned, and the SDS system responded by automatically provisioning new resources and moving the data as needed.

When SDS is done right, key benefits emerge:

  • SDS automates the use of on-premises and on-cloud storage resources
  • Policy-based orchestration of storage resources optimizes performance and efficiency
  • Analytics-driven optimization of software-defined resources can meet unpredictable business needs
  • Building on open APIs, tools, and technologies maximizes customer value, skills availability, and easy reuse across hybrid cloud environments

Vendors are already competing to claim the SDS space, each with its own approach. For example, IBM Virtual Storage Center enables automated, policy-driven storage tiering and virtualization of heterogeneous storage systems, and can turn existing storage into private cloud storage without "rip and replace."

A vital new component of the software-defined data center, SDS is still evolving -- and will continue doing so at a rapid pace. According to a 2013 IDC report, "Software-based storage will slowly but surely become a dominant part of every data center, either as a component of a software-defined data center or simply as a means to store data more efficiently and cost-effectively compared with traditional storage."

The right SDS solution is one that helps clients transition from traditional storage infrastructure to a more agile, cloud-ready, software-defined environment and manage it effectively. Deployed correctly, SDS simplifies and modernizes heterogeneous storage environments, uses analytics-driven data management to reduce the cost of storage, and standardizes advanced data protection capabilities across storage systems.

New Tech Forum provides a means to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content.

This article, "Big data needs software-defined storage," was originally published at InfoWorld.
