With ever-growing data sets and heavier application requirements, IT needs to move faster than ever to deploy and manage larger infrastructures.
Luckily, along with these growing requirements, we also have the ability to automate many of these tasks in ways we could not just a short time ago.
Jamie Thomas is general manager of software-defined systems at IBM. In this week's New Tech Forum, Thomas takes us on a tour of a software-defined environment that can move in real time, easing the burden on IT and speeding the delivery of business-critical applications. -- Paul Venezia
Building a software-defined environment
The consumerization of IT poses unprecedented challenges to organizations. Mobile and social media, along with the large amounts of data they generate, have placed increased pressure on IT. In particular, the pace at which data arrives and its short shelf life have focused attention on the agility and optimization of IT resources.
At the same time, infrastructure within data centers has become increasingly heterogeneous and complex to manage over the last decade. Many tasks related to deploying a new solution or application are still done manually, in some cases adding weeks to schedules. On top of that, IT teams often perform their tasks with little knowledge or understanding of what data center resources an application requires.
To become effective in this new, data-driven world, organizations must move to the next generation of automation, one that understands the requirements of business applications and responds to those requirements in real time -- in other words, a software-defined environment. In a software-defined environment, IT becomes simplified through open standards, as well as responsive to shifting requirements and adaptive through policy-based automation. Such an environment is developed by automating infrastructure across server compute, storage, and networking resources.
The value of abstracted infrastructure
Many organizations develop business applications using processes that are time-consuming and labor-intensive. It all starts when developers work with business unit experts to create new applications. When these applications are deployed into production, the infrastructure on which they run is controlled by IT operation specialists who optimize for a particular resource -- such as compute or storage. These domain specialists frequently have very limited understanding of business application requirements.
This old-fashioned approach often leads to inefficient, underutilized infrastructure and diminishes the ability of IT to respond to business needs effectively.
The goal of a software-defined environment is to enable business users to describe their expectations of IT in a systematic way, which in turn drives automation of the infrastructure. The infrastructure understands an application's needs through defined policies that control the configuration of compute, storage, and networking, and it optimizes application execution. Through this approach, organizations are able to respond in real time to provide improved availability, as well as support for shifting volumes of work.
For example, in an application such as fraud analytics, spikes often occur when processing large amounts of data. The data frequently includes unstructured data from social sources, as well as transactional history. A software-defined environment enables the business to allocate compute and storage resources automatically to meet peak demand and to prevent degradation in performance.
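That kind of demand-driven allocation can be sketched in a few lines. The sketch below is purely illustrative (the function and parameter names are assumptions, not any product's API): a policy computes how many compute nodes a fraud-analytics workload needs so that a traffic spike triggers automatic scale-out instead of performance degradation.

```python
import math

def desired_nodes(current_rps: float, rps_per_node: float,
                  min_nodes: int = 2, max_nodes: int = 20) -> int:
    """Return the node count needed to keep per-node load at or below target,
    clamped between a minimum footprint and a cost ceiling."""
    needed = math.ceil(current_rps / rps_per_node)
    return max(min_nodes, min(max_nodes, needed))

# Baseline traffic fits the minimum footprint...
print(desired_nodes(current_rps=150, rps_per_node=100))   # -> 2
# ...while a fraud-scoring spike drives automatic scale-out.
print(desired_nodes(current_rps=1800, rps_per_node=100))  # -> 18
```

A real policy engine would add hysteresis and cooldown periods so the environment does not thrash as load fluctuates, but the core idea is the same: capacity follows the workload, not a fixed provisioning plan.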
Three steps to a software-defined environment
A software-defined environment can't be built in a day. Organizations must develop the architecture over time, step by step. The three most important steps are mandating an open approach to virtualization, optimizing the infrastructure through policies and elastic scaling, and building an application-aware infrastructure.
1. Open virtualization. Opening up hardware capabilities through defined APIs that integrate into open frameworks such as OpenStack is the first step toward building an agile, responsive, and flexible IT infrastructure. A software-defined environment starts with a virtualized data center that includes compute, storage, and networking resources built on open interfaces and an integrated framework. Open interfaces increase the speed of domain integration, break down silos of expertise, and offer organizations choice. Building software-defined offerings based on open standards enables choice, flexibility, and interoperability across the data center.
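The value of open interfaces is easiest to see in code. The sketch below is a minimal illustration (the class and vendor names are invented, not any real framework's API) of the pattern frameworks such as OpenStack rely on: the orchestration layer codes to a small, shared contract, so heterogeneous back ends from different vendors can plug in interchangeably.

```python
from abc import ABC, abstractmethod

class ComputeDriver(ABC):
    """The open contract every compute back end must implement."""
    @abstractmethod
    def provision(self, name: str, vcpus: int) -> dict: ...

class VendorADriver(ComputeDriver):
    def provision(self, name, vcpus):
        return {"driver": "vendor-a", "name": name, "vcpus": vcpus}

class VendorBDriver(ComputeDriver):
    def provision(self, name, vcpus):
        return {"driver": "vendor-b", "name": name, "vcpus": vcpus}

def deploy(driver: ComputeDriver, name: str, vcpus: int) -> dict:
    # The framework only knows the interface, so either back end plugs in
    # without changes to the orchestration logic.
    return driver.provision(name, vcpus)

print(deploy(VendorADriver(), "app-server", 4)["driver"])  # -> vendor-a
```

Because the contract is defined once and implemented many times, swapping hardware vendors becomes a configuration decision rather than a rewrite, which is exactly the choice and interoperability the open approach promises.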
2. Policy optimization and elastic scaling. Organizations need to enhance infrastructure automation with a policy manager that ensures adherence to ongoing service-level agreements -- and responds to changing workload demands in real time. Organizations also need to have extensive capability to automate resources at the compute layer and integrate this optimization with the storage layer. In the storage arena, organizations need the ability to store and share large amounts of structured and unstructured data across their data centers quickly, reliably, and efficiently. A high-performance enterprise file management platform that includes a clustered file system brings together the power of multiple file servers and multiple storage controllers to provide increased reliability and performance.
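The policy manager's core loop can be reduced to a simple decision: compare an observed metric against a service-level objective and emit an automation action. The names and thresholds below are assumptions chosen for illustration, not a real product's interface.

```python
def evaluate_sla(observed_ms: float, slo_ms: float,
                 headroom: float = 0.8) -> str:
    """Return the automation action a policy manager might take for one
    service, given observed latency and its service-level objective."""
    if observed_ms > slo_ms:
        return "scale-out"   # SLA breached: add capacity immediately
    if observed_ms < slo_ms * headroom * 0.5:
        return "scale-in"    # well under budget: reclaim idle resources
    return "hold"            # within budget: no change needed

print(evaluate_sla(observed_ms=250, slo_ms=200))  # -> scale-out
print(evaluate_sla(observed_ms=60,  slo_ms=200))  # -> scale-in
```

Run continuously against live telemetry, a check like this is what turns a static service-level agreement into the real-time responsiveness the text describes; the same decision would then be handed to the compute and storage layers to execute.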
3. Application-aware infrastructure. Once organizations have established an elastic, scalable infrastructure, they can start applying new methods to define their workloads in terms of components (such as application servers and databases) and infrastructure (firewalls, virtual machines, and storage). They can also define the policies that govern deployment and optimization of these resources. Products that use best practices to create these patterns and automate these manual processes exist in the market today, and additional development is under way to extend more freedom of action to developers and better manage the lifecycle of defined patterns.
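A workload "pattern" of the kind described above is, at bottom, a declarative description of components plus policies that a planner can reason about before anything is deployed. The structure below is a hypothetical sketch (field names are illustrative, not a standard schema):

```python
# A declarative workload pattern: components plus deployment policies.
pattern = {
    "name": "web-order-app",
    "components": [
        {"role": "app-server", "replicas": 3, "vcpus": 2},
        {"role": "database",   "replicas": 1, "vcpus": 8},
    ],
    "policies": {"availability": "multi-zone", "autoscale": True},
}

def total_vcpus(pattern: dict) -> int:
    """Aggregate compute demand across all components, so the planner can
    check capacity and placement before provisioning anything."""
    return sum(c["replicas"] * c["vcpus"] for c in pattern["components"])

print(total_vcpus(pattern))  # -> 14
```

Because the pattern is data rather than a runbook, the same declaration can be versioned, validated, and redeployed automatically, which is what lets these products capture best practices and manage the lifecycle of defined patterns.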
Serving the needs of modern business
The ultimate goal of the software-defined environment is to yield an application-aware infrastructure that captures workload requirements and deployment best practices, provides policy-based automation across data center environments, and includes analytics to optimize in real time.
In an era of rapid change and increasing competition, organizations need a smarter infrastructure that's agile and flexible in order to be successful. Intelligent resource scheduling, elastic data scaling, and automation of best practices are crucial to moving from static, legacy infrastructure to application-aware infrastructure. Once built, a software-defined environment simplifies business operations, makes businesses more responsive to market changes, and maximizes business outcomes.
New Tech Forum provides a means to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all enquiries to email@example.com.