With ever-increasing data sets and heavier application demands, IT needs to move faster than ever to deploy and manage larger infrastructures.
Fortunately, along with these growing requirements, we now have the ability to automate many of these tasks in ways we could not just a short time ago.
Jamie Thomas is general manager of software-defined systems at IBM. In this week's New Tech Forum, Thomas takes us on a tour of a software-defined environment that can move in real time, easing the burden on IT and speeding the delivery of business-critical applications. -- Paul Venezia
Building a software-defined environment
The consumerization of IT poses unprecedented challenges to organizations. Mobile and social media, along with the large amounts of data they generate, have placed increased pressure on IT. In particular, the pace at which data arrives and its short shelf life have focused attention on the agility and optimization of IT resources.
At the same time, infrastructure within data centers has become increasingly heterogeneous and complex to manage over the last decade. Many tasks related to deploying a new solution or application are still done manually, in some cases adding weeks to schedules. On top of that, IT teams often perform their tasks with little knowledge or understanding of what data center resources an application requires.
To become effective in this new, data-driven world, organizations must move to the next generation of automation, one that understands the requirements of business applications and responds to those requirements in real time -- in other words, a software-defined environment. In a software-defined environment, IT is simplified through open standards, responsive to shifting requirements, and adaptive through policy-based automation. Such an environment is built by automating infrastructure across server compute, storage, and networking resources.
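To make the idea of policy-based automation concrete, here is a minimal sketch in Python. It is purely illustrative -- the names (Policy, ResourcePool, schedule) and the matching logic are assumptions for this example, not the API of any real software-defined platform. The point is the shape of the approach: an application declares what it needs, and automation matches that declaration to abstracted capacity instead of a specialist hand-picking hardware.

```python
# Illustrative sketch: application requirements are declared as policies,
# and a scheduler matches them to abstracted resource pools.
# All names here are hypothetical, invented for this example.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Policy:
    """What an application needs, independent of any specific host."""
    app: str
    min_cpus: int
    min_storage_gb: int

@dataclass
class ResourcePool:
    """An abstracted slice of data-center capacity (compute plus storage)."""
    name: str
    free_cpus: int
    free_storage_gb: int

def schedule(policy: Policy, pools: list) -> Optional[str]:
    """Place the application on the first pool that satisfies its policy,
    reserving the requested capacity. Returns the pool name, or None."""
    for pool in pools:
        if (pool.free_cpus >= policy.min_cpus
                and pool.free_storage_gb >= policy.min_storage_gb):
            pool.free_cpus -= policy.min_cpus
            pool.free_storage_gb -= policy.min_storage_gb
            return pool.name
    return None

pools = [ResourcePool("pool-a", free_cpus=4, free_storage_gb=100),
         ResourcePool("pool-b", free_cpus=32, free_storage_gb=2000)]

# The analytics policy exceeds pool-a's capacity, so the scheduler
# places it on pool-b -- the policy, not a person, drives placement.
placement = schedule(Policy("analytics", min_cpus=16, min_storage_gb=500), pools)
print(placement)  # pool-b
```

A real environment would evaluate far richer policies (latency, availability, compliance) and act continuously as conditions change, but the principle is the same: the application states its requirements, and the infrastructure responds in real time.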
The value of abstracted infrastructure
Many organizations develop business applications using processes that are time-consuming and labor-intensive. It all starts when developers work with business unit experts to create new applications. When these applications are deployed into production, the infrastructure on which they run is controlled by IT operations specialists who optimize for a particular resource -- such as compute or storage. These domain specialists frequently have very limited understanding of business application requirements.
This old-fashioned approach often leads to inefficient, underutilized infrastructure and diminishes the ability of IT to respond to business needs effectively.