The IT paradox: A diminished role in technology, but greater clout in the business

As technology becomes easier to use, it becomes more complex internally, and IT is less necessary in some ways, yet more essential in others

There is a paradox in the technology that IT employs and deploys. As it becomes easier to use and simpler to manage, it is actually increasing in complexity. And there is a paradox within this paradox concerning how IT relates to the business. More on that in a bit.

The simple/complex paradox is nothing new. Think back to the days of DOS, with its command-line user interface. You couldn't just sit down and use a PC running DOS without a good deal of training first. What was that blinking cursor demanding of you? PCs were relatively difficult to use, but the underlying operating system was relatively straightforward. At about the same time, Apple was selling computers with a graphical user interface that made their use much more intuitive. But the Macintosh operating system was several degrees more complex than DOS.


Eventually, DOS gave way to Windows, and the race to make computing easier was on. But with every improvement that made things easier for users, the internal code became more complex.

Since then, the divergence between what users experience and what is required to deliver that experience has grown ever larger. Today, it manifests itself in several disruptive technologies, including cloud computing. For the typical end-user, cloud-based services such as Salesforce.com, Apple's iCloud, and Google's Gmail are easy to use, intuitive, flexible, and cost-effective. Internally, however, cloud-based applications demand innovations and technical advances that traditional on-premise applications do not, chiefly to meet their inherent multitenancy requirements. Put simply, a cloud-based application needs more logic to support multiple users across multiple organizations, serving each its personalized content while protecting and partitioning each organization's data.
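To make that extra logic concrete, here is a minimal sketch of tenant-aware data access. All names here are illustrative, not drawn from Salesforce.com or any real product; the point is only that every read and write must be scoped to a tenant.

```python
# Minimal sketch of multitenant data partitioning.
# All class and tenant names are hypothetical.

class TenantStore:
    """Partitions records by tenant so one organization's users
    can never read another organization's data."""

    def __init__(self):
        self._data = {}  # tenant_id -> {record_id: record}

    def put(self, tenant_id, record_id, record):
        self._data.setdefault(tenant_id, {})[record_id] = record

    def get(self, tenant_id, record_id):
        # Lookups are always scoped to the caller's tenant;
        # there is no cross-tenant access path.
        return self._data.get(tenant_id, {}).get(record_id)

store = TenantStore()
store.put("acme", "lead-1", {"name": "Alice"})
store.put("globex", "lead-1", {"name": "Bob"})

# The same record ID resolves differently per tenant,
# and an unknown record simply comes back as None.
print(store.get("acme", "lead-1"))    # {'name': 'Alice'}
print(store.get("globex", "lead-1"))  # {'name': 'Bob'}
print(store.get("acme", "lead-99"))   # None
```

In a single-tenant, on-premise application, none of this partitioning machinery is needed, which is exactly the hidden complexity the paradox describes.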

Beyond the software itself, the same thing is happening in the broader cloud infrastructure layers. Where once IT had two deployment options -- in-house data centers or IT outsourcing -- it now faces five: traditional on-premise, public cloud, internal private cloud, hosted private cloud, and traditional IT outsourcing. What's more, these five models are ideally managed together, matching each application to the environment that best balances reliability, availability, and security with cost-effectiveness and flexibility.
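That matching exercise can be pictured as a simple weighted score across the five models. The ratings and weights below are entirely hypothetical placeholders for whatever an IT shop actually measures; the sketch only shows the shape of the decision.

```python
# Illustrative sketch: scoring the five deployment models for one
# application. All ratings and weights are made-up examples.

MODELS = ["on-premise", "public cloud", "internal private cloud",
          "hosted private cloud", "IT outsourcing"]

# Hypothetical 0-10 ratings per model on three of the parameters.
RATINGS = {
    "on-premise":             {"security": 9, "cost": 4, "flexibility": 3},
    "public cloud":           {"security": 5, "cost": 9, "flexibility": 9},
    "internal private cloud": {"security": 8, "cost": 5, "flexibility": 7},
    "hosted private cloud":   {"security": 7, "cost": 6, "flexibility": 7},
    "IT outsourcing":         {"security": 6, "cost": 6, "flexibility": 4},
}

def best_model(weights):
    """Return the model with the highest weighted score for an
    application with the given priorities."""
    def score(model):
        return sum(weights[p] * RATINGS[model][p] for p in weights)
    return max(MODELS, key=score)

# An application that prizes security lands in a different
# environment than one that prizes flexibility.
secure_app = best_model({"security": 3, "cost": 1, "flexibility": 1})
agile_app = best_model({"security": 1, "cost": 1, "flexibility": 3})
print(secure_app, "|", agile_app)
```

The real optimization is messier, of course -- reliability, availability, data gravity, and regulation all weigh in -- but the principle is the same: per-application placement rather than a one-size-fits-all data center.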

Smartphones and tablets are another example. In ease of use, today's devices are vastly superior to their predecessors of just a few years ago. Internally, of course, their processing power and built-in features and functions have grown ever more powerful and complex. The Apple A5X inside the third-generation iPad, for example, is a system-on-a-chip that combines a dual-core 1GHz CPU, a graphics processing unit, and other hardware.

Meanwhile, software development has undergone transformational change: from two-tiered client/server architectures, to three-tiered Web architectures, to today's paradigm, in which applications are expected to be collaborative, location-aware, and on-demand, and so must incorporate interfaces for social, mobile, and cloud extensions.
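The three-tier shape can be sketched in a few lines: a data tier, a logic tier, and a presentation tier that speaks JSON, which is what makes the same back end consumable by mobile, social, and cloud clients. Everything here -- the store, the document, the function names -- is a hypothetical illustration, not any particular framework's API.

```python
# Illustrative sketch of the three tiers as plain functions.
import json

DB = {"42": {"title": "Q3 report", "shared_with": []}}  # data tier

def get_document(doc_id):
    """Logic tier: business rules live here (lookups, validation)."""
    doc = DB.get(doc_id)
    if doc is None:
        raise KeyError(doc_id)
    return doc

def api_get_document(doc_id):
    """Presentation tier: in a real app this would be JSON over HTTP,
    the same endpoint serving Web, mobile, and social clients."""
    try:
        return 200, json.dumps(get_document(doc_id))
    except KeyError:
        return 404, json.dumps({"error": "not found"})

status, body = api_get_document("42")
print(status, body)
```

The two-tier client/server era fused the logic and presentation tiers into a fat desktop client; splitting them out is what lets one code base face many kinds of front end.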
