The BYOD era may be ending already

In 2010, the bring-your-own-device notion was derided; in 2012, it became normal; in 2014, it may be seen as an odd exception

Few technology trends have forced their way into business as fast as the bring-your-own-device (BYOD) movement, which started weakly in 2007 with the release of the iPhone but gathered momentum in mid-2010 when Apple added corporate-class management and security features in iOS 4. A year later, IT's "no way will user devices get into my network" policies were in tatters, with a majority of companies having accepted BYOD for at least a portion of the employee base and for iOS devices.

The iPad sealed the deal, as companies saw direct benefit in adopting tablets, and enabling that adoption automatically brought iPhones along, since the two share the same OS and management capabilities. Motorola Mobility and Samsung added similar management capabilities to some of their Android devices, giving IT an acceptable -- if not desired -- level of manageability. That meant IT could no longer say no to all those execs sporting mobile devices they picked up from the Apple Store, Best Buy, or other retailers.


The future, it seems in spring 2012, is one where employees will acquire their own technology either directly (BYOD) or by forcing their companies to provide a selection of what users want (choose your own device, or CYOD). With the shift to user-driven technology in mobile, it's now common to hear business and IT execs talk about letting employees bring or choose their own personal computers in a few years: Windows 7, Windows 8, or OS X on a variety of vendors' hardware and in a variety of configurations, from tablets to desktops.

Or maybe not.

Tablets hold the key for BYOD's impending obsolescence
In a recent conversation, Phil Asmundson, Deloitte's vice chairman and U.S. media and telecommunications sector leader, suggested that the BYOD phenomenon will be short-lived. His reasoning: not that BYOD is a bad idea, but that the technology that makes it possible will also make it unnecessary.

He may have a point.

One of the major reasons for employees' insistence on BYOD is that they want to choose the tools they like, which are not the ones their companies issue. We all know the (accurate) stereotype: Companies issue BlackBerrys and Windows XP PCs, and users want iPhones or Androids and Macs or Windows 7 PCs. They have those devices at home, where they also do some work, so why not formalize that reality? For PCs, that's not so much a BYOD phenomenon as it is a CYOD phenomenon. But for mobile devices, it's more of a BYOD phenomenon because people don't want to carry two devices and would rather use the one they prefer.

Whether it's BYOD or CYOD, the device is increasingly dual-purpose. That's where the technology comes in.

Asmundson notes, "We see a large percentage of people leaving the laptop at home." They bring a tablet with them instead when they travel.

I know what he means; when I first started using a tablet, I brought both a laptop and a tablet with me. The tablet was my entertainment and email device in transit, but the laptop was what I worked on at the hotel and remote sites. Soon, though, I began leaving the laptop at home for trips of just a few days, as I found I could do most of my work on the tablet. Given the limited luggage capacity allowed air passengers these days, I wanted to carry only what was essential. Now I leave the laptop at home even for trips of a week or longer. Truth be told, I rarely need a laptop anymore; a small PC at my desk and a tablet (say, a Mac Mini and an iPad) would suffice 95 percent of the time.
