5. The Internet of everything: The idea here is that we are building on pervasive computing, where cameras, sensors, microphones, and image recognition -- everything -- are now part of the environment. Remote sensing of everything from electricity to air-conditioning use is now part of the network. Increasingly intelligent devices also raise issues such as privacy. Eventually, IT will need central, unified management of all these devices, Cearley says.
6. Next-generation analytics: Performance and cost improvements have reached the point, Cearley says, where most enterprises can afford to run analytics and simulation for every action taken in the business. Not only will data center systems be able to do this, but mobile devices will have access to the data and enough processing capability to perform analytics themselves, potentially enabling optimization and simulation everywhere. Going forward, IT can focus on developing analytics that enable and track collaborative decision making.
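The "simulation for every action" idea can be sketched as a tiny Monte Carlo estimate, light enough to run even on a mobile device. This is a minimal illustration with made-up inputs (`success_prob`, `payoff`, `cost` are hypothetical), not a real analytics pipeline:

```python
import random

def simulate_decision(success_prob, payoff, cost, trials=10_000, seed=42):
    """Monte Carlo estimate of the expected net value of one business action.

    A minimal sketch: success_prob, payoff, and cost are hypothetical inputs;
    real next-generation analytics would derive them from enterprise data.
    """
    rng = random.Random(seed)  # seeded for reproducible results
    total = 0.0
    for _ in range(trials):
        # Each trial: the action either pays off or it doesn't; cost is always paid.
        outcome = payoff if rng.random() < success_prob else 0.0
        total += outcome - cost
    return total / trials

# Expected value is roughly success_prob * payoff - cost (here, about 10.0).
estimate = simulate_decision(success_prob=0.3, payoff=100.0, cost=20.0)
```

The point of the sketch is the scale of the computation: ten thousand trials of simple arithmetic is trivial for a modern phone, which is what makes "analytics everywhere" plausible.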
7. Big data: Big data has quickly emerged as a significant challenge for IT leaders. The term became popular only in 2009, yet by February 2011 a Google search on "big data" yielded 2.9 million hits, and vendors now advertise their products as solutions to the big data challenge. The key thing enterprises have to realize is that they simply can't store it all. New techniques such as Apache Hadoop can handle extreme data, but companies will have to develop new skills to use these technologies effectively, Cearley says.
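The technique behind Hadoop is the MapReduce pattern: a map phase emits key/value pairs and a reduce phase aggregates them by key. A minimal local sketch of the classic word count (on a real cluster, Hadoop would shard both phases across many machines):

```python
from collections import defaultdict

def map_phase(lines):
    """Map step: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Reduce step: group pairs by key and sum the counts."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data big challenge", "data grows"]
word_counts = reduce_phase(map_phase(docs))
# word_counts == {'big': 2, 'data': 2, 'challenge': 1, 'grows': 1}
```

Because each map call touches only one record and each reduce key is independent, the same code structure scales from two strings to petabytes -- which is exactly why the pattern suits "extreme data."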
8. In-memory computing: We will see heavy use of flash memory in consumer devices, entertainment devices, equipment, and other embedded IT systems. Flash also offers a new layer of the memory hierarchy in servers and client computers with key advantages -- space, heat, performance, and ruggedness among them. Unlike RAM, the main memory in servers and PCs, flash memory is persistent even when power is removed. In that respect it resembles disk drives, where we place information that must survive power-downs and reboots, yet it approaches the speed of memory, far faster than a disk drive. As lower-cost -- and lower-quality -- flash moves into the data center, software that optimizes the use of flash and minimizes endurance cycles becomes critical. Users and IT providers should look at in-memory computing as a long-term technology trend that could have a disruptive impact comparable to that of cloud computing, Cearley says.
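One common software strategy for minimizing endurance cycles is write coalescing: absorb repeated updates in RAM and flush whole batches, so many logical writes to the same block consume a single erase/program cycle. A minimal sketch of the idea (an illustration only, not a real flash translation layer -- the class and its counters are hypothetical):

```python
class WriteCoalescer:
    """Sketch of software-side flash wear reduction: buffer updates in RAM
    and flush in batches, so repeated writes to one block cost one cycle.
    """
    def __init__(self, backing):
        self.backing = backing      # simulated flash: dict of block -> data
        self.buffer = {}            # RAM staging area
        self.flash_writes = 0       # erase/program cycles consumed so far

    def write(self, block, data):
        self.buffer[block] = data   # rewrites are absorbed in RAM

    def flush(self):
        for block, data in self.buffer.items():
            self.backing[block] = data
            self.flash_writes += 1  # one cycle per dirty block, not per write
        self.buffer.clear()

flash = {}
w = WriteCoalescer(flash)
for i in range(100):
    w.write(block=7, data=f"version {i}")  # 100 logical writes to one block
w.flush()
# Only 1 flash write consumed instead of 100; flash[7] holds "version 99".
```

The trade-off is durability: data sitting in the RAM buffer is lost on power failure, which is why real systems pair this with journaling or capacitor-backed caches.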
9. Extreme low-energy servers: What if you could replace 10 virtual machines in one box with 40 tiny, slow physical servers that use very little energy? There is a call for this type of computing to handle big data; thousands of these little processors could work on a Hadoop job, for example, Cearley says. Gartner says that 10 to 15 percent of enterprise workloads are suited to this model. But moving an application from 10 images to 40 slower, less capable machines delivers on that promise only if the software performs just as well. Server technologies are going to change to handle big data.
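Whether 40 slow machines can match 10 fast images comes down to how well the workload parallelizes. An Amdahl's-law-style sketch with hypothetical numbers (the speeds and the 95 percent parallel fraction are assumptions for illustration, not Gartner figures):

```python
def throughput(machines, per_machine_speed, parallel_fraction):
    """Amdahl-style estimate of relative throughput.

    Hypothetical model: the serial fraction of the work runs at a single
    machine's speed, while the parallel fraction scales across all machines.
    """
    serial = 1.0 - parallel_fraction
    # Time for one unit of work, where a speed-1.0 machine alone takes 1.0.
    time = serial / per_machine_speed + parallel_fraction / (machines * per_machine_speed)
    return 1.0 / time

# 10 fast images (speed 1.0) vs 40 machines at one-quarter speed:
# aggregate capacity is identical (10 * 1.0 == 40 * 0.25).
fast = throughput(machines=10, per_machine_speed=1.0, parallel_fraction=0.95)
slow = throughput(machines=40, per_machine_speed=0.25, parallel_fraction=0.95)
# fast ≈ 6.9, slow ≈ 3.4: the slow cluster loses because its serial
# portion crawls on a quarter-speed core.
```

At a parallel fraction of 1.0 the two configurations tie, which is the mathematical form of the caveat above: the low-energy approach pays off only for workloads (like Hadoop jobs) that are almost perfectly parallel.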
10. Cloud computing: This topic went from No. 1 last year to No. 10 this year, but it's still an important trend. It will become the next-generation battleground for the likes of Google and Amazon. Going forward, enterprise IT will be concerned with developing hybrid private/public cloud apps and improving security and governance, Cearley says.