But until all apps are designed to support touch gestures, and the OS makes more use of them (as the iPhone OS does), it's simply easier to stick with the mouse because you know it works everywhere.
Issue 2: PC UIs aren't finger-friendly
In using a Dell Studio One desktop and an HP TouchSmart desktop -- whose touchscreens, based on NextWindow's technology, are quite responsive -- I found another limitation to the adoption of touch technology in its current guise: The Windows UI really isn't touch-friendly. A finger is a lot bigger than a mouse pointer or pen tip, so it's not as adept at making fine movements.
Also, on a touchscreen, your hand and arm obscure your view of where your fingertip actually is, making it hard to actually touch the intended radio button, close box, slider, or what-have-you. It doesn't help that these elements are often small. And there's no tactile feel to substitute for the lost visual feedback.
But the issues of using touch gestures go beyond the visibility and size of UI controls. The way the controls work is often not finger-friendly. Take as an example Windows 7's wireless LAN setup. It has some big buttons to select a desired network, so it's natural to just press the desired one. And sometimes that works, but often these visual buttons are really the equivalent of radio buttons -- item selectors -- and you then have to tap the Next button. That's not the kind of direct manipulation that touch assumes. When you work with something with your hands, the manipulation is direct. But most apps are designed for interaction with keyboards and mice, and aren't so direct (to prevent accidental selections and the like, since it's really easy to move a mouse unintentionally).
The result is that using touch is often an awkward process. Unlike an iPhone's apps, Windows or Mac OS X apps weren't designed for touch, and neither the OSes nor the apps are intended to adjust themselves for this input method.
Issue 3: Gesture-based computing needs a better surface
I was surprised to discover a third issue: the touch surface itself. I love using the touchscreen on my iPod Touch, but I rarely liked using the touchscreens on the Dell and HP desktops.
The issue wasn't the screen per se, but its location. A monitor is in front of you, a good foot or two away. That means holding your hand and arm out, raised and extended. That's not comfortable for long durations. Try this: Move your mouse under your monitor, then see how long you can stand it. It also means a lot of ungainly arm movement to get to the keyboard, which few apps let you ignore. (Windows 7 does have a handwriting app that is OK to write with, but impossible to edit with. And writing more than a dozen words at a time is likely to make your fingers and arm hurt.)
There's also the issue of the parallax effect: The layer of glass above the LCD's crystals creates a slight gap between what you touch and what you see. At the edges of a screen, the distance is enough to throw off your hand-eye coordination -- a reason that so many iPhone users have trouble typing on the virtual keyboard's side keys such as Q, A, P, and L. Over time, your brain adjusts, of course.
The Mac OS's reliance on a trackpad for touch input lessens these issues. Your hand is in a more natural location -- on a trackpad, not the screen -- so you can easily follow the mouse pointer to see where you are, just as when using a mouse. And you can easily switch to the keyboard and even a second input device such as a pen or mouse. (Adesso does make a touchpad for PCs, but it doesn't use the Windows 7 gestures, relying instead on its own. I could not find any external touchpads for Macs.)