One of the most interesting user interface hardware advances of the last few years is multitouch input, which can come in the form of a pad, a tablet, or a screen. Multitouch screens, manipulations, gestures, and flicks are crucial to the appeal of the iPhone, iPod Touch, and MacBook Pro user interfaces, as well as for Microsoft Surface tables and boards.
For most of us, iPhones are commonplace; Microsoft Surface devices, unfortunately, are still more familiar from TV and demos than everyday life. For Windows users, multitouch will become much more prevalent when computers designed for Windows 7 ship this fall. For Windows developers, now is the time to start supporting touch and multitouch.
Microsoft published a long article on touch support as part of its Windows User Experience Interaction Guidelines (UX Guide). If you're serious about supporting touch in your Windows applications, you should go through the whole article carefully. In the meantime, I've pulled out a few nuggets to whet your interest.
A gesture is a quick movement of one or more fingers that the computer interprets as a command, as opposed to a mouse movement, handwriting, or drawing. Gestures include pans, zooms, rotations, one- and two-finger taps, and flicks. Flicks are quick directional gestures for navigation (horizontal and vertical on Windows) and editing (diagonal on Windows). Manipulations grab objects on the screen and move them around directly, much as you'd expect them to move in the real world.
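The Windows flick rule above (horizontal and vertical for navigation, diagonal for editing) boils down to bucketing a quick stroke by its direction. Here's a minimal illustrative sketch of that classification in Python; the 8-way angle snapping is my own assumption, not the actual Windows recognizer, and `classify_flick` is a hypothetical helper name.

```python
import math

def classify_flick(dx, dy):
    """Classify a flick stroke by direction, following the Windows
    convention described above: horizontal/vertical flicks navigate,
    diagonal flicks edit. The 8-way snapping is an assumption."""
    angle = math.degrees(math.atan2(dy, dx)) % 360
    # Snap the stroke angle to the nearest of eight compass directions.
    sector = round(angle / 45) % 8
    if sector % 2 == 0:          # 0, 90, 180, or 270 degrees
        return "navigation"      # horizontal or vertical flick
    return "editing"             # diagonal flick

print(classify_flick(100, 0))    # horizontal stroke -> navigation
print(classify_flick(70, 70))    # diagonal stroke -> editing
```

A real recognizer would also check stroke speed and length before treating a movement as a flick rather than a drag.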
One of the key tenets of design for touch is to make your controls big enough to select easily with a fingertip. That means they have to be at least 6mm by 6mm, and preferably 10mm by 10mm for commonly used controls. On Windows with a 96-dpi screen, that means at least 23 by 23 pixels or 13 by 13 dialog units.
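The pixel figures follow directly from the millimeter targets: at 96 dpi, one inch (25.4mm) spans 96 pixels. A quick worked check in Python:

```python
import math

def mm_to_px(mm, dpi=96):
    """Convert a physical size in millimeters to pixels at a given
    dpi (1 inch = 25.4 mm), rounding up to whole pixels."""
    return math.ceil(mm / 25.4 * dpi)

print(mm_to_px(6))   # minimum touch target: 23 pixels
print(mm_to_px(10))  # preferred touch target: 38 pixels
```

On higher-dpi screens the same physical targets need proportionally more pixels, which is why the guidance is stated in millimeters first.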
That's part of the reason that Microsoft Office has big buttons and drop-downs on its ribbon for the most frequently used functions.
One of the easiest ways to get yourself ready for touch support is to add an inexpensive tablet to your development computer. I have a small Genius tablet; it sits to the right of my keyboard. I use it as a mouse pad the majority of the time, but when I want to use flicks and ink, I move the mouse to the back of my desk and grab the tablet's pen from its holder. I do not use the menu template software that came with the tablet, because it disables flicks.
An even better but more expensive way would be to buy a Tablet PC with multitouch. Both HP and Dell offer multitouch tablets: HP's is the TouchSmart tx2, and Dell's is the Latitude XT2.
I'll leave you with a few more of Microsoft's touch UX guidelines:
- Prefer constrained selections (such as drop-down boxes) to interfaces that require text input.
- Make users more productive by automatically zooming the input UI to 150 percent by default when touch is used for text editing.
- Don't assume that if a UI works well for a mouse, it also works well for touch.
- Do assume that if a UI works well for a finger, it also works well for a pen.
- Don't depend on hover; it works for a mouse and a pen, but not for a finger.
- Don't depend on the touch pointer to fix touch UI problems.
I've been predicting the rise of tablets and touch for about 15 years, so take what I say with a grain of salt, but I think it's finally real. Those millions of iPhone users can't be wrong, can they?