With UsersFirst’s VisualMark, screen and video capture is accomplished without installing any software on the target machine. The tools run directly from a CD-ROM and relay their streams to the Mac-based portable lab. TechSmith’s Morae requires installation of a recorder program, which can then be started manually by the user or configured to begin recording at a certain time or in response to a system event such as an application launch.
VisualMark relays both streams -- screen activity and video of the user -- to the portable lab for live observation. According to Pete Gordon, the developer of VisualMark, this method most closely approximates the fixed-lab environment in which observers may be watching subjects through a one-way mirror.
Morae’s remote viewer enables the observer to watch screen activity on the target machine but not live video of the user. Instead, it captures both streams to the user’s drive (or a LAN drive) for later analysis. Morae’s product manager, Shane Lovellette, notes that an out-of-band speakerphone can be used to relay at least the voice channel in a situation where the user is invited to think aloud while performing a scenario.
Both products enable the usability analyst to identify key interaction sequences, annotate them, and distill them into a series of highlights. The final output is a reel of short clips that shows how and why users go astray, and it provides a framework for discussing alternate scenarios.
Multiple search capabilities
In both cases, markers with annotations can be inserted during or after the recording session. Indexing the streams in this way prepares them for editing. Morae, in addition, captures window, mouse, keyboard, and Web-page events in a synchronized way. An analyst trying to find the point at which a user confirms a dialog box can, for example, search for the “OK” event. There’s also a companion text-search feature that enables the analyst to jump to the point at which a user types, for example, a ZIP code.
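To make the idea concrete, here is a minimal sketch of that kind of event-indexed search -- the event types, field names, and sample data are invented for illustration, not Morae’s actual format:

```python
from dataclasses import dataclass

@dataclass
class Event:
    timestamp: float  # seconds into the recording
    kind: str         # e.g. "mouse", "dialog", "keystroke", "webpage"
    detail: str       # e.g. button label, typed text, URL

def find_first(events, kind, detail_contains):
    """Return the timestamp of the first matching event, or None."""
    for e in events:
        if e.kind == kind and detail_contains in e.detail:
            return e.timestamp
    return None

session = [
    Event(12.4, "webpage", "http://example.com/checkout"),
    Event(31.9, "keystroke", "49001"),   # user types a ZIP code
    Event(45.2, "dialog", "OK"),         # user confirms a dialog
]

# Jump to the point where the user confirmed the dialog box...
print(find_first(session, "dialog", "OK"))        # 45.2
# ...or to the point where a ZIP code was typed.
print(find_first(session, "keystroke", "49001"))  # 31.9
```

Because every stream shares one timeline, a timestamp found this way locates the same moment in the screen recording and the video of the user.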
Morae’s “rich recording” technology not only instruments the streams for searching but also counts the number of mouse clicks or Web-page views required to complete a task. As a result, usability analysts “can focus more on the qualitative side as they’re observing tests,” TechSmith’s Lovellette says.
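Deriving such quantitative metrics from an instrumented event stream is straightforward once events carry timestamps; this sketch, with invented event names and task boundaries, shows the general shape of the computation:

```python
from collections import Counter

# Invented (kind, timestamp) pairs from a captured session.
events = [
    ("mouse_click", 3.1), ("page_view", 4.0), ("mouse_click", 9.7),
    ("mouse_click", 15.2), ("page_view", 18.5), ("mouse_click", 22.0),
]

def task_metrics(events, task_start, task_end):
    """Count each event type that falls within one task's time window."""
    return Counter(kind for kind, t in events if task_start <= t <= task_end)

m = task_metrics(events, 0.0, 20.0)
print(m["mouse_click"], m["page_view"])  # 3 2
```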
Current trends in software development will, in theory, make it even easier to correlate the different mental models of users and developers. Mouse clicks and Windows events are useful reference points, but they correlate weakly to the scenarios that software implements and that users perform. In a services-oriented architecture, however, high-level scenarios need not be inferred from low-level events -- they can be seen directly in transparent XML pipelines.
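A tiny illustration of the point: in a message-based architecture, the scenario a user is performing can be read directly off the wire rather than reconstructed from clicks. The message format here is invented:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML message flowing through a service pipeline.
message = """
<request scenario="checkout">
  <step name="enter-shipping-address"/>
  <step name="confirm-payment"/>
</request>
"""

root = ET.fromstring(message)
# The high-level scenario is explicit in the message itself...
print(root.get("scenario"))  # checkout
# ...as are its constituent steps.
print([s.get("name") for s in root.findall("step")])
```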
Similarly, a user’s mental state might be observed directly rather than having to be inferred from facial expressions and tone of voice. When a user experiences stress, for example, a synchronized biofeedback monitor could pinpoint its cause. “If there’s a heart rate spike,” Lovellette suggests, “you’d look for that point in the interaction and see what stimulus caused it.”
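The correlation Lovellette describes amounts to aligning two timestamped series. This is a hedged sketch of that alignment -- the readings, events, and spike threshold are all invented:

```python
# (timestamp, bpm) readings from a hypothetical biofeedback monitor,
# synchronized to the same clock as the interaction log.
heart_rate = [(60.0, 72), (61.0, 74), (62.0, 96), (63.0, 75)]
interactions = [(58.5, "open form"), (61.8, "error dialog shown"), (70.0, "submit")]

def stimulus_for_spike(heart_rate, interactions, threshold=90):
    """Find the first heart-rate spike and the nearest preceding event."""
    spike_t = next((t for t, bpm in heart_rate if bpm >= threshold), None)
    if spike_t is None:
        return None
    # The latest interaction at or before the spike is the likely stimulus.
    prior = [(t, e) for t, e in interactions if t <= spike_t]
    return max(prior)[1] if prior else None

print(stimulus_for_spike(heart_rate, interactions))  # error dialog shown
```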
It’s fun to speculate about such possibilities, but what’s already clear is that by streamlining a formerly cumbersome process, this new breed of capture and analysis tools can help weave user research into the normal iterative flow of software development.