Carnegie Mellon’s Wearable Computers project is helping to define the future not only of computing technologies but also of the use of computers in daily activities.
The goal is to develop a new class of computing systems with a small footprint that can be carried or worn by a person and can interact with computer-augmented environments. Since users are an integral part of the system, techniques such as user-centered design, rapid prototyping, and in-field evaluation are used to identify and refine user interface models that are useful across a wide spectrum of applications. Over two dozen wearable computers have been designed and built over the last decade, most of which have been tested in the field.
The application domains range from inspection, maintenance, manufacturing, and navigation to on-the-move collaboration, position sensing, and real-time speech recognition and language translation. At the core of these paradigms is the notion that wearable computers should seek to merge the user’s information with his or her workspace.
The wearable computer must blend seamlessly with existing work environments, providing as little distraction as possible. This requirement often demands replacements for the traditional desktop paradigm, which generally requires a fixed physical relationship between the user and devices such as a keyboard and mouse. Identifying effective interaction modalities for wearable computers, as well as accurately modeling user tasks in software, are among the most significant challenges in designing wearable systems.
The goals of this paper are to map wearable system functionality to application types and to summarize examples of four user interface models.