Designing for Gesture and Tangible Interaction. Mary Lou Maher

design thinking as a precursor to designing tangible interaction; they found that design thinking is dependent on gesturing with objects, and recommend that the design of tangible devices consider the tradeoff between exploiting ambiguous affordances and the varied affordances of specific physical objects. The affordances of design tools facilitate specific aspects of designing. As we move away from traditional WIMP (Windows, Icons, Menus, and Pointer) interaction, we encounter new kinds of affordances in interactive digital design tools (Burlamaqui and Dong, 2015). Tangible interaction design takes advantage of natural physical affordances (Ishii and Ullmer, 1997) by exploiting the knowledge that people already have from their experience with nondigital objects to design novel forms of interaction and discovery. In this book, we focus on the affordances of the interaction that can be sensed by the interactive devices. Well-designed objects make it clear how they work just by looking at them. The successful design of embodied interaction systems does not ignore the affordances of the physical and visual aspects of the system.

      While affordances of physical objects are closely related to our experience with their physical properties, the properties of tangible interaction objects have both physical and digital relationships. In contrast to physical objects, on-screen objects are clusters of pixels without a physical dimension. A common way to give on-screen objects the appearance of physical affordances is the use of metaphor in designing interface elements (Szabó, 1995). By creating a visual reference on screen to familiar physical objects, the on-screen objects take on some of the affordances of the metaphorical object (Mohnkern, 1997).

      The use of a metaphor during design makes familiar that which is unknown or unfamiliar by connecting it with the user’s previous experience (Dillon, 2003). The most well-known example is the “desktop metaphor” used in current operating systems. Another common example of metaphor is the trash can: you can grab a file with the mouse, drag it over the trash can, and release it to delete the file. A designer can use the shape, the size, the color, the weight, and the texture of the object to invoke any number of metaphorical links (Fishkin, 2004).
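      To make the trash-can example concrete, here is a minimal browser sketch of the metaphor using the standard HTML5 drag-and-drop events. The element ids ("file-icon", "trash-can") and the deleteFile handler are hypothetical illustrations, not examples from this book.

```typescript
// Minimal sketch of the trash-can metaphor via HTML5 drag-and-drop.
// Element ids and deleteFile() are hypothetical.
const fileIcon = document.getElementById("file-icon")!;
const trashCan = document.getElementById("trash-can")!;

fileIcon.draggable = true;
fileIcon.addEventListener("dragstart", (e: DragEvent) => {
  // Carry the file's identity with the drag, mirroring the metaphor
  // of "picking up" a physical object.
  e.dataTransfer?.setData("text/plain", fileIcon.id);
});

trashCan.addEventListener("dragover", (e: DragEvent) => {
  // Permitting a drop over the trash can signals the affordance.
  e.preventDefault();
});

trashCan.addEventListener("drop", (e: DragEvent) => {
  e.preventDefault();
  const id = e.dataTransfer?.getData("text/plain");
  if (id) deleteFile(id); // hypothetical deletion handler
});

function deleteFile(id: string): void {
  document.getElementById(id)?.remove();
}
```

      The point of the metaphor is that none of this behavior needs to be explained: the user's experience with physical waste baskets supplies the interaction model.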

      Metaphors are an important concept for embodied interaction. An interaction model based on embodied metaphors implements a mapping between action and output that is consistent with the metaphorical object. Through design, we can map human behaviors and bodily experiences onto abstract concepts in interactive environments (Bakker et al., 2012). Metaphor gives users a known model for an unknown system. Metaphor can help ease the transition to a new situation, so it is well suited to systems that will be used primarily by novices, such as public displays. For embodied interaction design, in which there are few standards and fewer user manuals, the role of metaphor in design may be critical in creating easily discovered and learnable interactive systems.
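      As one illustration of mapping a bodily experience onto an abstract output, the sketch below maps a sensed body-to-display distance onto sound volume under the everyday metaphor "closer is louder." The sensor and setter functions, the distance range, and the mapping itself are assumptions made for the sketch, not a design from the book.

```typescript
// Sketch of an embodied-metaphor mapping: a sensed bodily action is
// mapped onto an abstract output parameter consistently with the
// metaphor "closer is louder." Sensor and output calls are hypothetical.
type MetaphorMapping = (sensedValue: number) => number;

// Volume rises as the user approaches, mirroring everyday experience
// with physical sound sources: 1.0 at 0.5 m, fading to 0.0 at 4 m.
const closerIsLouder: MetaphorMapping = (distanceMeters) => {
  const clamped = Math.min(Math.max(distanceMeters, 0.5), 4.0);
  return 1.0 - (clamped - 0.5) / 3.5;
};

function updateOutput(
  readDistanceMeters: () => number, // hypothetical sensor
  setVolume: (v: number) => void    // hypothetical audio output
): void {
  setVolume(closerIsLouder(readDistanceMeters()));
}
```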

      Epistemic action is exploratory motor activity aimed at uncovering information that is hard to compute mentally. Kirsh and Maglio (1994) distinguish between epistemic and pragmatic actions. A pragmatic action is the action needed to actually perform the task. Epistemic actions are actions that help the person explore the task and guide them to the solution. As such, epistemic actions enable the person to use physical objects and their environment to aid their cognition (Kirsh and Maglio, 1994; van den Hoven and Mazalek, 2011). Therefore, having a variety of tangible objects and physical arrangements may aid problem solving while interacting with a tangible interactive system. Fitzmaurice (1996) discussed the concepts of pragmatic and epistemic actions to provide the underlying theoretical support for workable graspable user interfaces (UIs). Pragmatic action refers to performatory motor activity that directs the user toward the final goal. Epistemic action refers to exploratory motor activity that may uncover hidden information that would otherwise require a great deal of mental computation.

      Kim and Maher (2007) found an increase of epistemic actions in a design task while using a tangible UI, and through a protocol analysis were also able to observe an increase in the cognitive processes typically associated with creative design. The projects in this book build on that result to design tangible interfaces based on physical objects that offer more opportunities for epistemic (i.e., exploratory) actions than pragmatic (i.e., performatory) actions. Exploration through epistemic actions enables a better perception of the environment and supports learning more about the properties of the objects. When designing gesture-based interaction, the process of discovering the interaction model can be supported by encouraging and responding to epistemic actions.
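      One way to make the epistemic/pragmatic distinction operational in a study is to tag logged interaction events and count each kind. The sketch below uses a deliberately simplified heuristic (actions that do not change the design artifact count as epistemic); the event shape and the heuristic are assumptions for illustration, not the coding scheme Kim and Maher (2007) applied.

```typescript
// Sketch of tagging logged interaction events for a protocol-analysis
// style count. Event fields and the classification rule are assumptions.
interface InteractionEvent {
  action: "rotate" | "lift" | "regroup" | "place" | "connect";
  committedToModel: boolean; // did the action change the design artifact?
}

function classify(e: InteractionEvent): "epistemic" | "pragmatic" {
  // Heuristic: manipulations that explore without changing the artifact
  // are treated as epistemic; actions that advance it are pragmatic.
  return e.committedToModel ? "pragmatic" : "epistemic";
}

function tally(
  log: InteractionEvent[]
): Record<"epistemic" | "pragmatic", number> {
  const counts = { epistemic: 0, pragmatic: 0 };
  for (const e of log) counts[classify(e)]++;
  return counts;
}
```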

      In this book we present tangible and gesture interaction design with an underlying assumption that embodiment, affordances, metaphor, and epistemic actions are critical cognitive issues that can influence the quality of the design. If the interaction design is not well conceived with respect to these cognitive issues, users suffer from frustration, discomfort, stress, and fatigue. Applying appropriate design methods is crucial and should help bridge the differences between the designer’s view of the system and the user’s mental model. It is important to conduct user research to learn how to incorporate the insights from users’ experiences into the design. In this book, user research and design methods such as gesture elicitation, protocol analysis, heuristic evaluation, prototyping, bodystorming, role-playing, personas, and image boards are described to show how designers understand the potential user mental models of the interactive system. We describe these methods in the context of their use in the four design examples: Tangible Keyboard, Tangible Models, walk-up-and-use information display, and the willful marionette.

      This book can provide HCI practitioners and researchers with new principles for better designs and new ideas for research in embodied interaction. For HCI practitioners, the book describes specific design projects and the methods used during design and evaluation. These methods are specific to designing for tangible and gesture interaction. The description of these methods will help practitioners understand how these methods are applied, and, when appropriate, how these methods are uniquely suited to tangible and gesture interaction. For the HCI researcher, the book identifies the cognitive and design research issues that are raised when designing for tangible and gesture interaction. Many of the methods described in the design projects are also applicable in a research context.

      The organization of this book is as follows: Chapter 2 presents the concepts and significance of tangible interaction design. In Chapter 3, we present a description of our experience in designing the Tangible Keyboard and Tangible Models. Gesture interaction design is presented in terms of the technology and significance in Chapter 4. We follow this with a description of our experience in designing the walk-up-and-use information display and the willful marionette in Chapter 5. In Chapter 6, we conclude with our understanding of the research challenges in designing for embodied interaction design.

      CHAPTER 2

       Tangible Interaction Design

       2.1 WHAT IS TANGIBLE INTERACTION?

      Tangible User Interfaces (TUIs) have emerged as an interface and interaction style that links the digital and physical worlds (Ullmer and Ishii, 2000; Shaer and Hornecker, 2010). An early definition of tangible interaction was introduced by Ishii and Ullmer (1997) as an extension of the idea of graspable user interfaces (UIs): they argued that tangible interaction allows users to grasp and manipulate bits by coupling digital information with physical objects and architectural surfaces.


      Figure 2.1: Graspable object. Based on Fitzmaurice (1996, p. 4).

      TUIs employ physical objects with a direct correlation to digital objects as an alternative to traditional computer input and output.
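      As a rough illustration of this coupling of bits to physical objects, the sketch below binds tagged physical tokens to digital counterparts so that moving a token on a sensing surface updates its digital object. The tag ids, object names, and tracker callback are hypothetical; a real system might identify tokens with fiducial markers or RFID tags.

```typescript
// Sketch of coupling digital information to tagged physical objects,
// in the spirit of Ishii and Ullmer's "grasping bits." All identifiers
// here are hypothetical.
interface DigitalObject {
  name: string;
  onMoved: (x: number, y: number) => void;
}

const couplings = new Map<string, DigitalObject>();

// Bind a physical token (identified by its tag) to a digital object.
couplings.set("tag-42", {
  name: "floor-plan-lamp",
  onMoved: (x, y) => console.log(`lamp moved to (${x}, ${y})`),
});

// A tracker would invoke this when a tagged token moves on the surface;
// the digital counterpart is updated in direct correlation.
function onTagMoved(tagId: string, x: number, y: number): void {
  couplings.get(tagId)?.onMoved(x, y);
}
```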

