Oliver S. Schneider, Hasti Seifi
3.1 Introduction
Our haptic sense comprises both taction, or cutaneous information obtained through receptors in the skin, and kinesthetic awareness of body forces and motions. Broadly speaking, haptic interfaces to computing systems are anything a user touches or is touched by, to control, experience, or receive information from something with a computer in it. A keyboard and mouse, a physical button on a kitchen blender, and the glass touchscreen on your smartphone are energetically passive haptic interfaces: no external energy is pumped into the user's body from a powered actuator. Most readers will have encountered energetically active haptic feedback as a vibrotactile (VT) buzz, or as forces from a gaming joystick, a force feedback device in a research lab, or a physically interactive robot. Much more is possible.
When we bring touch into an interaction, we invoke characteristics that are unique or accentuated relative to other modalities. Like most powerful design resources, these traits also impose constraints. The job of a haptic designer is to understand these “superpowers” and their costs and limits, and then to deploy them for an optimally enriched experience.
Both jobs are relatively uncharted, even though engineers have been building devices with the explicit intention of haptic display for over 25 years, and psychophysicists have been studying this rich, complex sense for many decades. What makes it so difficult? Our haptic sense is really many different senses, neurally integrated; meanwhile, the technology of haptic display is anything but stable, with engineering challenges of a different nature than those for graphics and sound. In the last few years, technological advances from materials to robotics have opened new possibilities for the use of energetically active haptics in user interfaces, our primary focus here. Needs are exposed at a large scale by newly ubiquitous technology: "touch" screens crying out for physical feedback, and high-fidelity virtual reality visuals whose effectiveness is stalled without force display.
Glossary
Active [human sensing]: On the human side, active sensing entails deliberate and attentionally focused seeking of information through the haptic sense, usually combined with motor movement. People use different exploratory procedures to examine properties of objects (e.g., weight, texture, shape) [Klatzky et al. 2013, Lederman and Klatzky 1987].
Ambient interfaces. Information displays that operate in the user’s attentional periphery [Weiser and Brown 1996], only moving into awareness either when they increase in salience because of urgency, or when the user chooses to focus on them.
Crowdsourcing. The leveraging of large communities of users to perform computation, generate ideas, or provide feedback on media [Kittur et al. 2008]. For example, many researchers and UX designers use online tools such as Amazon’s Mechanical Turk (http://www.mturk.com) to quickly gather feedback on designs or questions that can be shared visually.
Cutaneous sensations come from the skin and can include vibration, touch, pressure, temperature, and texture [Lederman and Klatzky 1987].
Design activity. A collection of related tasks performed during media design; grouping tasks this way can help when thinking about the design process. We suggest browse, sketch, refine, and share as distinct activities or stages of haptic making.
Energetically active [haptic display]: In contrast to energetically passive displays, an energetically active display can be nonconservative, depending on its control law, and has the capacity to pump more energy (sourced from a wall plug or battery) into the interaction than it takes out. This can manifest as instability such as jitter and growing oscillations.
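The following simulation is a minimal sketch of this effect (not from the chapter; the virtual-wall scenario, parameter values, and variable names are illustrative assumptions): a stiff virtual wall rendered at a slow servo rate does net positive work on the handle, which leaves the wall faster than it arrived.

# Hypothetical sketch: a 1-DoF handle bouncing off a sampled-data virtual wall.
# Because the zero-order-hold force lags the true penetration, the wall does net
# positive work on the handle, illustrating how an energetically active display
# can feed growing oscillations. All parameter values are assumptions.

m = 0.1       # handle mass [kg]
k = 5000.0    # virtual wall stiffness [N/m]
T = 0.002     # haptic servo period [s]; deliberately slow to exaggerate the effect
dt = 1e-5     # physics integration step [s]

x, v = 0.005, -0.2             # start 5 mm outside the wall (at x = 0), moving toward it
force, t_next, t = 0.0, 0.0, 0.0

print(f"kinetic energy before contact: {0.5 * m * v**2 * 1e3:.3f} mJ")
while t < 0.2:                         # 200 ms: approach, bounce, retreat
    if t >= t_next:                    # the controller only samples every T seconds
        force = -k * x if x < 0 else 0.0
        t_next += T
    v += (force / m) * dt              # semi-implicit Euler integration of the handle
    x += v * dt
    t += dt
print(f"kinetic energy after contact:  {0.5 * m * v**2 * 1e3:.3f} mJ")

With a fast enough servo rate, or enough inherent damping in the device, the same control law can instead satisfy the passivity property defined in the next entry.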
Energetically passive [haptic display]: On the machine side, a display is energetically passive if it is “conservative”—i.e., it puts no more energy into the interaction than it takes out [Colgate and Brown 1994]. A trivial example is a device without access to external or long-term stored power: for example, when you compress a spring, the device stores only the energy you place into it, and when you release the spring, this simple interface restores the same energy back to your hand that you put into it. Such a device will not be unstable or jittery; and thus, to say that a haptic display feels “passive” is usually a positive. A brake is one kind of (potentially) powered haptic display that cannot, by design, ever be active: it can only remove energy from the interaction, never add to it, and thus while it is limited in what it can do, it usually feels steady and stable.
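This notion of conservativeness is often formalized with a standard passivity inequality (a textbook formulation, not spelled out in this glossary; the symbols below are introduced only for illustration):

\[
\int_0^{t} f(\tau)\,\dot{x}(\tau)\,\mathrm{d}\tau \;\geq\; -E_0 \qquad \text{for all } t \geq 0,
\]

where \(f\) is the force the user applies at the interaction point, \(\dot{x}\) is the resulting velocity, and \(E_0 \geq 0\) is the energy initially stored in the device. The integral is the net energy the user has delivered to the display up to time \(t\); passivity means the display can never give back more than this plus its initial store, which is why a passive display cannot sustain jitter or growing oscillations on its own.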
Facet. A set of related properties describing an aspect of an object [Fagan 2010]. In haptics, multiple facets can be used in combination to capture the different cognitive schemas that people unconsciously use to describe and make sense of haptic signals. For example, to describe vibrations, people might at different moments choose physical, sensory, or emotional words, metaphors, or usage examples.
Force feedback usually involves displays that move and can push or pull on part of the body. They generally need to be grounded against something—a table, a car chassis, or another part of the user’s body—to provide this resistance or motive force.
Haptic is a term referring to both cutaneous sensations gained from the skin, also referred to as tactile feedback, and the kinesthetic sense, which involves internal signals sent from the muscles and tendons about the position and movement of a limb [Goldstein 1999, Lederman and Klatzky 1997].
Haptic feedback comprises devices that display to either the kinesthetic or the cutaneous sense, or to both.
Haptic icon is one of several terms used to refer to structured, abstract messages (tactile or force) that encode information [MacLean and Enriquez 2003, Brewster and Brown 2004]. More specific terms (e.g., tactile icon, tacton) refer to such encoding in particular haptic submodalities.
Haptic interfaces are devices that display force feedback or tactile feedback in the course of an interaction.
Haptic phonemes. See haptic icon.
Haptic vocabulary. A set of haptic signals paired with their meanings, which as a group convey a set of application-related information elements to users. To be usable and learnable, a haptic vocabulary will have some kind of structure or naturally apparent meaning that a user can scaffold to quickly learn more elements once the first few have been understood [MacLean 2008b].
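As a concrete sketch of what such structure might look like in software (hypothetical; the icon names, meanings, and parameter values below are invented for illustration, not taken from the chapter), each haptic icon can be represented as a sequence of vibrotactile segments, and related meanings can reuse a base rhythm:

# Hypothetical sketch of a small, structured vibrotactile vocabulary.
# Each icon is a list of (frequency_hz, amplitude_0_to_1, duration_s) segments;
# related meanings reuse a base rhythm at different intensities, so learning a
# few icons scaffolds learning the rest. All names and values are assumptions.

from dataclasses import dataclass
from typing import Dict, List, Tuple

Segment = Tuple[float, float, float]   # (frequency in Hz, amplitude 0..1, duration in s)

@dataclass
class HapticIcon:
    name: str
    meaning: str
    segments: List[Segment]

    def scaled(self, name: str, meaning: str, gain: float) -> "HapticIcon":
        """Derive a related icon by scaling amplitude while preserving the shared rhythm."""
        return HapticIcon(name, meaning,
                          [(f, min(1.0, a * gain), d) for f, a, d in self.segments])

# Base pattern: two short pulses separated by a pause (a heartbeat-like rhythm).
base = HapticIcon("notify", "new message",
                  [(250, 0.4, 0.08), (0, 0.0, 0.10), (250, 0.4, 0.08)])

vocabulary: Dict[str, HapticIcon] = {
    icon.meaning: icon
    for icon in [
        base,
        base.scaled("notify-urgent", "message from a starred contact", gain=2.0),
        HapticIcon("reminder", "calendar event in 5 minutes",
                   [(175, 0.3, 0.25)]),   # one long, lower buzz: new rhythm, new meaning family
    ]
}

for meaning, icon in vocabulary.items():
    print(f"{icon.name:15s} -> {meaning}: {icon.segments}")

Keeping the mapping from rhythm family to meaning family explicit is one way to give users the kind of naturally apparent structure the definition above calls for.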
Individual differences. Variation among users in sensing, interpreting and valuing a haptic signal.
Kinesthetic signals are sent from muscles and tendons. They include force production, body position, limb direction, and joint angle [Goldstein 1999, Lederman and Klatzky 1997].
Passive [human sensing]: In contrast to active human sensing, in passive sensing the recipient feels a touch that has not been sought and may not be anticipated. Its interpretation is thus not framed by intent or an exploratory purpose, and may be experienced and interpreted differently. In design terms, active touch may yield better information transfer, but requires both a higher level of cognitive engagement and access to the display with a body element that can explore, such as a finger.
Schema. An existing mental structure or set of ideas that can be used to make sense of, interpret, or frame design for a haptic sensation, e.g., recognizing two pulses as a heartbeat [Fagan 2010, Seifi et al. 2015].
Tactile icon. See haptic icon.
Tactile feedback comprises devices that render a percept of the cutaneous sense: for example, using vibration, temperature, texture, or other material properties to encode information. This term is often used interchangeably with more specific types of tactile feedback, e.g., vibrotactile feedback and thermal feedback.
Tacton. See haptic icon.
Thermal feedback. Tactile feedback that uses temperature to encode information; see tactile feedback.