Display devices are essential for the computer-based rendering, i.e. the presentation and output, of data and information. From binary pixels to photorealistic three-dimensional images, visual data can, for example, be rendered via visual displays (e.g. monitors or screens of computers, smart phones, or e-book readers). Similarly, auditory displays render corresponding output, ranging from simple binary signals to multi-channel surround sound. In this context, the term Haptic Displays refers to interfaces delivering haptic feedback, typically by stimulating somatic receptors to generate a sensation of touch. The rendered feedback is generally categorized as either cutaneous or kinesthetic (proprioceptive) feedback. The former concerns sensations via skin receptors (mechanoreceptors), while the latter deals with stimulation of receptors in muscles, tendons, and joints (proprioceptors).
Some of the earliest displays capable of generating haptic signals were developed even before the emergence of digital electronic computers. As early as the 1920s, Gault described a device to transform speech into vibrational stimuli applied to the skin. The method enabled subjects to distinguish colloquial sentences and certain vowels. His “teletactor”, a unit delivering vibratory stimuli to five fingers, allowed subjects with auditory disability to perceive speech, music, and other sounds (Gault, 1927). This is a typical example of sensory substitution, where one sensory stimulus is replaced with another, e.g. sound-to-tactile substitution. In general, the development of devices and mechanisms to assist persons with visual or hearing impairment has been a driving force in the emergence of tactile devices in the 20th century. In this context, devices to display Braille – a widely employed tactile writing system – via a refreshable output have played an important role. Related to this, in the 1960s the Optacon (Optical to Tactile Converter) was developed to enable visually impaired users to access tactile representations of black-and-white images captured by the system (Linvill and Bliss, 1966). The device comprised a photoelectric sensor and a 24 x 6 array of piezoelectrically driven tactile pins. While the Optacon presented tactile images of small areas, the Tactile Vision Sensory Substitution system developed by Bach-y-Rita et al. provided a wider range of tactile images. Here, solenoid stimulators provided stimuli according to the visual image of a video camera (Bach-y-Rita et al., 1969).
The first devices akin to kinesthetic displays also operated without computers. In the 1950s, Goertz developed the first master/slave remote-operation manipulator at the Argonne National Laboratory, designed to safely handle radioactive material (Goertz and Thompson, 1954). The mechanism was capable of providing force feedback from the remote site to an operator. The first notable initiatives employing computer-based haptic rendering were the GROPE series of projects, which started in the 1960s and extended into the 1980s (Brooks et al., 1990). In fact, one of the devices employed for haptic display was the above-mentioned Argonne Remote Manipulator. The projects focused on the rendering of force fields, for instance in the context of molecular docking. Feedback was initially provided in two, and ultimately in six, degrees of freedom.
Another event with a notable influence on the recent considerable growth of the field of haptics has been the emergence of commercially available haptic displays in the 1990s. Examples are the PHANTOM device (Massie and Salisbury, 1994), providing three degrees of freedom of force display, the Impulse Engine, which was optimized for haptic feedback in medical simulations (Jackson and Rosenberg, 1995), as well as the CyberTouch and CyberGrasp gloves; the former provides vibration feedback to the fingertips, while the latter employs exoskeletal actuation to display forces on the fingers. The wider availability of these and similar haptic displays led to a growth in haptics research addressing questions of perception, control, rendering, and applications. In turn, new results also accelerated the development of new haptic displays. More in-depth coverage of the history of haptic displays can be found, for instance, in (Burdea, 1999) and (Stone, 2001).
Haptic displays can be categorized according to different characteristics and metrics. As already used in the previous section, an oft-employed distinction is by feedback type, i.e. kinesthetic vs. tactile displays (often complemented with a hybrid category covering both kinesthetic and tactile feedback). This is often broken down further by considering the device attachment (body- vs. ground-mounted) (see e.g. Kurfess, 2005) or the device portability. A similar classification is given in (Vu and Proctor, 2011), who consider input vs. output, tactile vs. force feedback, as well as the location of the feedback (e.g. hand, arm, body). Recently, Gonzalez et al. (2013) proposed a “Hand Interaction Tree”, based on which haptic displays can also be categorized. They focused on examining the complexity and efficiency of spatial and temporal interaction on the hand. Similar notions concerning the study of hand grasps and manipulation tasks were also discussed in (Dollar, 2014). In the following, a number of key characteristics of haptic displays, and the categories associated with them, are outlined.
Haptic displays can be categorized by the type of stimuli/output they generate and, correspondingly, by the type of sensory receptors that are stimulated. Kinesthetic display is concerned with the rendering of forces. These are generated based on computational models, remote interaction, recordings, or data-driven approaches. Displays in this category can be further subdivided according to the type of actuation (i.e. hardware) employed for force generation, such as electric motors, pneumatic/hydraulic actuators, etc. A large number of kinesthetic displays are actuated using electric motors (e.g. see Massie and Salisbury, 1994; Campion et al., 2005; Lee et al., 2010). This is because electric motors are robust and compact, offer a high bandwidth, and are easy to control. In general, they exhibit high torque-to-power as well as torque-to-inertia ratios. Nevertheless, the provided torques are often smaller than those obtainable with actuation based on circulating fluids, i.e. hydraulic and pneumatic actuators. Hydraulic actuation has, for instance, been employed to build haptic displays capable of generating high forces (see e.g. Frey et al., 2008; Lee and Ryu, 2008). Along these lines, pneumatic actuation has also been used, for instance to drive pistons in an exoskeletal glove displaying forces to the fingers (Bouzit et al., 2002). Another alternative is the use of magnetic levitation for haptic display (Berkelman, 2007). Related to the actuation principle are also the transmission approach used for generating feedback, as well as the kinematic design (parallel vs. serial). This is, for instance, addressed in (Massie and Salisbury, 1994; Liu et al., 2014).
The force feedback rendered on kinesthetic displays often aims to represent objects or physical phenomena (e.g. force fields). Concerning the former, the output comprises at least the object's shape and often also its compliance. Related to this is the notion of the degrees of freedom for input and output. For instance, three degrees of freedom of force output allow the display of forces in three dimensions arising from point-contacts on the surface of a spatial object representation. In such a setting, feedback is usually produced in response to active user input during spatial exploration of a virtual object. However, the display of haptic feedback can also serve other purposes, such as the rendering of magnetic force fields or the display of abstract data.
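To make the point-contact rendering principle concrete, the following minimal Python sketch computes a penalty-based reaction force for a single interaction point against a rigid virtual sphere. This is a generic textbook scheme, not the method of any particular device cited above, and the stiffness and geometry values are illustrative assumptions.

```python
import numpy as np

def render_contact_force(probe_pos, center, radius, stiffness=800.0):
    """Penalty-based 3-DoF force for a point probe against a rigid sphere.

    Outside the sphere the rendered force is zero (free space); once the
    probe penetrates, a spring force proportional to the penetration depth
    pushes it back along the surface normal (Hooke's law, F = k * d * n).
    """
    offset = probe_pos - center
    dist = np.linalg.norm(offset)
    penetration = radius - dist
    if penetration <= 0.0 or dist == 0.0:
        return np.zeros(3)          # no contact: render free space
    normal = offset / dist          # outward surface normal at the contact
    return stiffness * penetration * normal

# Probe 2 mm inside a 5 cm sphere -> ~1.6 N restoring force along +x
force = render_contact_force(np.array([0.048, 0.0, 0.0]),
                             np.array([0.0, 0.0, 0.0]), 0.05)
print(force)
```

In an actual device, such a force command would be recomputed at a high rate (typically around 1 kHz), and a softer, more compliant object is rendered simply by lowering the stiffness constant.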
A special subcategory of kinesthetic displays is concerned with proprioception. Typical applications deal with rehabilitation or skill transfer (see e.g. Lambercy et al., 2011; Bluteau et al., 2008). The main idea is to provide forces guiding or assisting a user. The employed devices often take the form of an exoskeletal setup (see Figure 1) or an end-effector of a haptic mechanism attached to a limb (Nef et al., 2009; Ozkul et al., 2012).
Kinesthetic displays provide stimuli mainly perceived through receptors located inside muscles, tendons, and joints (proprioceptors). In contrast, tactile displays focus on stimulation of the mechanoreceptors found inside the skin. The different types of cutaneous (skin) receptors are sensitive to vibration, stretch, pressure, deformation, etc., and are thus stimulated, for instance, by surface textures. This is complemented by further receptors sensitive to temperature as well as pain stimuli. Tactile displays are commonly categorized either according to the tactile sensation they provide or according to the type of actuation employed. A well-known example of the use of tactile displays is the vibration motors found in most mobile phones. Various technological solutions for providing vibration sensations to the skin have been realized, such as eccentric rotating mass actuators, linear resonant actuators, piezoelectric actuators, shape-memory alloys, or electro-active polymers. Other stimuli, such as skin stretch or local deformation, can be generated, for instance, by air-jet displays, pneumatic balloons (see e.g. Santos-Carreras et al., 2010), pin array displays (e.g. actuated by solenoids or linear motors, see Kyung and Lee, 2009), as shown in Figure 2 (Furrer, 2011), or rheological fluids (see e.g. Lee and Jang, 2011). Furthermore, temperature stimuli are typically realized with Peltier elements (Jones and Ho, 2008).
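A basic property distinguishing these vibration actuators can be illustrated with a short calculation: in an eccentric rotating mass (ERM) actuator, the vibration force is the centripetal force F = m r ω² of the rotating off-center mass, so amplitude and frequency are inherently coupled, whereas linear resonant or piezoelectric actuators can control them more independently. The sketch below evaluates this relation; the mass and eccentricity values are rough assumptions typical of small pager motors, not measurements from any cited work.

```python
import math

def erm_force(mass_kg, eccentricity_m, freq_hz):
    """Vibration force of an eccentric rotating mass: F = m * r * omega^2.

    The force grows with the square of the rotation frequency, so in an
    ERM the vibration amplitude cannot be varied independently of frequency.
    """
    omega = 2.0 * math.pi * freq_hz    # angular velocity in rad/s
    return mass_kg * eccentricity_m * omega ** 2

# Assumed pager-motor values: 0.5 g eccentric mass at 1 mm radius, 150 Hz
print(f"{erm_force(0.5e-3, 1.0e-3, 150.0):.2f} N")   # ~0.44 N
```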
Finally, the notion of haptic illusions also has to be addressed in this context. Similar to optical illusions, these relate to stimuli that are perceived differently from the physical stimuli actually provided. For instance, low-frequency vibrations at about 80 Hz applied to a tendon result in the illusory percept of muscle elongation (Roll et al., 1989). Various other tactile illusions have been reported (e.g. the cutaneous rabbit illusion, the tau effect, the kappa effect, etc.; see Helson, 1930; Geldard et al., 1972; Bekesy, 1958; Sherrick and Rogers, 1966), and haptic illusions remain an active field of current research.
Unlike visual and auditory stimuli, most haptic signals do not travel through air and require direct contact with a user's body to stimulate the haptic sensory receptors (mechanoreceptors and proprioceptors). Thus, the majority of haptic displays are contact-type interfaces. The contact with a user's body can differ in size, location, attachment, etc. It can be realized via hand-held tools attached to a haptic mechanism, via exoskeletal mechanisms attached to body parts, or via direct application of actuators to the skin.
In general, peripheral force feedback interfaces with a joystick, pen-type, or sphere-type end-effector are classified as tool-based haptic displays (Massie and Salisbury, 1994, see Figure 3; Jackson and Rosenberg, 1995; Grange et al., 2001). Tool-based displays can be particularly effective when the simulated task is itself tool-mediated, such as needle insertion, dental training, or surgical simulation of minimally invasive surgery. Whereas many kinesthetic devices are based on a tool-based interface, there are only few tool-based tactile displays that deliver tactile stimuli through a tool instead of via direct contact with actuators. McMahan and Kuchenbecker developed a stylus with a voice coil actuator (a Haptuator) mounted on top of the handle (McMahan and Kuchenbecker, 2014).
Exoskeleton-type haptic displays are often wearable systems (Frisoli et al., 2009; Fontana et al., 2013). They can thus afford wider workspaces and more elaborate haptic feedback during dexterous manipulation of a virtual object (e.g. CyberGrasp™). However, the complexity of donning is a drawback compared to tool-based haptic displays.
The majority of tactile displays stimulate the skin directly. Wearable tactile displays integrated into a headband, wristband, armband, glove, vest, glasses, or belt enable users to receive haptic stimuli passively (Van Erp et al., 2005; Kim et al., 2009; Kajimoto et al., 2006; Jones et al., 2006). Handheld displays with direct skin stimulation often contact a localized area, such as the fingertips (Pasquero et al., 2007).
Nevertheless, while most haptic displays rely on direct contact with body parts to stimulate the receptors, some haptic actuation principles allow for the generation of non-contact haptic stimulation. Recently, there has been an increased interest in these approaches. For instance, air-jets are a comparatively simple technical solution to generate non-contact haptic feedback (Tsalamlal et al., 2013; Kim et al., 2008, see Figure 4). However, it is difficult to create complex haptic sensations this way, and the range of haptic interaction is limited due to dissipation effects. A more advanced strategy has recently been reported in (Gupta et al., 2013; Sodhi et al., 2013), where pneumatics are employed to create air vortices, based on which haptic feedback is generated over a distance. Another non-contact approach is based on focused ultrasound waves (see e.g. Hoshi et al., 2010; Carter et al., 2013). The key idea is to employ acoustic radiation pressure to stimulate the skin at a distance. Solutions based on a 2D grid of ultrasound transducers have been demonstrated, and recently reported developments even extend this to a volumetric interaction space fully enclosed by transducers. These non-contact haptic displays provide stimuli directly to the body over a distance and have the potential to open up new applications of haptic display and interaction.
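The focusing principle behind such ultrasound displays can be sketched briefly: each transducer is driven with a phase offset proportional to its distance from the desired focal point, so that all wavefronts arrive there in phase and the acoustic radiation pressure concentrates at that spot. The Python sketch below computes such phases for a flat array; the 40 kHz carrier (a commonly used airborne-ultrasound frequency) and the 10 x 10 array geometry are illustrative assumptions, not the parameters of any specific system cited above.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C
FREQ = 40_000.0          # Hz; assumed airborne-ultrasound carrier

def focusing_phases(transducer_xy, focus, freq=FREQ, c=SPEED_OF_SOUND):
    """Per-transducer drive phases focusing a flat 2D array at `focus`.

    A transducer farther from the focal point must fire earlier, i.e. with
    a phase advance of 2*pi*f*d/c, so that all wavefronts arrive in phase.
    """
    positions = np.column_stack([transducer_xy,
                                 np.zeros(len(transducer_xy))])  # array in z=0
    distances = np.linalg.norm(positions - focus, axis=1)
    return (-2.0 * np.pi * freq * distances / c) % (2.0 * np.pi)

# 10 x 10 grid at 10 mm pitch, focal point 20 cm above the array center
grid = np.array([(x, y) for x in range(10) for y in range(10)]) * 0.01
phases = focusing_phases(grid - grid.mean(axis=0), np.array([0.0, 0.0, 0.2]))
print(phases.shape)   # (100,) drive phases in [0, 2*pi)
```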
Currently, the arguably most widely used haptic displays are mobile phones. In the first phone generations, simple vibratory stimuli perceivable through the entire case were employed to display alerts. With the advent of the current phone generation employing touch screens, there has been an increased interest in providing richer and more localized haptic feedback, possibly also tightly coupled with the interaction of a user. In this regard, tactile pattern design tools have, for instance, been developed that allow for the creation of vibration patterns exhibiting complex temporal and intensity variations, thus going beyond simple on/off signals.
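As a sketch of what such a pattern might look like at the signal level, the Python snippet below synthesizes a vibration waveform as a sine carrier shaped by a piecewise intensity envelope; the 175 Hz carrier, the sampling rate, and the segment timings are illustrative assumptions rather than the output of any particular design tool.

```python
import numpy as np

SAMPLE_RATE = 8000   # Hz; assumed sampling rate of the synthesized waveform
CARRIER_HZ = 175.0   # assumed carrier near a typical actuator resonance

def vibration_pattern(segments, carrier_hz=CARRIER_HZ, fs=SAMPLE_RATE):
    """Synthesize a tactile pattern from (duration_s, intensity) segments.

    Each segment holds a constant intensity in [0, 1]; the result is a sine
    carrier whose amplitude follows this piecewise envelope, i.e. a pattern
    with temporal and intensity variation rather than a plain on/off signal.
    """
    envelope = np.concatenate([np.full(int(d * fs), a) for d, a in segments])
    t = np.arange(len(envelope)) / fs
    return envelope * np.sin(2.0 * np.pi * carrier_hz * t)

# A "double tap" alert: strong pulse, pause, then a weaker pulse
wave = vibration_pattern([(0.08, 1.0), (0.10, 0.0), (0.08, 0.5)])
print(len(wave))   # 2080 samples, i.e. 260 ms at 8 kHz
```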
Fueled by the large prevalence of tablets and smart phones, a focus of recent research and development has been on providing more sophisticated tactile feedback during active exploration of a touch surface with one or more fingers. In this regard, Winfield et al. (2007) introduced the TPad; in this device a piezoelectric disk is employed to create ultrasonic vibration waves, which allow the coefficient of friction to be modulated according to the squeeze film effect. Various haptic sensations (e.g. bumps, textures, geometry, and edges) can be generated via this approach; however, active user movement is required for the sensations to appear. The technology has recently also been extended to provide feedback on commercially available tablets (see TPad Tablet Project). A related strategy is the use of electrostatic vibration. As implemented by Bau et al. (2010), tactile information can be displayed on a surface by controlling friction through the electrovibration principle. By applying a periodic voltage to an electrode covered by an insulator, an electrostatic attractive force is generated between a sliding finger and the insulator plate. The aforementioned approaches of providing tactile signals on flat displays are often also referred to as “surface haptics”. In the past few years, the first commercial devices (e.g. Senseg FeelScreen™, Esterline and Pacinian - HapticTouch™) employing the previously described techniques have started to emerge.
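The electrovibration principle can be made concrete with the common parallel-plate approximation: the applied voltage induces an electrostatic normal force F_e = ε0 εr A V² / (2 d²) between finger and insulator, which adds to the finger's own normal load and thereby increases sliding friction. The Python sketch below evaluates this model; all numerical values (contact area, effective dielectric thickness, relative permittivity, friction coefficient) are rough illustrative assumptions, not parameters from the cited work.

```python
EPS0 = 8.854e-12   # vacuum permittivity in F/m

def electrovibration_friction(v_volts, mu=0.5, f_normal=0.5,
                              area=1.0e-4, eps_r=3.0, thickness=50e-6):
    """Sliding friction under electrovibration (parallel-plate model).

    The electrostatic attraction F_e = eps0*eps_r*A*V^2 / (2*d^2) acts as
    additional normal load, so friction becomes mu * (F_n + F_e). Since
    F_e scales with V^2, modulating the drive voltage modulates friction.
    """
    f_electro = EPS0 * eps_r * area * v_volts ** 2 / (2.0 * thickness ** 2)
    return mu * (f_normal + f_electro)

for v in (0.0, 50.0, 100.0):
    print(f"{v:5.1f} V -> friction {electrovibration_friction(v):.4f} N")
```

Note that the added electrostatic load is small in absolute terms (a few millinewtons in this example); the perceptual effect arises from periodically modulating it as the finger slides.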
Other strategies for providing haptic feedback on touch screens have also been reported recently. For instance, Rantala et al. (2009) placed a piezoelectric actuator underneath the touchscreen of a commercial tablet to render six Braille dots on the device. This was used to make the touch screen device accessible to visually impaired persons. Furthermore, the use of microfluidics in a deformable tactile display has been suggested in order to create physically embossed buttons on a screen (e.g. Tactus Tactile Layer™).
Over the last few decades, more and more haptic displays have appeared, in synergy with the development of new actuation and sensing technologies, mechanical designs, and control strategies. The use of haptic displays has been suggested in a wide range of diverse applications, such as education, entertainment, simulation and training, rehabilitation, tele-operation, assistive technology, prototyping, etc. However, except for the widely prevalent basic vibration actuators in mobile phones, more sophisticated haptic displays have not yet spread into everyday use. This can often be attributed to cost as well as considerations of robustness and usability. Haptic feedback is often more compelling in combination with matching auditory and visual stimuli, for instance when applied to enhance interaction in virtual or augmented realities. Yet, there is currently no multi-purpose haptic display available that is capable of generating highly realistic sensations of touch. In this regard, the technology is still lagging behind visual and auditory displays. Nevertheless, in some clearly-defined cases plausible haptic feedback can be displayed. In comparison to the other sensory channels, receptors of haptic stimuli are diverse and spread over the whole body. However, haptic exploration of the environment is typically carried out with the hand and is associated with both tactile and kinesthetic sensations. Therefore, the combination of kinesthetic with tactile displays has been suggested in the past as a possible approach to enhance overall realism. In this context, the notion of indirect tool-based vs. direct hand-based interaction has to be considered. Convincing future haptic displays will have to go beyond the tool-mediated rendering of forces and allow direct manual contact. In this regard, the recent developments in surface haptics as well as the direction of non-contact displays may pave the way to more natural and intuitive haptic feedback. Still, the development of haptic displays has only just left its infancy, and considerable further development remains before highly realistic haptic rendering is achieved.