Haptic Displays


Display devices are essential for computer-based rendering, i.e. the presentation and output of data and information. From binary pixels to photorealistic three-dimensional images, visual data can, for example, be rendered via visual displays (e.g. monitors or screens of computers, smart phones, or e-book readers). Similarly, auditory displays render corresponding output, ranging from simple binary signals to multi-channel surround sound. In this context, the term Haptic Displays refers to interfaces delivering haptic feedback, typically by stimulating somatic receptors to generate a sensation of touch. The rendered feedback is generally categorized as either cutaneous or kinesthetic (proprioceptive) feedback. The former concerns sensations via skin receptors (mechanoreceptors), while the latter concerns the stimulation of receptors in muscles, tendons, and joints (proprioceptors).

Early Development of Haptic Displays

Some of the earliest displays capable of generating haptic signals were developed even before the emergence of digital electronic computers. As early as the 1920s, Gault described a device to transform speech into vibrational stimuli applied to the skin; the method enabled subjects to distinguish colloquial sentences and certain vowels. His “teletactor”, a multi-vibration unit delivering stimuli to five fingers, allowed subjects with auditory disability to perceive speech, music, and other sounds (Gault, 1927). This is a typical example of sensory substitution, where one sensory stimulus is replaced with another, e.g. sound-to-tactile substitution. In general, the development of devices and mechanisms to assist persons with visual or hearing impairment has been a driving force in the emergence of tactile devices in the 20th century. In this context, devices to display Braille – a widely employed tactile writing system – via a refreshable output have played an important role. Relatedly, in the 1960s the Optacon (Optical to Tactile Converter) was developed to enable visually impaired users to access tactile representations of black-and-white images captured by the system (Linvill and Bliss, 1966). The device comprised a photoelectric sensor and a 24 x 6 array of piezo-electrically driven tactile pins. While the Optacon presented tactile images of small areas, the Tactile Vision Sensory Substitution system developed by Bach-Y-Rita et al. provided a wider range of tactile images, employing solenoid stimulators that provided stimuli according to the visual image of a video camera (Bach-Y-Rita et al., 1969).

The first devices akin to kinesthetic displays also operated without computers. In the 1950s, Goertz developed the first master/slave remote-operation manipulator at the Argonne National Laboratory, designed to safely handle radioactive material (Goertz and Thompson, 1954). The mechanism was capable of providing force feedback from the remote site to an operator. The first notable initiatives employing computer-based haptic rendering were the GROPE projects, which started in the 1960s and extended into the 1980s (Brooks et al., 1990). In fact, one of the devices employed for haptic display was the above-mentioned Argonne Remote Manipulator. The projects focused on the rendering of force fields, for instance in the context of molecular docking. Feedback was initially provided in two, and ultimately in six, degrees of freedom.
Another development with a notable influence on the considerable recent growth of the field of haptics has been the emergence of commercially available haptic displays in the 1990s. Examples are the PHANTOM device (Massie and Salisbury, 1994), providing three degrees of freedom of force display, the Impulse Engine, which was optimized for haptic feedback in medical simulations (Jackson and Rosenberg, 1995), as well as the CyberTouch and CyberGrasp gloves; the former provides vibration feedback to the fingertips, while the latter employs exoskeletal actuation to display forces on the fingers. The wider availability of these and similar haptic displays led to a growth in haptics research addressing questions of perception, control, rendering, and applications. In turn, new results accelerated the development of new haptic displays. More in-depth coverage of the history of haptic displays can be found, for instance, in (Burdea, 1999) and (Stone, 2001).

Categories of Haptic Displays

Haptic displays can be categorized according to different characteristics and metrics. As already utilized in the previous section, an oft-employed distinction is by feedback type, i.e. kinesthetic vs. tactile displays (often complemented with a hybrid category covering both). This is often broken down further by considering the device attachment (body- vs. ground-mounted) (see e.g. Kurfess, 2005) or the device portability. A similar classification is given in (Vu and Proctor, 2011), who consider input vs. output, tactile vs. force feedback, as well as the location of the feedback (e.g. hand, arm, body). More recently, Gonzalez et al. (2013) proposed a “Hand Interaction Tree”, based on which haptic displays can also be categorized; they focused on examining the complexity and efficiency of spatial and temporal interaction on the hand. Similar notions concerning the study of hand grasps and manipulation tasks were also discussed in (Dollar, 2014). In the following, a number of key characteristics, and thus associated categories, of haptic displays are outlined.

Kinesthetic vs. tactile displays

Haptic displays can be categorized by the type of stimuli/output they generate, and correspondingly by the type of sensory receptors that are stimulated. Kinesthetic display is concerned with the rendering of forces, which are generated based on computational models, remote interaction, recordings, or data-driven approaches. Displays in this category can be further subdivided according to the type of actuation (i.e. hardware) employed for force generation, such as electric motors, pneumatic/hydraulic actuators, etc. A large number of kinesthetic displays are actuated using electric motors (see e.g. Massie and Salisbury, 1994; Campion et al., 2005; Lee et al., 2010), since these are robust and compact, offer a high bandwidth, and are easy to control. In general they exhibit high torque-to-power as well as torque-to-inertia ratios. Nevertheless, the provided torques are often smaller than those obtainable with actuation based on circulating fluids, i.e. hydraulic and pneumatic actuators. Hydraulic actuation has, for instance, been employed to build haptic displays capable of generating high forces (see e.g. Frey et al., 2008; Lee and Ryu, 2008). Along these lines, pneumatic actuation has also been used, for instance to drive pistons in an exoskeletal glove displaying forces to the fingers (Bouzit et al., 2002). Another alternative is the use of magnetic levitation for haptic display (Berkelman, 2007). Related to the notion of the actuation principle are also the transmission approach used for generating feedback, as well as the kinematic design (parallel vs. serial). This is, for instance, addressed in (Massie and Salisbury, 1994; Liu et al., 2014).

The force feedback rendered on kinesthetic displays often represents objects or physical phenomena (e.g. force fields). Concerning the former, the output comprises at least the object's shape and often also its compliance. Related to this is the notion of the degrees of freedom for input and output. For instance, three degrees of freedom of force output allow the display of forces in three dimensions arising from point-contacts on the surface of a spatial object representation. In such a setting, feedback is usually generated in response to active user input during spatial exploration of a virtual object. However, the display of haptic feedback can also serve other purposes, such as the rendering of magnetic force fields or the display of abstract data.
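
To make the point-contact setting more concrete, the following minimal sketch illustrates a simple penalty-based approach, one common way of computing 3-DoF contact forces (the spherical object, stiffness value, and function names are illustrative assumptions, not taken from any of the systems cited above): while the probe penetrates the virtual sphere, a restoring force proportional to the penetration depth is rendered along the outward surface normal.

    import numpy as np

    def render_sphere_force(probe_pos, center, radius, stiffness):
        """Penalty-based 3-DoF force for a point probe and a rigid sphere.

        Outside the sphere no force is returned; once the probe penetrates,
        a restoring force proportional to the penetration depth acts along
        the outward surface normal (a simple spring / Hooke's law model).
        """
        offset = probe_pos - center
        dist = np.linalg.norm(offset)
        if dist >= radius or dist == 0.0:
            return np.zeros(3)                  # no contact: no force
        normal = offset / dist                  # outward surface normal
        penetration = radius - dist
        return stiffness * penetration * normal

    # Example: probe 2 mm inside a 50 mm sphere, stiffness 500 N/m
    f = render_sphere_force(np.array([0.0, 0.0, 0.048]),
                            np.zeros(3), radius=0.05, stiffness=500.0)
    print(f)  # -> [0. 0. 1.]  (1 N pushing the probe back to the surface)

In an actual device, such a computation would run inside a high-rate servo loop (typically on the order of 1 kHz) to keep the rendered contact stable and crisp.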

A special subcategory of kinesthetic displays is concerned with proprioception. Typical applications involve rehabilitation or skill transfer (see e.g. Lambercy et al., 2011; Bluteau et al., 2008). The main idea is to provide forces guiding or assisting a user. The employed devices often take the form of an exoskeletal setup (see Figure 1) or an end-effector of a haptic mechanism attached to a limb (Nef et al., 2009; Ozkul et al., 2012).

Figure 1: An exoskeletal interface. With kind permission of Dietmar Heinz

Kinesthetic displays provide stimuli mainly perceived through receptors located inside muscles, tendons, and joints (proprioceptors). In contrast, tactile displays focus on the stimulation of the mechanoreceptors found in the skin. The different types of cutaneous (skin) receptors are sensitive to vibration, stretch, pressure, deformation, etc., and are thus stimulated, for instance, by surface textures. These are complemented by further receptors sensitive to temperature as well as pain stimuli. Tactile displays are commonly categorized either according to the tactile sensation they provide or to the type of employed actuation. A well-known example of the use of tactile displays is the vibration motors found in most mobile phones. Various technological solutions for providing vibration sensations to the skin have been realized, such as eccentric rotating mass actuators, linear resonant actuators, piezoelectric actuators, shape-memory alloys, or electro-active polymers. Other stimuli, such as skin stretch or local deformation, can be generated, for instance, by air-jet displays, pneumatic balloons (see e.g. Santos-Carreras et al., 2012), pin array displays (e.g. actuated by solenoids or linear motors, see Kyung and Lee, 2009), as shown in Figure 2 (Furrer, 2011), or rheological fluids (see e.g. Lee and Jang, 2011). Furthermore, temperature stimuli are typically realized with Peltier elements (Jones and Ho, 2008). A short sketch of how the actuation principle constrains the displayable vibration stimuli follows Figure 2.

Figure 2: A braille display driven by Tiny Ultrasonic Linear Actuators.
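
As an illustration of how the actuation principle constrains the displayable stimuli, the short sketch below (all numerical values are illustrative assumptions) computes the force amplitude of an eccentric rotating mass actuator. Since the centripetal force is F = m r ω², vibration intensity and frequency are inherently coupled for this actuator type, whereas, for example, linear resonant and piezoelectric actuators allow amplitude to be controlled more independently of frequency.

    import math

    def erm_force_amplitude(mass_kg, eccentricity_m, speed_rpm):
        """Peak centripetal force of an eccentric rotating mass (ERM) motor.

        F = m * r * omega^2: doubling the rotation speed quadruples the
        force, so vibration intensity and frequency are coupled for ERMs.
        """
        omega = speed_rpm * 2.0 * math.pi / 60.0   # angular speed in rad/s
        return mass_kg * eccentricity_m * omega ** 2

    # Example: a coin-type ERM spinning at 9000 rpm (i.e. a 150 Hz vibration)
    print(erm_force_amplitude(mass_kg=0.0005, eccentricity_m=0.001,
                              speed_rpm=9000.0))   # ~0.44 N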

Finally, the notion of haptic illusions should also be addressed in this context. Similar to optical illusions, these relate to stimuli that are perceived differently from the physical stimuli actually provided. For instance, low-frequency vibrations at about 80 Hz applied to a tendon result in the illusory percept of muscle elongation (Roll et al., 1989). Various other tactile illusions have been reported (e.g. the cutaneous rabbit illusion, tau effect, kappa effect, etc., see Helson, 1930; Geldard and Sherrick, 1972; Bekesy, 1958; Sherrick and Rogers, 1966), and they remain an active field of current research.

Contact vs. non-contact displays

Unlike visual and auditory stimuli, most haptic signals do not travel through air and require direct contact with a user's body to stimulate the haptic sensory receptors (mechanoreceptors and proprioceptors). Thus, the majority of haptic displays are contact-type interfaces. The contact with a user's body can differ in size, location, attachment, etc. It can be realized via hand-held tools attached to a haptic mechanism, via exoskeletal mechanisms attached to body parts, or via the direct application of actuators to the skin.

In general, force feedback interfaces with a joystick or a pen- or sphere-type end-effector are classified as tool-based haptic displays (Massie and Salisbury, 1994, see Figure 3; Jackson and Rosenberg, 1995; Grange et al., 2001). Tool-based displays can be particularly effective when the simulated task is itself tool-mediated, such as needle insertion, dental training, or minimally invasive surgery. Whereas various kinesthetic devices are based on a tool-based interface, there are few tool-based tactile displays that deliver tactile stimuli through a tool instead of via direct contact with actuators. As one example, McMahan and Kuchenbecker developed a stylus with a voice coil actuator (a Haptuator) mounted on top of the handle (McMahan and Kuchenbecker, 2014).

Figure 3: A tool-based haptic display.

Exoskeleton-type haptic displays are often wearable systems (Frisoli et al., 2009; Fontana et al., 2013). They can thus afford wider workspaces and elaborate haptic feedback during dexterous manipulation of a virtual object (e.g. CyberGrasp™). However, the complexity of donning is a drawback compared to tool-based haptic displays.

The majority of tactile displays stimulate the skin directly. Wearable tactile displays integrated into a headband, wristband, armband, glove, vest, glasses, or belt enable users to receive haptic stimuli passively (Van Erp et al., 2005; Kim et al., 2009; Kajimoto et al., 2006; Jones et al., 2006). Handheld displays with direct skin stimulation often contact a localized area, such as the fingertips (Pasquero et al., 2007).

Nevertheless, while most haptic displays rely on direct contact with body parts to stimulate the receptors, some haptic actuation principles allow for the generation of non-contact haptic stimulation, and there has recently been increased interest in these approaches. For instance, air-jets are a comparatively simple technical solution to generate non-contact haptic feedback (Tsalamlal et al., 2013; Kim et al., 2008, see Figure 4). However, it is difficult to create complex haptic sensations with them, and the range of haptic interaction is limited due to dissipation effects. A more advanced strategy has recently been reported in (Gupta et al., 2013; Sodhi et al., 2013), employing pneumatics to create air vortices, based on which haptic feedback is generated over a distance. Another non-contact approach is based on focused ultrasound waves (see e.g. Hoshi et al., 2010; Carter et al., 2013). The key idea is to employ acoustic radiation pressure to stimulate the skin at a distance; a sketch of the underlying focusing principle is given after Figure 4. Solutions employing a 2D grid of ultrasound transducers have been demonstrated, and recently reported developments have even attempted to extend this to a volumetric interaction space fully enclosed by transducers. These non-contact haptic displays provide stimuli directly to the body over a distance, and have the potential to open up new applications of haptic display and interaction.


Figure 4: 5x5 and 3x3 arrays of fingertip air-jet displays.
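
The focusing principle behind ultrasound-based displays can be illustrated with a short sketch: each transducer is driven with a phase offset compensating its propagation distance to the desired focal point, so that all wavefronts arrive there in phase and the acoustic radiation pressure on the skin is maximized. The minimal sketch below makes simplifying assumptions (point-source transducers, a 40 kHz carrier, a fixed speed of sound); the array geometry and all numerical values are illustrative, not taken from the cited systems.

    import numpy as np

    SPEED_OF_SOUND = 346.0   # m/s in air (room temperature)
    FREQUENCY = 40e3         # 40 kHz carrier, common for airborne ultrasound

    def focus_phases(transducer_positions, focal_point):
        """Per-transducer phase offsets (rad) for focusing at a point.

        Each transducer is delayed according to its propagation distance so
        that all wavefronts arrive at the focal point in phase; there the
        pressures add constructively and the radiation pressure peaks.
        """
        distances = np.linalg.norm(transducer_positions - focal_point, axis=1)
        wavelength = SPEED_OF_SOUND / FREQUENCY     # ~8.7 mm at 40 kHz
        return (2.0 * np.pi * distances / wavelength) % (2.0 * np.pi)

    # Example: a 4x4 grid with 10 mm pitch, focal point 150 mm above its center
    xs, ys = np.meshgrid(np.arange(4) * 0.01, np.arange(4) * 0.01)
    grid = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(16)])
    phases = focus_phases(grid, np.array([0.015, 0.015, 0.15]))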

Recent Trends in Haptic Display Development

Currently, the most widely used haptic displays are arguably those in mobile phones. In the first phone generations, simple vibratory stimuli perceivable through the entire case were employed to display alerts. With the advent of the current phone generation employing touch screens, there has been increased interest in providing richer and more localized haptic feedback, ideally tightly coupled with the interaction of a user. In this regard, tactile pattern design tools have been developed that allow the creation of vibration patterns with rich temporal and intensity variations, going beyond simple on/off signals.
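
As a minimal illustration of such a pattern, rather than of any particular tool's format, a vibration pattern can be represented as a sequence of (duration, intensity) segments and synthesized into an actuator drive waveform; the carrier frequency and the example pattern below are illustrative assumptions.

    import numpy as np

    def synthesize_pattern(envelope, carrier_hz=175.0, sample_rate=8000):
        """Turn a piecewise amplitude envelope into an actuator drive signal.

        `envelope` is a list of (duration_s, amplitude) segments with
        amplitude in [0, 1]; a sinusoidal carrier is scaled segment by
        segment, yielding intensity variation over time rather than a
        plain on/off buzz.
        """
        segments = []
        for duration, amplitude in envelope:
            t = np.arange(int(duration * sample_rate)) / sample_rate
            segments.append(amplitude * np.sin(2 * np.pi * carrier_hz * t))
        return np.concatenate(segments)

    # Example: a "heartbeat" pattern - two pulses of different strength
    wave = synthesize_pattern([(0.08, 1.0), (0.10, 0.0), (0.08, 0.6)])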

Fueled by the large prevalence of tablets and smart phones, a focus of recent research and development has been on providing more sophisticated tactile feedback during active exploration of a touch surface with one or more fingers. In this regard, Winfield et al. (2007) introduced the TPad; in this device a piezoelectric disk is employed to create ultrasonic vibration waves, which allow the coefficient of friction to be modulated according to the squeeze film effect. Various haptic sensations (e.g. bumps, textures, geometry, and edges) can be generated via this approach; however, active user movement is required for the sensations to appear. The technology has recently also been extended to provide feedback on commercially available tablets (see the TPad Tablet Project). Related to this, another strategy is the use of electrostatic vibration. As implemented by Bau et al. (2010), tactile information can be displayed on a surface by controlling friction through the electro-vibration principle: by applying a periodic voltage to an electrode covered by an insulator, an electrostatic attractive force is generated between a sliding finger and the insulator plate. The aforementioned approaches of providing tactile signals on flat displays are often also referred to as “surface haptics”. In the past few years, the first commercial devices (e.g. Senseg FeelScreen™, Esterline and Pacinian - HapticTouch™) employing the previously described techniques have started to emerge.
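
The electro-vibration principle lends itself to a small worked example. Under a commonly used simplified parallel-plate model (not a device-specific formula from the cited works), the applied voltage V across an insulator of effective thickness d adds an electrostatic attraction F_e = eps0 * eps_r * A * V^2 / (2 d^2) to the finger's normal force, so the felt friction becomes approximately mu * (F_n + F_e). All parameter values in the sketch below are illustrative assumptions.

    EPS0 = 8.854e-12   # vacuum permittivity in F/m

    def apparent_friction(v_volts, area_m2, gap_m, eps_r, f_normal_n, mu):
        """Apparent friction on a finger sliding over an electrovibration
        surface, using a simplified parallel-plate capacitor model.

        The electrostatic attraction F_e = eps0*eps_r*A*V^2 / (2*d^2) adds
        to the finger's normal force, and the felt friction is roughly
        mu * (F_n + F_e); because F_e grows with V^2, modulating the drive
        voltage modulates the perceived friction.
        """
        f_e = EPS0 * eps_r * area_m2 * v_volts ** 2 / (2.0 * gap_m ** 2)
        return mu * (f_normal_n + f_e)

    # Example: 1 cm^2 contact, 50 um effective insulator gap, 200 V drive
    print(apparent_friction(200.0, 1e-4, 50e-6, 3.0, f_normal_n=0.5, mu=0.5))
    # -> ~0.26 N instead of 0.25 N without voltage (F_e ~ 21 mN)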

Other strategies for providing haptic feedback on touch screens have also recently been reported. For instance, Rantala et al. (2009) placed a piezoelectric actuator underneath the touchscreen of a commercial tablet to render six braille dots on the device, making the touch screen accessible to visually impaired persons. Furthermore, the use of microfluidics in deformable tactile displays has been suggested in order to create physically embossed buttons on a screen (e.g. Tactus Tactile Layer™).


Conclusion

Over the last few decades, more and more haptic displays have appeared, in synergy with the development of new actuation and sensing technologies, mechanical designs, and control strategies. The use of haptic displays has been suggested for a wide range of diverse applications, such as education, entertainment, simulation and training, rehabilitation, tele-operation, assistive technology, prototyping, etc. However, except for the widely prevalent basic vibration actuators in mobile phones, more sophisticated haptic displays have not yet spread into everyday use. This can often be attributed to the costs involved, as well as to considerations of robustness and usability.

Haptic feedback is often more compelling in combination with matching auditory and visual stimuli, for instance when applied to enhance interaction in virtual or augmented realities. Yet, there is currently no multi-purpose haptic display available that is capable of generating highly realistic sensations of touch. In this regard, the technology still lags behind visual and auditory displays, although in some clearly-defined cases plausible haptic feedback can be displayed. In comparison to the other sensory channels, the receptors of haptic stimuli are diverse and spread over the whole body. However, haptic exploration of the environment is typically carried out with the hand and is associated with both tactile and kinesthetic sensations. Therefore, the combination of kinesthetic with tactile displays has been suggested in the past as a possible approach to enhance overall realism. In this context, the notion of indirect tool-based vs. direct hand-based interaction has to be considered: convincing future haptic displays will have to go beyond the tool-mediated rendering of forces and allow direct manual contact. In this regard, the recent developments in surface haptics as well as the direction of non-contact displays may pave the way to more natural and intuitive haptic feedback. Still, the development of haptic displays has only just left its infancy, and considerable further development remains before highly realistic haptic rendering is achieved.

References

  • Bach-Y-Rita, P; Collins, C C; Saunders, F A; White, B and Scadden, L (1969). Vision substitution by tactile image projection. Nature 221(5184): 963-964.
  • Bau, O; Poupyrev, I; Israr, A and Harrison, C (2010). TeslaTouch: Electrovibration for touch surfaces. In: Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology (pp. 283-292).
  • Berkelman, P (2007). A novel coil configuration to extend the motion range of lorentz force magnetic levitation devices for haptic interaction. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, 2007. IROS 2007. (pp. 2107-2112).
  • Bekesy, G V (1958). Funneling in the nervous system and its role in loudness and sensation intensity on the skin. The Journal of the Acoustical Society of America 30(5): 399-412.
  • Bluteau, J; Coquillart, S; Payan, Y and Gentaz, E (2008). Haptic guidance improves the visuo-manual tracking of trajectories. PLoS One 3(3): e1775.
  • Burdea, G C (1999). Keynote address: Haptics feedback for virtual reality. In: Proceedings of International Workshop on Virtual Prototyping, Laval, France (pp. 87-96).
  • Brooks, F P, Jr.; Ouh-Young, M; Batter, J J and Kilpatrick, P J (1990). Project GROPE: Haptic displays for scientific visualization. In: ACM SIGGRAPH Computer Graphics, Vol. 24, No. 4 (pp. 177-185).
  • Bouzit, M; Popescu, G; Burdea, G and Boian, R (2002). The Rutgers Master II-ND force feedback glove. In: Proceedings of the 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2002. HAPTICS 2002 (pp. 145-152).
  • Campion, G; Wang, Q and Hayward, V (2005). The pantograph Mk-II: A haptic instrument. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, 2005. (IROS 2005) (pp. 193-198).
  • Carter, T; Seah, S A; Long, B; Drinkwater, B and Subramanian, S (2013). Ultrahaptics: Multi-point mid-air haptic feedback for touch surfaces. In: Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology (pp. 505-514).
  • CyberGlove Systems LLC (2009). CyberTouchTM: Tactile feedback for the CyberGlove System. http://www.cyberglovesystems.com/sites/default/files/CyberTouch_Brochure_2009.pdf.
  • Dollar, A M (2014). Classifying human hand use and the activities of daily living. In: The Human Hand as an Inspiration for Robot Hand Development (pp. 201-216). Springer International Publishing.
  • Frey, M; Johnson, D E and Hollerbach, J (2008). Full-arm haptics in an accessibility task. In: Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2008. HAPTICS 2008 (pp. 405-412). IEEE.
  • Fontana, M; Fabio, S; Marcheschi, S and Bergamasco, M (2013). Haptic hand exoskeleton for precision grasp simulation. Journal of Mechanisms and Robotics 5(4): 041014.
  • Freeman, E; Brewster, S and Lantz, V (2014). Tactile feedback for above-device gesture interfaces: Adding touch to touchless interactions. In: Proceedings of the 16th International Conference on Multimodal Interaction (pp. 419-426). ACM.
  • Frisoli, A; Salsedo, F; Bergamasco, M; Rossi, B and Carboncini, M C (2009). A force-feedback exoskeleton for upper-limb rehabilitation in virtual reality. Applied Bionics and Biomechanics 6(2): 115-126.
  • Furrer, J (2011). Augmented white cane II: Towards an effective electronic mobility aid for the blind. Master Thesis, Rehabilitation Engineering Lab, ETH Zurich & ZHAW, Switzerland.
  • Gault, R H (1927). “Hearing” through the sense organs of touch and vibration. Journal of the Franklin Institute 204(3): 329-358.
  • Geldard, F A and Sherrick, C E (1972). The cutaneous "rabbit": A perceptual illusion. Science 178(4057): 178-179.
  • Goertz, R C and Thompson, W M (1954). Electronically controlled manipulator. Nucleonics (pp. 46-47).
  • Gonzalez, F; Gosselin, F and Bachta, W (2013). A framework for the classification of dexterous haptic interfaces based on the identification of the most frequently used hand contact areas. In: World Haptics Conference (WHC), 2013 (pp. 461-466). IEEE.
  • Grange, S; Conti, F; Rouiller, P; Helmer, P and Baur, C (2001). Overview of the Delta Haptic Device. In: Proceedings of EuroHaptics ’01.
  • Gupta, S; Morris, D; Patel, S N and Tan, D (2013). Airwave: Non-contact haptic feedback using air vortex rings. In: Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing (pp. 419-428).
  • Helson, H (1930). The tau effect—an example of psychological relativity. Science 71(1847): 536-537.
  • Hoshi, T; Takahashi, M; Iwamoto, T and Shinoda, H (2010). Noncontact tactile display based on radiation pressure of airborne ultrasound. IEEE Transactions on Haptics 3(3): 155-165.
  • Jackson, B and Rosenberg, L (1995). Force feedback and medical simulation. In: Interactive Technology and the New Paradigm for Healthcare (pp. 147-151). Amsterdam: IOS Press.
  • Jones, L A; Lockyer, B and Piateski, E (2006). Tactile display and vibrotactile pattern recognition on the torso. Advanced Robotics 20(12): 1359-1374.
  • Jones, L A and Ho, H N (2008). Warm or cool, large or small? The challenge of thermal displays. IEEE Transactions on Haptics 1(1): 53-70.
  • Kajimoto, H; Kanno, Y and Tachi, S (2006). Forehead electro-tactile display for vision substitution. In: Proceedings of EuroHaptics.
  • Kim, Y; Cha, J; Oakley, I and Ryu, J (2009). Exploring tactile movies: An initial tactile glove design and concept evaluation. IEEE MultiMedia PP(99): 1.
  • Kim, Y; Oakley, I and Ryu, J (2008). Human perception of pneumatic tactile cues. Advanced Robotics 22(8): 807-828.
  • Kurfess, T R (Ed.). (2005). Robotics and Automation Handbook. CRC Press.
  • Kyung, K U and Lee, J Y (2009). Ubi-Pen: A haptic interface with texture and vibrotactile display. IEEE Computer Graphics and Applications (1): 56-64.
  • Lambercy, O; Robles, A J; Kim, Y and Gassert, R (2011). Design of a robotic device for assessment and rehabilitation of hand sensory function. In: 2011 IEEE International Conference on Rehabilitation Robotics (ICORR) (pp. 1-6).
  • Linvill, J G and Bliss, J C (1966). A direct translation reading aid for the blind. Proceedings of the IEEE 54(1): 40-51.
  • Liu, L; Miyake, S; Maruyama, N; Akahane, K and Sato, M (2014). Development of two-handed multi-finger haptic interface SPIDAR-10. In: Haptics: Neuroscience, Devices, Modeling, and Applications (pp. 176-183). Berlin Heidelberg: Springer.
  • Lee, L F; Narayanan, M S; Mendel, F; Krovi, V N and Karam, P (2010). Kinematics analysis of in-parallel 5 dof haptic device. In: 2010 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM) (pp. 237-241).
  • Lee, C H and Jang, M G (2011). Virtual surface characteristics of a tactile display using magneto-rheological fluids. Sensors 11(3): 2845-2856.
  • Lee, Y and Ryu, D (2008). Wearable haptic glove using micro hydraulic system for control of construction robot system with VR environment. In: IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, 2008. MFI 2008 (pp. 638-643).
  • McMahan, W and Kuchenbecker, K J (2014). Dynamic modeling and control of voice-coil actuators for high-fidelity display of haptic vibrations. In: 2014 IEEE Haptics Symposium (HAPTICS) (pp. 115-122).
  • Massie, T H and Salisbury, J K (1994). The PHANToM haptic interface: A device for probing virtual objects. In: Proceedings of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Vol. 55, No. 1 (pp. 295-300).
  • Nef, T; Guidali, M and Riener, R (2009). ARMin III–arm therapy exoskeleton with an ergonomic shoulder actuation. Applied Bionics and Biomechanics 6(2): 127-142.
  • Neuroscience and Robotics Laboratory, Northwestern University (2015). TPad tablet. http://www.nxr.northwestern.edu/tpad-tablet.
  • Ozkul, F; Barkana, D E; Demirbas, S B and Inal, S (2012). Evaluation of elbow joint proprioception with RehabRoby: A pilot study. Acta Orthopaedica et Traumatologica Turcica 46(5): 332-338.
  • Pasquero, J et al. (2007). Haptically enabled handheld information display with distributed tactile transducer. IEEE Transactions on Multimedia 9(4): 746-753.
  • Rantala, J et al. (2009). Methods for presenting braille characters on a mobile device with a touchscreen and tactile feedback. IEEE Transactions on Haptics 2(1): 28-39.
  • Roll, J P; Vedel, J P and Ribot, E (1989). Alteration of proprioceptive messages induced by tendon vibration in man: A microneurographic study. Experimental Brain Research 76(1): 213-222.
  • Santos-Carreras, L; Leuenberger, K; Samur, E; Gassert, R and Bleuler, H (2012). Tactile feedback improves performance in a palpation task: Results in a VR-based testbed. Presence: Teleoperators and Virtual Environments 21(4): 435-451.
  • Sherrick, C E and Rogers, R (1966). Apparent haptic movement. Perception & Psychophysics 1(3): 175-180.
  • Sodhi, R; Poupyrev, I; Glisson, M and Israr, A (2013). AIREAL: Interactive tactile experiences in free air. ACM Transactions on Graphics (TOG) 32(4): 134.
  • Stone, R J (2001). Haptic feedback: A brief history from telepresence to virtual reality. In: Haptic Human-Computer Interaction (pp. 1-16). Berlin Heidelberg: Springer.
  • Tsalamlal, M Y; Ouarti, N and Ammi, M (2013). Psychophysical study of air jet based tactile stimulation. In: World Haptics Conference (WHC), 2013 (pp. 639-644). IEEE.
  • Van Erp, J B; Van Veen, H A; Jansen, C and Dobbins, T (2005). Waypoint navigation with a vibrotactile waist belt. ACM Transactions on Applied Perception (TAP) 2(2): 106-117.
  • Vu, K P L and Proctor, R W (Eds.) (2011). Handbook of Human Factors in Web Design. Boca Raton, FL: CRC Press.
  • Winfield, L; Glassmire, J; Colgate, J E and Peshkin, M (2007). T-PaD: Tactile pattern display through variable friction reduction. In: EuroHaptics Conference, 2007 and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. World Haptics 2007. Second Joint (pp. 421-426). IEEE.

Licensed under CC BY-SA 3.0 | Source: http://www.scholarpedia.org/article/Haptic_Displays