Haptic exploration

Haptic exploration refers to purposive action patterns that perceivers execute in order to encode properties of surfaces and objects; these patterns arise spontaneously and appear to optimize information uptake.

Exploratory procedures

In 1987, Lederman and Klatzky described a set of specialized patterns of exploration, called exploratory procedures (EPs). These exploratory patterns are linked to specific object properties, in two respects: The EP associated with an object property is (a) executed spontaneously when information about that property is desired, and (b) appears to optimize information uptake about that property. A basic set of EPs, along with their associated properties and behavioral invariants, is as follows:

  • Lateral motion. Associated property: surface texture. Behavior: the skin is passed laterally across a surface, producing shear force.
  • Pressure. Associated property: compliance or hardness. Behavior: force is exerted on the object against a resisting force, for example by pressing into the surface, bending the object, or twisting it.
  • Static contact. Associated property: apparent temperature. Behavior: the skin surface is held in contact with the object surface, without motion; typically a large surface (such as the whole hand) is applied. This EP gives rise to heat flow between the skin and the object.
  • Unsupported holding. Associated property: weight. Behavior: the object is held while the hand is not externally supported; typically this EP involves lifting, hefting, or wielding the object.
  • Enclosure. Associated properties: volume and global shape. Behavior: the fingers (or other exploring effector) are molded closely to the object surface.
  • Contour following. Associated property: exact shape. Behavior: skin contact follows the gradient of the object's surface, or is maintained along edges when they are present.

People can acquire accurate information about the touched environment not only through directed exploration, but also through dynamic touch in the act of manipulating tools and other objects. Turvey and colleagues focused on the information that can be obtained from grasping and wielding, e.g., raising, lowering, pushing, turning, or transporting an object (Turvey & Carello, 1995). The information obtained from wielding includes the object's length, weight, and width, the shape of its tip, and the orientation of the hands relative to the object. It is obtained through the sensitivity of body tissues to the object's rotational dynamics under rotational forces (torques) and motions.
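
As a rough illustration of why wielding is informative, consider the rotational inertia that resists turning a hand-held rod. The sketch below is a simple physics example, not a model from the dynamic-touch literature; the function name and the rod masses and lengths are assumed for illustration, and the formula is the standard moment of inertia of a uniform rod rotated about one end.

```python
# Illustrative sketch (assumed values): rotational inertia of a uniform
# rod wielded about one end, I = (1/3) * m * L**2. Dynamic-touch research
# relates perceived object properties to such rotational dynamics.

def rod_moment_of_inertia(mass_kg: float, length_m: float) -> float:
    """Moment of inertia (kg*m^2) of a uniform rod rotated about one end."""
    return mass_kg * length_m ** 2 / 3.0

# Hypothetical rods: doubling length (at fixed mass) quadruples the
# rotational inertia felt at the wrist; doubling mass merely doubles it.
for mass, length in [(0.1, 0.3), (0.1, 0.6), (0.2, 0.6)]:
    inertia = rod_moment_of_inertia(mass, length)
    print(f"mass={mass:.2f} kg, length={length:.2f} m -> I={inertia:.4f} kg*m^2")
```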

Optimization of Exploratory Procedures

For purposes of haptic perception, people tend to explore objects using EPs that optimize information apprehension (Klatzky & Lederman, 1987). An EP that is optimal for one property may also deliver information about another; for example, contour following on a surface will be informative about its texture, since the contour following EP, like lateral motion, produces shear forces. Conversely, although a specialized EP maximizes information intake about an associated object property, it has the further consequence of limiting access to some other properties. For example, use of the static hand to perceive temperature is incompatible with executing lateral motion to perceive texture. Some EPs can be executed together, allowing simultaneous access to multiple object properties (Klatzky, Lederman, & Reed, 1987). For example, people tend to exert pressure while they are rubbing an object to obtain simultaneous compliance and texture information. Also, when people grasp and lift an object, they obtain information about its shape, volume, and weight.
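
One way to picture the pairing of properties with optimal EPs, and the limited possibilities for executing EPs together, is as a small lookup structure. The sketch below is purely illustrative: the property-to-EP pairings follow the list above, but the names (OPTIMAL_EP, COMPATIBLE, can_combine) and the simple pairwise compatibility rule are assumptions, not a model from the cited studies.

```python
# Hypothetical sketch: which EP is optimal for each property (from the
# list above), plus a simplified notion of which EPs can run together.
# The compatibility sets and the pairwise rule are illustrative assumptions.

OPTIMAL_EP = {
    "texture": "lateral motion",
    "hardness": "pressure",
    "temperature": "static contact",
    "weight": "unsupported holding",
    "global shape": "enclosure",
    "exact shape": "contour following",
}

# EP pairs assumed to be executable at the same time.
COMPATIBLE = {
    frozenset({"lateral motion", "pressure"}),        # rub while pressing
    frozenset({"enclosure", "unsupported holding"}),  # grasp and lift
}

def can_combine(ep_a: str, ep_b: str) -> bool:
    """True if the two EPs are assumed to be executable simultaneously."""
    return ep_a == ep_b or frozenset({ep_a, ep_b}) in COMPATIBLE

# Texture and hardness can be explored together, but static contact
# (temperature) rules out lateral motion (texture).
print(can_combine(OPTIMAL_EP["texture"], OPTIMAL_EP["hardness"]))     # True
print(can_combine(OPTIMAL_EP["temperature"], OPTIMAL_EP["texture"]))  # False
```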

Fine-grained analysis of exploration has found that the parameterization of EPs is optimized to the local context (Riley et al., 2002; Smith et al., 2002). When judging roughness, for example, people vary contact force more with smooth than with rough exemplars, and they scan more rapidly when discriminating surfaces than when identifying them (Tanaka, Bergmann Tiest, Kappers, & Sano, 2014). When exploring to determine compliance, people use greater force when expecting a rigid object than a compliant one. The optimality of this approach is indicated by the finding that when fine compliance discriminations are called for, enforcing unnaturally low forces impairs performance (Kaim & Drewing, 2011).

The presence of vision changes the nature of optimal haptic exploration. When objects can be viewed as well as touched, specialized EPs tend to be executed only when people wish to perceive material properties (e.g., compliance, texture), and then only when relatively precise discrimination is demanded (Klatzky, Lederman & Matula, 1993). For example, the rough texture of coarse sandpaper is salient to vision, and hence the object is unlikely to elicit haptic exploration. In contrast, a person attempting to determine whether a surface is free from grit (a fine discrimination) will be likely to explore using a specialized EP, lateral motion.

Haptic Exploration in Nonhumans and Young Humans

A variety of species exhibit specialized patterns of exploration similar to those of humans, including squirrel and capuchin monkeys (Hille, Becker-Carus, Dücker, & Dehnhardt, 2001; Lacreuse & Fragaszy, 1997). The whisker sweeps of pinnipeds such as sea lions during shape discrimination may serve a function similar to contour following by humans (Dehnhardt & Dücker, 1996).

In humans, systematic exploration similar to that of adults emerges in young children and follows a developmental progression. Early grasping and fingering may be precursors of the EPs of enclosure and lateral motion (Ruff, 1989). Which of these EPs occurs depends on which properties are most salient; for example, textured objects promote fingering. Infants' tendency to explore with the mouth may reflect early motor control of the oral musculature as well as the density of its sensory input (Rochat, 1989).

Preschool-aged children execute adult-like EPs when given similar perceptual goals (Kalagher & Jones, 2011). Preschoolers spontaneously show dedicated exploratory patterns not only when a target dimension such as hardness is explicitly mentioned, but also when its relevance arises implicitly in tool use. For example, they use pressure to test a stirring stick to ensure that it is sufficiently rigid for the substance that must be stirred (Klatzky, Lederman & Mankinen, 2005). Appropriate exploration is also found when blind children match objects on designated dimensions (Withagen, Kappers, Vervloed, Knoors, & Verhoeven, 2013).

Haptic Exploration from a Neurophysiological Perspective

To understand why there is specialization of exploration during haptic perception, one must consider two perspectives: the physical interaction between the perceiver and the object, and the neural consequences of that interaction. In general, EPs put into place physical interactions that then optimize the signal to sensory receptors and higher-order neural computations. For example, the EP called Static Contact, whereby a large skin surface is held without motion against a surface in order to perceive its temperature, provides an opportunity for heat flow between the skin and surface. The resulting change of temperature at the skin surface is sensed by neurons within the skin that specialize in the detection of coolness and warmth, and initiate signals of thermal change to the brain (Jones & Ho, 2008). In contrast, dynamic contact, such as grasping and releasing a warming or cooling surface, actually diminishes the thermal sensation relative to the static EP (Green, 2009).
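
The role of heat flow in static contact can be made concrete with a textbook approximation: when two bodies touch, the interface temperature is roughly an average of their temperatures weighted by their thermal effusivities, which is why a metal object at room temperature feels cooler than a wooden one. The sketch below is illustrative only; the function names and the material values are assumed, and the formula is the standard semi-infinite-body approximation rather than anything from the cited work.

```python
import math

# Illustrative sketch: contact temperature between skin and an object,
# approximated for two semi-infinite bodies as an effusivity-weighted mean,
# T_c = (e_skin*T_skin + e_obj*T_obj) / (e_skin + e_obj),
# where effusivity e = sqrt(conductivity * density * specific_heat).
# Material values below are rough, assumed figures for illustration.

def effusivity(conductivity, density, specific_heat):
    """Thermal effusivity in W*s^0.5/(m^2*K)."""
    return math.sqrt(conductivity * density * specific_heat)

def contact_temperature(t_skin, e_skin, t_obj, e_obj):
    """Approximate interface temperature on first contact (degrees C)."""
    return (e_skin * t_skin + e_obj * t_obj) / (e_skin + e_obj)

e_skin = effusivity(0.37, 1000, 3500)   # skin (approximate values)
e_wood = effusivity(0.15, 500, 1700)    # wood
e_steel = effusivity(50.0, 7800, 500)   # steel

# Skin at 33 C touching 22 C objects: steel pulls the interface much
# closer to room temperature than wood does, so it feels cooler.
for name, e_obj in [("wood", e_wood), ("steel", e_steel)]:
    t_c = contact_temperature(33.0, e_skin, 22.0, e_obj)
    print(f"{name}: contact temperature ~ {t_c:.1f} C")
```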

As was noted above, fine-grained parameters of exploration are tuned to the perceptual and task environment. Such tuning appears to be the result of pre-cortical as well as cortical neural interactions. When Weiss and Flanders (2011) asked participants to follow, with their fingertip, the contour of a spherical surface rendered in a virtual environment, an unexpected change in surface curvature led to a compensatory adjustment in contact force with a latency associated with spinal control (approximately 50 ms). Other control mechanisms appeared to be regulated at cortical levels.

Specialized exploration has implications for the brain's ability to recognize objects by touch. For tactile object recognition, sensory input must be transformed through a series of processing stages before the object can be identified. The first stage involves the extraction and processing of the properties and features of the felt object; for example, to recognize a quarter one takes in its small size, cool temperature, round shape, and rough edges. The perceptual information provided by EPs is then combined, or integrated, into a common modality-specific object description, which is used to access information about object identity and function.

When people are allowed to use either specialized EPs or a more general grasp to recognize real, multidimensional objects, brain regions specific to touch are activated in addition to brain regions activated during visual object recognition. These touch-specific regions include primary somatosensory cortex (SI), secondary somatosensory cortex (SII), the parietal operculum, and the insula. Some of the activated regions are shared with visual object recognition, such as the lateral occipitotemporal cortex, medial temporal lobe, and prefrontal areas, all of which support cross-modal information integration leading to object recognition (Reed, Shoham, & Halgren, 2004). However, when the touched stimuli are primarily two-dimensional spatial patterns that do not permit EPs, a greater reliance on visual cortical regions (e.g., V1, V2) is observed (Zangaladze, Epstein, Grafton, & Sathian, 1999).
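
The staged account sketched above (extract properties, integrate them into an object description, then access identity) can be caricatured with a toy matching example. Everything in the sketch below is invented for illustration: the PROTOTYPES table, the recognize function, and the count-of-matching-features rule are assumptions, not a model of the neural processing described in the cited studies; only the quarter's features come from the text.

```python
# Toy illustration of the staged account: properties extracted by EPs are
# combined into a description, which is matched against stored knowledge
# to retrieve an identity. Prototypes and the matching rule are invented.

PROTOTYPES = {
    "quarter": {"size": "small", "temperature": "cool",
                "shape": "round", "edge": "rough"},
    "eraser":  {"size": "small", "temperature": "neutral",
                "shape": "rectangular", "edge": "smooth"},
}

def recognize(description: dict) -> str:
    """Return the stored prototype whose features best match the description."""
    def score(proto: dict) -> int:
        return sum(description.get(key) == value for key, value in proto.items())
    return max(PROTOTYPES, key=lambda name: score(PROTOTYPES[name]))

# Description assembled from EPs: enclosure (size, shape), static contact
# (temperature), contour following / lateral motion (edge detail).
felt = {"size": "small", "temperature": "cool", "shape": "round", "edge": "rough"}
print(recognize(felt))  # -> quarter
```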

To what extent are EPs necessary for object recognition? Patients with brain damage provide some insight into the relative contributions of sensory and motor inputs. Patients with damage to the hands or peripheral nervous system, as well as patients with lesions in SI, do not always show tactile object recognition deficits (Valenza et al., 2001). Likewise, patients with hand paralysis do not necessarily have significant deficits in tactile object recognition (Caselli, 1991). Because the hand normally combines tactile inputs with hand movements, usually in the form of EPs, to extract somatosensory information, these patients may be using the motions of object parts or hand-movement cues to extract relevant object information that they cannot obtain through purely tactile perception. Finally, patients with "tactile agnosia" have a deficit in recognizing common objects by touch following brain damage. Although rare, these patients tend to have lesions in the left inferior parietal lobe (e.g., Reed, Caselli, & Farah, 1996). Their disorder tends to be at a higher cognitive level, because their EPs are normal and they tend to have relatively intact tactile sensation, memory, spatial processing, and general intellectual function. Nonetheless, tactile object recognition deficits are observed in patients with right-hemisphere parietal lobe damage who demonstrate disorganized or random EPs. In sum, the inability to execute EPs contributes to tactile agnosia and makes object recognition more difficult, but the recognition process can be aided by prior knowledge of object parts and functions, which can compensate for sensory and motoric limitations.

Haptic Exploration and Tool Use

Finally, people explore objects not only with their hands or other parts of the body, but also with tools. Although tools limit what can be perceived, they can nevertheless provide at least coarse information about some object properties. For example, when a rigid tool is used to explore a rigid object, the resulting vibrations can enable people to perceive the object's surface texture (Katz, 1925). As any dentist knows, the deformation and resistance of an object under the force of a probing tool can allow the perception of compliance. The same EP that is executed with the bare hand to extract an object property may not be observed when a tool is used, given the changes in the physical interaction that arise from tool use.
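
The vibratory cue that carries texture through a rigid tool can be captured with a simple relation: a probe dragged at speed v over a surface whose elements repeat with spatial period lambda vibrates at roughly f = v / lambda. The sketch below uses assumed scanning speeds and texture periods for illustration; it is not a model from Katz (1925).

```python
# Illustrative sketch: temporal vibration frequency produced when a rigid
# probe is dragged over a periodic texture, f = speed / spatial_period.
# The scanning speeds and texture periods below are assumed values.

def vibration_frequency(scan_speed_m_s: float, spatial_period_m: float) -> float:
    """Approximate vibration frequency (Hz) transmitted through the tool."""
    return scan_speed_m_s / spatial_period_m

for speed in (0.05, 0.20):          # slow vs. fast scan (m/s)
    for period in (0.0005, 0.002):  # fine vs. coarse texture (m)
        f = vibration_frequency(speed, period)
        print(f"speed={speed} m/s, period={period * 1000:.1f} mm -> {f:.0f} Hz")
```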

Conclusion

In conclusion, haptic exploration involves exploratory procedures: active touch patterns that are specific to the demands of the task in that they optimize the extraction of the information the perceiver needs. Purposive haptic exploration allows people and animals to extract specific types of tactile information from the environment and provides information about the material and substance properties of objects that cannot be obtained by vision alone. Further, exploratory procedures extend the basic sensory functions of animate bodies by allowing them to perceive the world through tools.

References

  • Dehnhardt, G and Dücker, G (1996). Tactual discrimination of size and shape by a California sea lion (Zalophus californianus). Animal Learning & Behavior 24(4): 366-374. doi:10.3758/bf03199008.
  • Green, B G (2009). Temperature perception on the hand during static versus dynamic contact with a surface. Attention, Perception, & Psychophysics 71(5): 1185–1196. doi:10.3758/app.71.5.1185.
  • Hille, P; Becker-Carus, C; Dücker, G and Dehnhardt, G (2001). Haptic discrimination of size and texture in squirrel monkeys (Saimiri sciureus). Somatosensory & Motor Research 18(1): 50-61. doi:10.1080/08990220020021348.
  • Jones, L A and Ho, H (2008). Warm or cool, large or small? The challenge of thermal displays. IEEE Transactions on Haptics 1(1): 53-70. doi:10.1109/toh.2008.2.
  • Kaim, L and Drewing, K (2011). Exploratory strategies in haptic softness discrimination are tuned to achieve high levels of task performance. IEEE Transactions on Haptics 4(4): 242-252. doi:10.1109/toh.2011.19.
  • Kalagher, H and Jones, S S (2011). Young children's haptic exploratory procedures. Journal of Experimental Child Psychology 110(4): 592-602. doi:10.1016/j.jecp.2011.06.007.
  • Katz, D (1925). Der Aufbau der Tastwelt [The world of touch] (L E Krueger, Trans. & Ed.). Mahwah, NJ: Lawrence Erlbaum Associates. ISBN: 9780805805291.
  • Klatzky, R L; Lederman, S J and Mankinen, J M (2005). Visual and haptic exploratory procedures in children's judgments about tool function. Infant Behavior and Development 28(3): 240-249. doi:10.1016/j.infbeh.2005.05.002.
  • Klatzky, R L; Lederman, S J and Matula, D E (1993). Haptic exploration in the presence of vision. Journal of Experimental Psychology: Human Perception and Performance 19(4): 726-743. doi:10.1037//0096-1523.19.4.726.
  • Klatzky, R L; Lederman, S J and Reed, C L (1987). There's more to touch than meets the eye: The salience of object attributes for haptics with and without vision. Journal of Experimental Psychology: General 116: 356-369.
  • Lacreuse, A and Fragaszy, D M (1997). Manual exploratory procedures and asymmetries for a haptic search task: A comparison between capuchins (Cebus apella) and humans. Laterality 2(3): 247-266. doi:10.1080/135765097397477.
  • Reed, C L; Caselli, R J and Farah, M J (1996). Tactile agnosia: Underlying impairment and implications for normal tactile object recognition. Brain 119(3): 875-888. doi:10.1093/brain/119.3.875.
  • Reed, C L; Shoham, S and Halgren, E (2004). Neural substrates of tactile object recognition: An fMRI study. Human Brain Mapping 21(4): 236-246. doi:10.1002/hbm.10162.
  • Riley, M A; Wagman, J B; Santana, M-V; Carello, C and Turvey, M T (2002). Perceptual behavior: Recurrence analysis of a haptic exploratory procedure. Perception 31(4): 481-510. doi:10.1068/p3176.
  • Ruff, H A (1989). The infant's use of visual and haptic information in the perception and recognition of objects. Canadian Journal of Psychology/Revue Canadienne De Psychologie 43(2): 302-319. doi:10.1037/h0084222.
  • Smith, A; Gosselin, G and Houde, B (2002). Deployment of fingertip forces in tactile exploration. Experimental Brain Research 147(2): 209–218. doi:10.1007/s00221-002-1240-4.
  • Tanaka, Y; Bergmann Tiest, W M; Kappers, A M L and Sano, A (2014). Contact force and scanning velocity during active roughness perception. PLoS ONE 9(3): e93363. doi:10.1371/journal.pone.0093363.
  • Turvey, M T and Carello, C (1995). Dynamic touch. In: W Epstein and S Rogers (Eds.), Handbook of perception and cognition, Vol. 5, Perception of space and motion (pp. 401-490). San Diego: Academic Press.
  • Valenza, N et al. (2001). Dissociated active and passive tactile shape recognition: A case study of pure tactile apraxia. Brain 124(11): 2287-2298. doi:10.1093/brain/124.11.2287.
  • Weiss, E J and Flanders, M (2011). Somatosensory comparison during haptic tracing. Cerebral Cortex 21(2): 425–434. doi:10.1093/cercor/bhq110.
  • Withagen, A; Kappers, A M L; Vervloed, M P J; Knoors, H and Verhoeven, L (2013). The use of exploratory procedures by blind and sighted adults and children. Attention, Perception, & Psychophysics 75(7): 1451-1464. doi:10.3758/s13414-013-0479-0.
  • Zangaladze, A; Epstein, C M; Grafton, S T and Sathian, K (1999). Involvement of visual cortex in tactile discrimination of orientation. Nature 401(6753): 587-590. doi:10.1038/44139.


Licensed under CC BY-SA 3.0 | Source: http://www.scholarpedia.org/article/Haptic_exploration