An important principle of sensory information processing is illustrated by touch receptors. The brain analyses sensory information by deconstructing the stimulus into component parts, sensing local features such as surface curvature or edges. For example, each of the dots that make up a letter of the Braille alphabet is read by a different set of touch receptors in the fingertip. The shape of the entire letter is thus distributed across the population of receptors: primary afferent fibers touching a dot transmit bursts of impulses in parallel, while fibers not contacting a dot remain silent. Central processing networks in the brain bring these signals together to reconstruct the complete pattern of dots and perceive it as a single character. This mechanism requires an orderly arrangement of somatosensory neurons such that neighborhood relations on the body are preserved in the brain. In addition, since sensory information is transmitted to the brain as sequences of nerve impulses, the different submodalities and receptor types must remain segregated so that responses to pressure are not confounded with those signaling motion or temperature.
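The parallel, distributed encoding described above can be sketched as a toy population code (illustrative only; the burst length and the decoding step are assumptions, not physiological data):

```python
# Toy model of Braille reading as a population code (illustrative only).
# Each of the six dot positions in a Braille cell is read by its own set
# of receptors: fibers over a raised dot fire a burst of impulses in
# parallel, while fibers over flat skin remain silent.

BRAILLE = {  # dots 1-6 of a Braille cell: left column, then right column
    "a": (1, 0, 0, 0, 0, 0),
    "b": (1, 1, 0, 0, 0, 0),
    "c": (1, 0, 0, 1, 0, 0),
}

def receptor_bursts(letter, burst_len=5):
    """Peripheral stage: one spike train per receptor, in parallel."""
    return [[1] * burst_len if dot else [0] * burst_len
            for dot in BRAILLE[letter]]

def central_decode(trains):
    """Central stage: reassemble the dot pattern and match it to a letter."""
    pattern = tuple(int(any(t)) for t in trains)
    return next((k for k, v in BRAILLE.items() if v == pattern), None)

print(central_decode(receptor_bursts("b")))  # b
```

No single fiber carries the letter; the identity of the character exists only across the whole set of spike trains, which is the point the paragraph above makes.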
The sensory information detected by touch receptors is conveyed to the central nervous system along the peripheral nerves together with nerve fibers subserving other somatosensory modalities in the same body segment such as pain, temperature, and proprioception. Touch fibers enter the spinal cord through the dorsal roots and ascend through the dorsal columns together with fibers for proprioception to the medulla where they terminate in the dorsal column nuclei (the cuneate and gracile nuclei). The second-order neurons in the dorsal column nuclei send their axons across the midline in the medulla where they ascend in the medial lemniscus to the ventral posterior lateral (VPL) and medial (VPM) nuclei of the thalamus.
The ascending somatosensory pathways and processing centers are organized along two orthogonal axes: one topographic and the other functional. One axis displays the topography of the body in what are called somatotopic maps. The somatotopic map of the body is preserved in all the somatosensory areas of the brain, although the details of body orientation and receptive field topography differ in each representation. The other axis segregates the various somatosensory modalities into functional groups of neurons: inputs from touch and proprioceptive receptors are grouped into distinct anatomical fascicles and columns of cells. Eventually the modalities converge onto common neurons. This convergence occurs at the highest levels of cortical processing involved in cognition and motor planning, at spinal interneurons involved in reflex pathways, and at the motor neurons whose firing patterns govern all behavior. The somatosensory nuclei of the brainstem and thalamus use convergence of sensory afferents to bring together sensory information from neighboring skin regions. These inputs mutually reinforce each other, providing the first step in object representation. For example, inputs from a group of receptors aligned along a stimulated edge are activated simultaneously and enhanced by convergence, whereas receptors aligned across the edge are less effective because only a few of them are activated.
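A minimal sketch of this convergence, assuming a single relay neuron that simply sums a row of neighboring receptors (the layout and unit weights are assumptions for illustration):

```python
# Minimal sketch of afferent convergence (assumed layout, illustrative).
# A relay neuron in a dorsal column nucleus sums inputs from a row of
# neighboring receptors. An edge lying along the row co-activates all of
# them; an edge lying across the row activates only the crossing point.

def relay_response(receptor_activity):
    """Convergent summation of afferent input onto one relay neuron."""
    return sum(receptor_activity)

along_edge = [1, 1, 1, 1, 1]   # edge aligned with the receptor row
across_edge = [0, 0, 1, 0, 0]  # edge crossing the row at one point

print(relay_response(along_edge))   # 5: inputs mutually reinforce
print(relay_response(across_edge))  # 1: only a few receptors active
```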
In addition, inhibitory interneurons in these nuclei suppress weakly stimulated neurons, sharpening the outputs from the most active groups of mechanoreceptors so that the strongest signals are relayed forward. The inhibitory networks also filter the noise generated by random neural activity. Thus the signal transmitted to the cerebral cortex preserves the accurate spatial and intensity information encoded by mechanoreceptors while integrating these signals to enhance feature recognition. Finally, higher centers in the brain, such as the cerebral cortex, use the inhibitory networks in the brainstem and thalamic nuclei to modulate the sensory information transmitted from the skin. These descending inhibitory connections convey contextual information about the immediate behavioral significance of input from touch receptors, allowing that input to be enhanced or suppressed as needed.
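A common way to model this sharpening is subtractive lateral inhibition; the sketch below is a toy version (the inhibition strength k and the input values are assumptions):

```python
# Toy lateral-inhibition model (illustrative; k and inputs are assumed).
# Each relay neuron's output is its excitatory drive minus a fraction of
# its neighbors' activity, so weakly driven neurons are suppressed and
# the strongest signal is relayed forward with noise filtered out.

def lateral_inhibition(drive, k=0.5):
    """Subtract k times the mean neighbor activity; rectify at zero."""
    out = []
    for i, d in enumerate(drive):
        neighbors = drive[max(0, i - 1):i] + drive[i + 1:i + 2]
        inhibition = k * sum(neighbors) / (len(neighbors) or 1)
        out.append(max(0.0, d - inhibition))
    return out

# A strong central signal flanked by weak random activity: the flanks are
# suppressed to zero while the peak survives almost unchanged.
print(lateral_inhibition([0.1, 0.2, 1.0, 0.2, 0.1]))
```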
Primary somatosensory cortex
Tactile information reaches the conscious mind when it enters the cerebral cortex. Thalamic information is conveyed initially to the primary somatosensory cortex (S-I) located in the postcentral gyrus of the parietal lobe. S-I cortex spans four cytoarchitectural areas that are arrayed as parallel stripes along the rostral-caudal axis of the parietal lobe (Figure). The four areas of the S-I cortex are extensively interconnected, such that both serial and parallel processing networks are engaged in the higher-order elaboration of information from the sense of touch.
The four areas differ in anatomical connectivity and function. Thalamic fibers from VPL and VPM terminate in areas 3a and 3b, and the cells in these areas in turn project their axons to areas 1 and 2. Areas 3b and 1 receive information from receptors in the skin, whereas areas 3a and 2 receive proprioceptive information from receptors in muscles, joints, and the skin. This information is conveyed in parallel from the four areas of the S-I cortex to higher centers in the cortex, including the second somatosensory (S-II) cortex, the posterior parietal cortex, and the primary motor (M-I) cortex.
Each cortical neuron receives inputs arising from receptors in a specific area of the skin, and these inputs together are its receptive field. We perceive that a particular location on the skin is touched because a specific population of neurons in the brain is activated. Conversely, when a point in the cortex is stimulated electrically we experience tactile sensations on a specific part of the skin.
The receptive fields of cortical neurons are much larger than the receptive fields of touch fibers in peripheral nerves. For example, the receptive fields of SA1 and RA1 fibers innervating the fingertip are tiny spots on the skin, whereas those of the cortical neurons receiving these inputs are large areas covering the entire fingertip. The receptive field of a neuron in area 3b represents a composite of inputs from 300 to 400 touch fibers innervating neighboring areas of the skin on the opposite (contralateral) side of the body. An individual neuron in area 3b resolves fine details of spatial patterns, such as an array of Braille dots, by faithfully reproducing the activity of the receptors that provide the strongest input. As in the periphery, complex spatial patterns are encoded in area 3b by bursts and silences distributed across a population of topographically arranged neurons.
Neurons at the next stage of cortical processing, in areas 1 and 2, integrate information from large groups of receptors. Receptive fields in these areas are larger than in area 3b, spanning functional regions of skin that are activated simultaneously during motor activity. These include the tips of several adjacent fingers, or both the fingers and the palm. Their responses are less tightly linked to the actual location of stimuli on the skin. Instead, specific combinations of sensory inputs are required for optimum activation of these cells. Their firing patterns are tuned to features such as the orientation of edges, the spacing of repeated patterns in gratings or Braille dot arrays, the surface curvature, the direction of motion across the skin, or the integrated posture of the hand and arm (Figure).
These neurons signal properties common to a variety of shapes, such as vertical or horizontal edges, rather than their exact location on the body. Feature detection is a property of cortical processing common to many sensory systems, including touch. The higher cortical areas assemble the components detected by the receptors into a coherent representation of the entire object by requiring specific spatiotemporal conjunctions of sensory inputs. Convergent excitatory connections between neurons representing neighboring skin areas, together with intracortical inhibitory circuits, enable higher-order cortical cells to integrate global features of objects and so detect their size, shape, weight, and texture. Although most neurons in areas 3b and 1 respond only to touch, and neurons in area 3a respond to muscle stretch, many neurons in area 2 receive both inputs. This convergence of modalities allows neurons in area 2 to integrate the hand posture used to grasp an object, the grip force applied by the hand, and the tactile stimulation produced by the object, enabling us to recognize it.
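One way to picture such a feature detector is as a unit requiring a specific conjunction of convergent inputs; the 3x3 receptor patch and the threshold below are assumptions for illustration:

```python
# Toy feature detector (illustrative; patch size and threshold assumed).
# A higher-order unit fires when receptors along any vertical line of a
# 3x3 skin patch are co-activated, signaling "vertical edge" regardless
# of exactly where on the patch the edge falls.

def vertical_edge_detector(patch, threshold=3):
    """Fire if any column of the receptor patch is fully active."""
    return any(sum(col) >= threshold for col in zip(*patch))

vertical = [[0, 1, 0],
            [0, 1, 0],
            [0, 1, 0]]
horizontal = [[0, 0, 0],
              [1, 1, 1],
              [0, 0, 0]]

print(vertical_edge_detector(vertical))    # True: preferred orientation
print(vertical_edge_detector(horizontal))  # False: wrong orientation
```

The unit responds to any vertical edge on its patch but to no horizontal one, capturing the idea that these cells encode a property of a class of stimuli rather than an exact skin location.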
In this manner, the somatosensory areas of the brain represent properties common to particular classes of objects. However, it would be a mistake to assume that each object that is handled becomes imprinted on a single neuron at the apex of cortical processing. Although the mechanisms underlying the binding of features that gives rise to a unified percept are not fully understood, it is believed that temporal synchrony between different cortical areas plays an important role in this process. This mechanism permits the integration of the detailed representation of spatial properties at the early stages with the more abstract representations further along the anatomical network.
Higher-order somatosensory areas of the cerebral cortex
Neuronal responses to touch in the S-I cortex depend almost exclusively on input from within the neuron’s receptive field. This feed-forward pathway is often described as a ‘bottom-up’ process because the receptors in the hand are the principal source of excitation of S-I neurons. Higher-order somatosensory areas of the parietal lobe not only receive information from peripheral receptors but are also strongly influenced by ‘top-down’ processes, such as behavioral goals, attentional modulation, and working memory. Data obtained from single-neuron studies in monkeys, from neuroimaging studies in humans, and from clinical observations of patients with lesions in higher-order somatosensory areas of the brain suggest that the ventral and dorsal regions of the parietal lobe serve complementary functions in the sense of touch, similar to the ‘what’ and ‘where’ pathways of the visual system. The ventral pathway originates in the second somatosensory cortex (S-II cortex), located in the parietal operculum on the upper bank of the lateral fissure. It plays an important sensory role in tactile object recognition, and selective attention increases neuronal responses to specific shapes. Although neurons in S-II respond to textures such as Braille dots, embossed letters, or periodic gratings, they do not replicate the spatial or temporal patterns of these stimuli in their spike trains. Instead, they fire at different rates for each pattern.
Moreover, the context in which tactile stimuli are presented influences the responses of neurons in the S-II cortex. Firing patterns of these neurons are modified by the behavioral relevance of the tactile information, or memories of the preceding stimuli, suggesting that the S-II cortex may be a decision point for tactile memory formation. This is consistent with its anatomical connections to the insular cortex, which in turn innervates regions of the temporal lobe that are important for tactile memory. This somatosensory pathway for tactual form has a parallel function to the visual pathway for form recognition through the inferotemporal cortex.
The dorsal pathway in the parietal lobe plays a sensorimotor role in the guidance of movement. The sense of touch is extremely important for skilled use of the hand. When tactile sensations are lost, due to nerve injury or to local anesthesia, hand movements are clumsy, poorly coordinated, and utilize abnormally high forces when grasping objects. Without touch one is completely reliant on vision for directing the hand. Tactile information from the skin is transmitted to the motor areas of the frontal lobe through direct pathways from S-I to the motor cortex. Touch is also communicated to the frontal lobe through a higher-order pathway that involves somatosensory connections to regions of the posterior parietal cortex surrounding the intraparietal sulcus: areas 5 and 7 in monkeys and the superior (SPL, Brodmann areas 5 and 7) and inferior parietal lobules (IPL, areas 39 and 40) in humans.
Tactile information from the skin is integrated in area 5 with postural inputs from the underlying muscles and joints to define the position and action of the hand. Neurons in the SPL respond vigorously when a monkey reaches out and shapes its hand in anticipation of grasping an object. These responses peak when the object is acquired in the hand, thereby integrating tactile and postural information from the hand.
IPL neurons integrate tactile and visual stimuli conjoining the feel of objects with their appearance and location in space. Their firing patterns are correlated with the hand posture used to grasp an object rather than its geometric shape. The multimodal information encoded in the posterior parietal cortex is transmitted to the premotor areas of the frontal lobe that formulate complex movement sequences such as specific grasp styles. These networks thus provide feedback from the senses of touch, proprioception, and vision that can modify the behaviors used to handle objects.
Nervous control of movement
Predicting the sensory consequences of hand actions is an important component of active touch. For example, when we view an object and reach for it, we predict how heavy it should be and how it should feel in the hand; we use such predictions to initiate grasping. During active touch, the motor system may control the afferent flow of somatosensory information so that subjects can predict when tactile information should arrive in the S-I cortex and be perceived in the conscious mind. The convergence of central and peripheral signals allows neurons to compare prediction and reality. Corollary discharge from the motor areas of the cortex to somatosensory regions may play a key role in active touch. It provides a neural signal of intended actions to posterior parietal areas, allowing these neurons to compare predicted and actual neural responses to tactile stimuli. Such mechanisms may explain why it is so difficult to tickle oneself.
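A standard way to formalize this comparison is a forward model that subtracts the predicted sensation from the actual one; the sketch below is a toy version (the signal values and the assumption of an accurate prediction are illustrative):

```python
# Toy forward-model sketch (values and assumptions are illustrative).
# A corollary discharge of the motor command supplies a prediction of the
# tactile input a self-generated movement will cause; what is perceived
# strongly is the residual: actual input minus predicted input.

def perceived_intensity(actual_touch, predicted_touch):
    """Prediction error: unpredicted sensation, rectified at zero."""
    return max(0.0, actual_touch - predicted_touch)

# External touch: no motor command, so nothing is predicted away.
print(perceived_intensity(1.0, 0.0))   # 1.0: felt at full strength
# Self-produced touch: the corollary discharge predicts most of it,
# which is one account of why it is hard to tickle oneself.
print(perceived_intensity(1.0, 0.75))  # 0.25: strongly attenuated
```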