Perception & Action Lab

Sensory cue integration

The brain has access to a multitude of sensory cues about the world, which it uses to make perceptual inferences and to guide motor behavior. Vision alone contains a number of qualitatively different cues that are coded by at least partially distinct neural mechanisms (stereoscopic disparities, motion, shading, texture, contour, shadows, etc.). Other sensory modalities provide independent sources of information as well: touch can convey an object's shape, orientation, and position in space; kinesthetic cues carry information about limb and body movements; and audition provides information about an object's position, motion, and material makeup.

We are studying how the brain integrates different cues, both within the visual modality and between the senses. Most recently we have begun working on how the brain integrates non-visual signals (e.g. auditory, kinesthetic) with visual signals to perceive the world. One of the striking phenomena in this realm is the ability of one sense to change what is perceived through another, as in the famous McGurk effect, in which seeing someone's lip movements can dramatically change what you think you hear. We have shown that felt hand movements can similarly alter your visual percept of a pattern's motion. We use both theoretical and psychophysical studies to unravel the computations that underlie these sensory interactions.
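A standard theoretical starting point in this literature (not necessarily the specific model used in any one of our studies) is maximum-likelihood cue integration: each cue yields a noisy estimate of a world property, and the statistically optimal combined estimate weights each cue by its reliability (inverse variance). The sketch below illustrates the idea with hypothetical numbers for a visual and a haptic slant estimate.

```python
def integrate_cues(estimates, variances):
    """Reliability-weighted average of independent Gaussian cue estimates.

    Each cue i contributes a weight proportional to 1/variance_i, so more
    reliable cues dominate. Returns the combined estimate and its variance,
    which is always lower than that of any single cue.
    """
    reliabilities = [1.0 / v for v in variances]
    total_reliability = sum(reliabilities)
    weights = [r / total_reliability for r in reliabilities]
    combined = sum(w * s for w, s in zip(weights, estimates))
    combined_variance = 1.0 / total_reliability
    return combined, combined_variance

# Hypothetical example: a visual slant cue (30 deg, variance 4) and a
# haptic slant cue (36 deg, variance 12). Vision is three times more
# reliable, so the fused estimate (31.5 deg) lies closer to the visual
# value, and the fused variance (3.0) is lower than either cue's alone.
slant, variance = integrate_cues([30.0, 36.0], [4.0, 12.0])
```

In this framework, cross-modal interactions like the McGurk effect arise naturally: when one cue is much more reliable than another, it can dominate the combined percept.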