Perception & Action Lab

Sensorimotor control

While understanding the processes that lead to our conscious perception of the world is important and interesting, perhaps more fundamental to human behavior in the natural world are the processes by which the brain uses visual information to guide motor behavior. These processes occur "under the radar" of consciousness and in some cases appear to be distinct from those that determine our conscious percepts – neuropsychological dissociations between vision for perception and vision for action abound in the literature.

We use naturalistic visuomotor tasks in virtual reality systems to study how humans use visual information to guide motor behavior. Rather than asking subjects what they see, we track their hand movements during visually guided tasks to make inferences about how the brain is using visual information to guide the hand. We have shown, for example, that humans automatically and continuously use visual feedback to guide reaching movements without any conscious awareness of doing so: subjects make automatic online corrections to fast hand movements in response to artificially induced visual error signals even when they are unaware of those signals.

We work to understand the computations underlying this behavior by comparing human performance to that of optimal control models, which use visuomotor strategies optimized to perform efficiently under the constraints of image and motor noise.
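As a rough illustration of this modeling style (a minimal sketch, not the lab's actual models), the snippet below simulates a 1-D reach with a linear-quadratic-Gaussian (LQG) controller: a finite-horizon feedback law is computed by Riccati recursion, and a Kalman filter estimates the hand state from noisy visual feedback while motor noise perturbs the movement. All dynamics and noise parameters are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D point-mass reach: state = [position, velocity].
dt = 0.01          # time step (s)
T = 60             # number of steps (~0.6 s movement)
A = np.array([[1.0, dt], [0.0, 1.0]])   # point-mass dynamics
B = np.array([[0.0], [dt]])             # control enters as acceleration
C = np.array([[1.0, 0.0]])              # vision observes position only
target = 0.15      # reach target (m)

motor_noise = 1e-3   # assumed motor noise (std, added to velocity)
visual_noise = 5e-3  # assumed visual position noise (std, m)

# Finite-horizon LQR: penalize terminal error heavily, effort lightly.
Qf = np.diag([1e4, 1e2])   # terminal cost on position and velocity error
R = np.array([[1e-2]])     # effort cost
P = Qf.copy()
K = [None] * T
for t in reversed(range(T)):
    K[t] = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = A.T @ P @ (A - B @ K[t])

# Simulate the reach, with a Kalman filter supplying the state estimate.
W = np.diag([0.0, motor_noise**2])   # process (motor) noise covariance
V = np.array([[visual_noise**2]])    # observation (visual) noise covariance
x = np.array([[0.0], [0.0]])         # true state
xhat = x.copy()                      # estimated state
S = np.eye(2) * 1e-6                 # estimate covariance
goal = np.array([[target], [0.0]])

for t in range(T):
    u = -K[t] @ (xhat - goal)                       # feedback control law
    x = A @ x + B @ u + rng.normal(0.0, [0.0, motor_noise]).reshape(2, 1)
    y = C @ x + rng.normal(0.0, visual_noise)       # noisy visual feedback
    # Kalman predict/update on the estimate
    xhat = A @ xhat + B @ u
    S = A @ S @ A.T + W
    G = S @ C.T @ np.linalg.inv(C @ S @ C.T + V)    # Kalman gain
    xhat = xhat + G @ (y - C @ xhat)
    S = (np.eye(2) - G @ C) @ S

print(f"final position: {x[0, 0]:.3f} m (target {target} m)")
```

The key property this sketch shares with optimal-feedback-control accounts of reaching is that corrections arise continuously from the feedback law `-K (xhat - goal)` rather than from a pre-planned trajectory, so the simulated hand absorbs noise perturbations online, much as subjects do.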

We are currently extending this work to visuomotor learning, studying how the brain acquires new visuomotor tasks.