Multisensory processing in autism

  • Awarded: 2012
  • Award Type: Explorer
  • Award #: 247992

In daily life, people often experience activation of multiple sensory systems at the same time. For example, speech perception and social behavior rely on an interaction between the auditory and visual systems (e.g., listening to a person’s voice while watching their lips move). Similarly, as people interact with the environment, signals from the visual and balance (vestibular) systems must work together. This process is known as ‘multisensory integration,’ and when it functions poorly, as in autism, daily life becomes challenging.

There is abundant evidence pointing to deficits in sensory and multisensory processing in autism. However, the nature of these multisensory deficits has not been thoroughly tested. Dora Angelaki and her colleagues at Baylor College of Medicine in Houston, Texas, aim to characterize these deficits directly. The researchers’ experimental approach, previously used with healthy people and with non-human primates, has proven to be an invaluable tool for understanding multisensory integration. Experiments are performed in a multisensory motion system, in which participants experience real as well as virtual motion. While in this motion system, most people integrate their senses according to a particular strategy that leads to reliable perception. Angelaki and her team plan to characterize how individuals with autism integrate (or fail to integrate) their senses while in the motion system, and then compare these findings with those from healthy controls.
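The integration strategy described above is commonly modeled as reliability-weighted (maximum-likelihood) cue combination, in which each sense contributes in proportion to its reliability (the inverse of its variance). The sketch below illustrates that standard model; the function name and the example numbers are illustrative assumptions, not values from Angelaki’s study.

```python
import math

def integrate_cues(visual_est, visual_sigma, vestibular_est, vestibular_sigma):
    """Reliability-weighted (maximum-likelihood) combination of two cues.

    Each cue is weighted by its reliability (inverse variance), so the
    noisier cue contributes less to the combined estimate.
    """
    w_vis = 1.0 / visual_sigma ** 2
    w_ves = 1.0 / vestibular_sigma ** 2
    combined_est = (w_vis * visual_est + w_ves * vestibular_est) / (w_vis + w_ves)
    # The combined estimate is more reliable (lower sigma) than either cue alone.
    combined_sigma = math.sqrt(1.0 / (w_vis + w_ves))
    return combined_est, combined_sigma

# Illustrative example: heading direction (degrees) estimated from a noisy
# visual cue and a noisier vestibular cue.
est, sigma = integrate_cues(visual_est=10.0, visual_sigma=2.0,
                            vestibular_est=16.0, vestibular_sigma=4.0)
```

In this example the combined estimate falls closer to the more reliable visual cue, and its uncertainty is lower than that of either cue on its own, which is why this strategy "leads to reliable perception."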

Using this approach, Angelaki and her team aim to advance our understanding of and ability to diagnose autism. What’s more, by probing the neural basis of multisensory processing, their results may provide future directions for autism treatment.

Subscribe to our newsletter and receive SFARI funding announcements and news