Dissecting social attention in autism using large-sample eye tracking over the Internet

  • Awarded: 2022
  • Award Type: Human Cognitive and Behavioral Science Award
  • Award #: 990500

Among the multiple cognitive processes that are atypical in people with autism, attention stands out as one of the most important. Adults with autism show clearly atypical visual attention, as documented in a large number of studies that used eye tracking across simple tasks, complex images and videos. Infants with autism are also likely to have atypical attention, which may cause or magnify subsequent difficulties throughout development.

There is a wealth of evidence that attention is a core dysfunctional domain in autism. One reason the evidence is so strong is that attention is relatively straightforward to quantify using eye tracking. Yet two major limitations loom in this literature: subject sample sizes are generally small, and there is little or no estimate of within-subject variability over time. Both limitations are critical to address. Large samples are required to understand how phenotypes are associated with genetic variability and to discover possible subtypes of autism. Within-subject variability must be measured both to distinguish it from between-subject variability and as a metric of interest in its own right. A broad finding in the literature is that people with autism show more variability on tasks than do typically developing individuals, but in general it is not known whether this arises from increased heterogeneity across people with autism (with stable profiles within each individual) or simply from less reliable behavior within individuals (that is, they are noisier within-subject).
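The distinction between these two sources of variability can be made concrete with a simple variance decomposition across repeated sessions. The sketch below is illustrative only (not the project's actual analysis pipeline): it simulates a hypothetical gaze metric, such as proportion of time fixating faces, under the two scenarios described above, and shows that they produce the same overall spread but opposite intraclass correlations (ICC). All variable names and parameter values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def variance_components(data):
    """data: (subjects, sessions) array of a gaze metric.
    Returns between-subject variance, mean within-subject variance,
    and a simple ICC (fraction of variance that is trait-like)."""
    subject_means = data.mean(axis=1)
    between = subject_means.var(ddof=1)       # spread of individual profiles
    within = data.var(axis=1, ddof=1).mean()  # session-to-session noise per person
    icc = between / (between + within)
    return between, within, icc

n_subj, n_sess = 50, 5

# Scenario A: heterogeneous but stable -- subjects differ widely,
# but each subject is consistent across sessions.
stable = (rng.normal(0.6, 0.15, (n_subj, 1))
          + rng.normal(0.0, 0.02, (n_subj, n_sess)))

# Scenario B: noisy within-subject -- subjects have similar true means,
# but individual sessions are unreliable.
noisy = (rng.normal(0.6, 0.02, (n_subj, 1))
         + rng.normal(0.0, 0.15, (n_subj, n_sess)))

for name, d in [("stable", stable), ("noisy", noisy)]:
    b, w, icc = variance_components(d)
    print(f"{name}: between={b:.4f}  within={w:.4f}  ICC={icc:.2f}")
```

With single-session data the two scenarios are indistinguishable, which is why longitudinal, repeated measurements of the kind this project collects are needed to tell them apart.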

In this project, Ralph Adolphs and colleagues aim to tackle these two major limitations using a new technology, webcam-based eye tracking over the Internet, focusing on the topic of social attention (attention to people, faces and their features). The lab will leverage the WebGazer platform [1], which the research team has been developing and validating for the past year, as well as AI-powered eye tracking analyses, to collect webcam-based eye tracking data from three participant populations: (1) the lab’s own participant registry (which offers direct comparison to in-lab eye tracking), (2) participants recruited through a common online platform, Prolific (offering several thousand individuals with and without autism, but presenting some challenges to assessment) and (3) SFARI’s own SPARK cohort (offering genetic associations). Taken together, sampling from these three populations will generate a unique and rich database of large-sample, longitudinal eye tracking data that will be shared with the research community.


1. Papoutsaki A. et al. Proc. IJCAI-16, 3839–3845 (2016)