Autistic young people adaptively use gaze to facilitate joint attention during multi-gestural dyadic interactions

Caruana, Nathan, Nalepka, Patrick, Perez, Glicyr A., Inkley, Christine, Munro, Courtney, Rapaport, Hannah, Brett, Simon, Kaplan, David M., Richardson, Michael J. and Pellicano, Elizabeth (2023) Autistic young people adaptively use gaze to facilitate joint attention during multi-gestural dyadic interactions. Autism. ISSN 1362-3613

PDF (Caruana_etal_2023_Autism) - Published Version
Available under License Creative Commons Attribution Non-commercial.


Autistic people often experience difficulties navigating face-to-face social interactions. Historically, the empirical literature has characterised these difficulties as cognitive ‘deficits’ in social information processing. However, the empirical basis for such claims is lacking, with most studies failing to capture the complexity of social interactions, often distilling them into singular communicative modalities (e.g. gaze-based communication) that are rarely used in isolation in daily interactions. The current study examined how gaze was used in concert with communicative hand gestures during joint attention interactions. We employed an immersive virtual reality paradigm, where autistic (n = 22) and non-autistic (n = 22) young people completed a collaborative task with a non-autistic confederate. Integrated eye-, head- and hand-motion-tracking enabled dyads to communicate naturally with each other while offering objective measures of attention and behaviour. Autistic people in our sample were similarly, if not more, effective in responding to hand-cued joint attention bids compared with non-autistic people. Moreover, both autistic and non-autistic people demonstrated an ability to adaptively use gaze information to aid coordination. Our findings suggest that the intersecting fields of autism and social neuroscience research may have overstated the role of eye gaze during coordinated social interactions.

Lay abstract: Autistic people have been said to have ‘problems’ with joint attention, that is, looking where someone else is looking. Past studies of joint attention have used tasks that require autistic people to continuously look at and respond to eye-gaze cues. But joint attention can also be done using other social cues, like pointing. This study looked at whether autistic and non-autistic young people use another person’s eye gaze during joint attention in a task that did not require them to look at their partner’s face.
In the task, each participant worked together with their partner to find a computer-generated object in virtual reality. Sometimes the participant had to help guide their partner to the object, and other times, they followed their partner’s lead. Participants were told to point to guide one another but were not told to use eye gaze. Both autistic and non-autistic participants often looked at their partner’s face during joint attention interactions and were faster to respond to their partner’s hand-pointing when the partner also looked at the object before pointing. This shows that autistic people can and do use information from another person’s eyes, even when they don’t have to. It is possible that, by not forcing autistic young people to look at their partner’s face and eyes, they were better able to gather information from their partner’s face when needed, without being overwhelmed. This shows how important it is to design tasks that provide autistic people with opportunities to show what they can do.

Item Type: Article
Additional Information: Funding Information: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This project was supported, in part, by Macquarie University Research Fellowships awarded to N.C. and P.N.; a ‘CCD Legacy Grant’ from the ARC Centre of Excellence for Cognition and its Disorders (CE110001021), awarded to N.C., P.N., D.M.K., M.J.R. and E.P.; and Australian Research Council Future Fellowships awarded to M.J.R. (FT180100447) and E.P. (FT190100077). Publisher Copyright: © The Author(s) 2023.
Uncontrolled Keywords: eye contact, gaze, non-verbal communication, social interaction, virtual reality, developmental and educational psychology
Faculty \ School: Faculty of Medicine and Health Sciences > Norwich Medical School
Depositing User: LivePure Connector
Date Deposited: 03 Apr 2024 13:31
Last Modified: 03 Apr 2024 13:31
DOI: 10.1177/13623613231211967

