Publication Date

2014

Document Type

Conference Abstract

Abstract

Background:

Typically developing individuals use the direction of eye gaze and eye fixation/shifting as crucial cues to convey socially relevant information (e.g., like, dislike) to others. In individuals with Autism Spectrum Disorder (ASD), a deviant pattern of mutual eye gaze is a noticeable feature that may be one of the earliest detectable manifestations of impaired social skills, which can lead to further deficits (e.g., delayed development of social cognition and affective construal processes) and significantly affect the quality of their social interactions. Recent studies reveal that children with ASD show superior engagement in robot-based interaction, which can effectively elicit positive behaviors (e.g., eye gaze attention). This suggests that interacting with robots may be a promising intervention approach for children with ASD.

Objectives: The main objective of this multidisciplinary research is to use humanoid robot technology, together with psychological and engineering sciences, to improve the social skills of children with High Functioning Autism (HFA). The designed intervention protocol focuses on different skill sets, such as eye gaze attention, joint attention, and facial expression recognition and imitation. The current study is designed to evaluate the eye gaze patterns of children with ASD during verbal communication with a humanoid robot.

Methods: Participants in this study are 13 male children, ages 7-17 (M = 11 years), diagnosed with ASD. The study employs NAO, an autonomous, programmable humanoid robot from Aldebaran Robotics, to interact with the children in a series of conversations and interactive games across 3 sessions. During the different game segments, NAO and the children exchange stories and converse on different topics. In every session, four cameras installed in the video capturing room, in addition to NAO's front-facing camera, record the entire interaction. The videos were later scored to analyze the children's gaze patterns, i.e., eye gaze fixation and eye gaze shifting, in two contexts: 1) NAO is talking, 2) the child is talking.

Results: To analyze the eye gaze of participants, every video frame was manually coded as Gaze Averted ('0') or Gaze At ('1') with respect to NAO. To accurately analyze the gaze patterns of the children during the conversation, the video segments of 'NAO Talking' and 'Kid Talking' were selected. The averages of four measures were used to report the static and dynamic properties of the eye gaze patterns (a computational sketch of these measures follows the list):

1) ‘NAO talking’: Gaze At NAO (GAN) = 55.3%, Gaze Shifting (GS) = 3.4%, GAN/GS = 34.10, Entropy GS = 0.20

2) ‘Kid talking’: GAN = 43.8%, GS = 4.2%, GAN/GS = 11.6, Entropy GS = 0.27
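
The abstract does not specify exactly how these measures were computed; purely as an illustration, the minimal Python sketch below assumes per-frame binary codes, defines a gaze shift as a change of code between consecutive frames, and uses the binary Shannon entropy of the shift proportion. All function names and the example data are hypothetical.

import numpy as np

def gaze_measures(frames):
    # frames: per-frame codes, 1 = Gaze At NAO, 0 = Gaze Averted.
    frames = np.asarray(frames, dtype=int)
    gan = frames.mean() * 100.0                   # Gaze At NAO, % of frames

    # A gaze shift is assumed to be a change of code between consecutive frames.
    shifts = np.abs(np.diff(frames))
    gs = shifts.mean() * 100.0                    # Gaze Shifting, % of frame transitions

    ratio = gan / gs if gs > 0 else float("inf")  # GAN/GS

    # Binary Shannon entropy (bits) of the shift / no-shift proportion.
    p = shifts.mean()
    entropy = 0.0 if p in (0.0, 1.0) else -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

    return gan, gs, ratio, entropy

# Hypothetical coded segment for one child in one context:
print(gaze_measures([1, 1, 1, 0, 1, 1, 0, 0, 1, 1]))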

Conclusions:

The results indicate that children with ASD make more eye contact and shift their gaze less while NAO is talking (higher GAN/GS and lower Entropy GS); however, they shift their gaze more often and fixate less on the robot while they are speaking. These results will serve as an important basis for advancing the emerging field of robot-assisted therapy for children with ASD.

Copyright Statement / License for Reuse

Creative Commons Attribution 4.0 International License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Comments

Abstract of a paper presented at the International Meeting for Autism Research, May 14-17, 2014 - Atlanta, GA, USA.

Full conference proceedings may be found at: https://insar.confex.com/imfar/2014/webprogram/start.html


