Date of Award

1-1-2015

Document Type

Dissertation

Degree Name

Ph.D.

Organizational Unit

Daniel Felix Ritchie School of Engineering and Computer Science

First Advisor

Mohammad H. Mahoor, Ph.D.

Second Advisor

Bruce Pennington

Third Advisor

Kimon Valavanis

Fourth Advisor

Matthew Rutherford

Keywords

Autism Spectrum Disorder (ASD), Autism therapy setting, Dynamic Action Units, Facial expression, Human Machine Interaction (HMI), Spontaneous facial behavior

Abstract

Digital devices and computing machines such as computers, hand-held devices, and robots are becoming an important part of our daily lives. To build affect-aware intelligent Human-Machine Interaction (HMI) systems, scientists and engineers have aimed to design interfaces that emulate face-to-face communication. Such HMI systems are capable of detecting and responding to users' emotions and affective states. One of the main challenges in producing such an intelligent system is designing a machine that can automatically compute the spontaneous behaviors of humans in real-life settings. Since human facial behavior contains important non-verbal cues, this dissertation studies facial actions and behaviors in HMI systems. The two main objectives of this dissertation are: (1) capturing, annotating, and computing spontaneous facial expressions in a Human-Computer Interaction (HCI) system and releasing a database that allows researchers to study the dynamics of facial muscle movements in both posed and spontaneous data; and (2) developing and deploying a robot-based intervention protocol for autism therapy and modeling the facial behaviors of children with high-functioning autism in a real-world Human-Robot Interaction (HRI) system.

Because of the lack of data for analyzing the dynamics of spontaneous facial expressions, my colleagues and I introduced and released a novel database called the "Denver Intensity of Spontaneous Facial Actions (DISFA)". DISFA describes facial expressions using the Facial Action Coding System (FACS), a gold-standard technique that annotates facial muscle movements in terms of a set of defined Action Units (AUs). This dissertation also introduces an automated system for recognizing DISFA's facial expressions and the dynamics of AUs in a single image or a sequence of facial images. Results illustrate that our automated system can compute AU dynamics with high accuracy (overall reliability ICC = 0.77). In addition, this dissertation investigates and computes the dynamics and temporal patterns of both spontaneous and posed facial actions, which can be used to automatically infer the meaning of facial expressions.
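
The 0.77 figure is an intraclass correlation coefficient between the system's predicted AU intensities and the human FACS coding. The abstract does not say which ICC variant was used; assuming the commonly reported ICC(3,1) (two-way mixed model, single-measure consistency), a minimal sketch of the computation might look like the following. The example intensity values are hypothetical.

```python
import numpy as np

def icc_3_1(ratings):
    """Single-measure consistency ICC(3,1) for a two-way mixed model.

    ratings: (n_targets, k_raters) array -- here, one row per video
    frame and one column per "rater" (e.g. human FACS coder vs. the
    automated system), each cell an AU intensity on the 0-5 scale.
    """
    y = np.asarray(ratings, dtype=float)
    n, k = y.shape
    grand = y.mean()
    ss_rows = k * ((y.mean(axis=1) - grand) ** 2).sum()    # between frames
    ss_cols = n * ((y.mean(axis=0) - grand) ** 2).sum()    # between raters
    ss_err = ((y - grand) ** 2).sum() - ss_rows - ss_cols  # residual
    bms = ss_rows / (n - 1)             # between-target mean square
    ems = ss_err / ((n - 1) * (k - 1))  # residual mean square
    return (bms - ems) / (bms + (k - 1) * ems)

# Hypothetical example: human-coded vs. machine-predicted AU intensities
human   = np.array([0, 1, 3, 5, 2, 0, 4, 1])
machine = np.array([0, 1, 2, 5, 3, 0, 4, 2])
print(icc_3_1(np.column_stack([human, machine])))
```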

Another objective of this dissertation is to analyze and compute the facial behaviors (i.e., eye gaze and head orientation) of individuals in a real-world HRI system. Because children with Autism Spectrum Disorder (ASD) show interest in technology, we designed and conducted a set of robot-based games to study and foster the socio-behavioral responses of children diagnosed with high-functioning ASD. Computing gaze-direction and head-orientation patterns illustrates how individuals with ASD regulate their facial behaviors differently from typically developing children when interacting with a robot. In addition, studying the behavioral responses of participants during the different phases of this study (i.e., baseline, intervention, and follow-up) reveals that, overall, a robot-based therapy setting can be a viable approach for helping individuals with autism.
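
The abstract does not describe how head orientation was computed; as context, one standard approach is to fit a rigid 3D face model to detected 2D facial landmarks with a Perspective-n-Point solver. The sketch below uses OpenCV's solvePnP with a generic six-point head model; the model points, landmark order, and focal-length guess are all illustrative assumptions, not the author's pipeline.

```python
import numpy as np
import cv2

# Generic 3D head model (millimetres): nose tip, chin, eye outer
# corners, mouth corners -- a widely used six-point approximation.
MODEL_POINTS = np.array([
    (0.0,     0.0,     0.0),    # nose tip
    (0.0,    -330.0,  -65.0),   # chin
    (-225.0,  170.0,  -135.0),  # left eye outer corner
    (225.0,   170.0,  -135.0),  # right eye outer corner
    (-150.0, -150.0,  -125.0),  # left mouth corner
    (150.0,  -150.0,  -125.0),  # right mouth corner
], dtype=np.float64)

def head_orientation(image_points, width, height):
    """Return (yaw, pitch, roll) in degrees from six 2D landmarks,
    given as a (6, 2) float array in the MODEL_POINTS order."""
    focal = width  # crude focal-length guess in lieu of calibration
    camera = np.array([[focal, 0,     width / 2],
                       [0,     focal, height / 2],
                       [0,     0,     1]], dtype=np.float64)
    ok, rvec, _ = cv2.solvePnP(MODEL_POINTS, image_points, camera,
                               np.zeros(4))
    if not ok:
        return None
    r, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 matrix
    # Standard Euler-angle decomposition of the rotation matrix
    yaw = np.degrees(np.arctan2(-r[2, 0], np.hypot(r[0, 0], r[1, 0])))
    pitch = np.degrees(np.arctan2(r[2, 1], r[2, 2]))
    roll = np.degrees(np.arctan2(r[1, 0], r[0, 0]))
    return yaw, pitch, roll
```

Aggregating such per-frame angles over the baseline, intervention, and follow-up phases would yield the kind of head-orientation patterns the study compares between groups.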

Publication Statement

Copyright is held by the author. User is responsible for all copyright compliance.

Rights Holder

Seyedmohammad Mavadati

Provenance

Received from ProQuest

File Format

application/pdf

Language

en

File Size

166 p.

Discipline

Electrical engineering, Computer engineering, Psychobiology


