
DORAS | DCU Research Repository


Investigating multi-modal features for continuous affect recognition using visual sensing

Wei, Haolin (2018) Investigating multi-modal features for continuous affect recognition using visual sensing. PhD thesis, Dublin City University.

Abstract
Emotion plays an essential role in human cognition, perception and rational decision-making. In the information age, people spend more time than ever before interacting with computers; however, current technologies such as Artificial Intelligence (AI) and Human-Computer Interaction (HCI) have largely ignored the implicit information of a user's emotional state, leading to an often frustrating and cold user experience. To bridge this gap between human and computer, the field of affective computing has become a popular research topic. Affective computing is an interdisciplinary field encompassing computer science, social science, cognitive science, psychology and neuroscience. This thesis focuses on human affect recognition, which is one of the most commonly investigated areas in affective computing. Although from a psychology point of view emotion is usually defined differently from affect, in this thesis the terms emotion, affect, emotional state and affective state are used interchangeably. Both visual and vocal cues have been used in previous research to recognise a human's affective states. For visual cues, information from the face is often used. Although these systems achieved good performance under laboratory settings, it has proved a challenging task to translate them to unconstrained environments due to variations in head pose and lighting conditions. Since a human face is a three-dimensional (3D) object whose 2D projection is sensitive to the aforementioned variations, recent trends have shifted towards using 3D facial information to improve the accuracy and robustness of such systems. However, these systems are still focused on recognising deliberately displayed affective states, mainly prototypical expressions of the six basic emotions (happiness, sadness, fear, anger, surprise and disgust). To the best of our knowledge, no research has been conducted towards continuous recognition of spontaneous affective states using 3D facial information.
The main goal of this thesis is to investigate the use of 2D (colour) and 3D (depth) facial information to recognise spontaneous affective states continuously. Due to the lack of an existing continuously annotated spontaneous data set containing both colour and depth information, such a data set was created. To better understand the processes involved in affect recognition and to provide a point of comparison for the proposed methods, a baseline system was implemented. The use of colour and depth information for affect recognition was then examined separately. For colour information, an investigation was carried out to explore the performance of various state-of-the-art 2D facial features using different publicly available data sets as well as the captured data set. Experiments were also carried out to study whether it is possible to predict a human's affective state using 2D features extracted from individual facial parts (e.g. eyes and mouth). For depth information, a number of histogram-based features were used and their performance was evaluated. Finally, a multi-modal affect recognition framework utilising both colour and depth information was proposed and its performance evaluated using the captured data set.
Metadata
Item Type: Thesis (PhD)
Date of Award: January 2018
Refereed: No
Supervisor(s): O'Connor, Noel E. and Monaghan, David S.
Subjects: Computer Science > Machine learning
Computer Science > Digital video
DCU Faculties and Centres: DCU Faculties and Schools > Faculty of Engineering and Computing > School of Electronic Engineering
Research Institutes and Centres > INSIGHT Centre for Data Analytics
Use License: This item is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 3.0 License.
Funders: Bell Labs Ireland; Irish Research Council under the Enterprise Partnership scheme; European Commission under Contract FP7-ICT-287723 REVERIE; European Union's Horizon 2020 Framework Programme under Grant Agreement no. 643491.
ID Code: 22160
Deposited On: 05 Apr 2018 11:46 by Noel Edward O'Connor. Last Modified: 19 Jul 2018 15:12
Documents

Full text available as:

PDF (thesis_haolin_wei_final.pdf) - Requires a PDF viewer such as GSview, Xpdf or Adobe Acrobat Reader
3MB
