PhD Defense by Syeda Narjis Fatima

Title: Continuous Emotion Recognition in Dyadic Interactions
Speaker: Syeda Narjis Fatima
Time: June 22, 2020, 11:30

Abstract:
Understanding the emotional dynamics of dyadic interactions is crucial for developing more natural human-computer interaction (HCI) systems. Emotional dependencies and affect context play important roles in dyadic interactions: a participant's emotional state is conveyed through multiple communication channels, such as speech, head and body motion, vocal activity patterns, and non-verbal vocalizations. Recent studies have shown that affect recognition tasks can benefit from incorporating an interaction's context; however, how affect context should be modeled and incorporated into dyadic neural architectures remains a complex and open problem. Motivated by this perspective, this thesis presents a series of related studies targeting emotional dependencies in dyadic interactions to improve continuous emotion recognition (CER). First, we define a convolutional neural network (CNN) architecture for single-subject CER based on speech and body-motion data. We then introduce dyadic CER as a two-stage regression framework and explore ways in which cross-subject affect can be used to improve CER performance for a target subject. Specifically, we propose two dyadic CNN architectures in which the cross-subject contribution to the CER task is achieved by fusing cross-subject affect and feature maps. As a conclusive study, we define the dyadic affect context (DAC) and propose a new convolutional LSTM (ConvLSTM) model that exploits it for dyadic CER. The ConvLSTM model captures local spectro-temporal correlations in speech and body motion as well as long-term affect inter-dependencies between subjects.
Our multimodal analysis demonstrates that modeling and incorporating the DAC in the proposed CER models yields significant performance improvements on the USC CreativeIT database, and the results compare favorably with the state of the art.
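The two-stage dyadic regression idea described above can be sketched in miniature: stage one predicts each participant's affect from that participant's own features, and stage two refines the target subject's prediction by appending the partner's stage-one affect estimate as context. The sketch below is an illustrative toy, not the thesis code: it substitutes closed-form ridge regression in NumPy for the CNN regressors, and the synthetic data, function names, and dimensions are all assumptions made for the example.

```python
import numpy as np

def ridge_fit(X, y, lam=1e-2):
    # Closed-form ridge regression: w = (X^T X + lam*I)^-1 X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def mse(y, y_hat):
    return float(np.mean((y - y_hat) ** 2))

rng = np.random.default_rng(0)
T, d = 200, 8                       # frames, feature dim per subject (toy sizes)
X_a = rng.normal(size=(T, d))       # target subject's features (speech/motion stand-in)
X_b = rng.normal(size=(T, d))       # partner's features

# Synthetic ground truth with a built-in cross-subject dependency:
# the target's affect partly tracks the partner's affect.
w_true = rng.normal(size=d)
y_b = X_b @ w_true
y_a = X_a @ w_true + 0.5 * y_b

# Stage 1: independent single-subject regressors.
w_a = ridge_fit(X_a, y_a)
w_b = ridge_fit(X_b, y_b)
y_b_hat = X_b @ w_b                 # partner's predicted affect = dyadic context

# Stage 2: augment the target's features with the partner's predicted affect.
X_a_ctx = np.hstack([X_a, y_b_hat[:, None]])
w_a_ctx = ridge_fit(X_a_ctx, y_a)

err_single = mse(y_a, X_a @ w_a)
err_dyadic = mse(y_a, X_a_ctx @ w_a_ctx)
print(err_single, err_dyadic)       # dyadic context should lower the error
```

On this synthetic data the stage-two model recovers the cross-subject term that the single-subject regressor cannot explain, so its error drops; the same intuition motivates fusing cross-subject affect into the dyadic CNN architectures.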
