


Predicting Categorical Emotions by Jointly Learning Primary and Secondary Emotions through Multitask Learning

Reza Lotfian and Carlos Busso

Abstract:

Detection of human emotions is an essential part of affect-aware human-computer interaction (HCI). In daily conversations, the preferred way of describing affect is with categorical emotion labels (e.g., sadness, anger, surprise). In categorical emotion classification, multiple descriptors (with different degrees of relevance) can be assigned to a sample. Perceptual evaluations have relied on primary and secondary emotions to capture the ambiguous nature of spontaneous recordings. The primary emotion is the most relevant category perceived by the evaluator. Secondary emotions capture other emotional cues also conveyed in the stimulus. In most cases, the secondary emotion labels are discarded, since assigning a single class label to a sample is preferred from an application perspective. In this work, we take advantage of both types of annotations to improve the performance of emotion classification. We collect the labels from all the annotations available for a sample and generate primary and secondary emotion labels. A classifier is then trained using multitask learning with both primary and secondary emotions. We experimentally show that considering secondary emotion labels during the learning process leads to a relative improvement of 7.9% in F1-score for an 8-class emotion classification task.
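
The sketch below illustrates the multitask idea described in the abstract: a shared representation feeding two heads, one for the primary emotion (single-label) and one for secondary emotions (multi-label). It is not the authors' implementation; the layer sizes, feature dimension, loss weight `alpha`, and all names are illustrative assumptions.

```python
# Minimal sketch of a multitask emotion classifier (assumed architecture, not the paper's code).
import torch
import torch.nn as nn

NUM_CLASSES = 8   # 8-class categorical emotion task (as in the paper's experiments)
FEAT_DIM = 384    # assumed size of the acoustic feature vector per sample

class MultitaskEmotionNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared layers learn a representation used by both tasks
        self.shared = nn.Sequential(
            nn.Linear(FEAT_DIM, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
        )
        # Primary task: single-label classification (softmax via CrossEntropyLoss)
        self.primary_head = nn.Linear(128, NUM_CLASSES)
        # Secondary task: multi-label prediction (independent sigmoids via BCE)
        self.secondary_head = nn.Linear(128, NUM_CLASSES)

    def forward(self, x):
        h = self.shared(x)
        return self.primary_head(h), self.secondary_head(h)

def multitask_loss(primary_logits, secondary_logits,
                   primary_target, secondary_target, alpha=0.5):
    """Combine both task losses; `alpha` (assumed value) weights the secondary task."""
    loss_primary = nn.functional.cross_entropy(primary_logits, primary_target)
    loss_secondary = nn.functional.binary_cross_entropy_with_logits(
        secondary_logits, secondary_target)
    return loss_primary + alpha * loss_secondary
```

At inference time only the primary head would be used to assign the single class label; the secondary head serves as an auxiliary task that regularizes the shared representation during training.
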


Cite as: Lotfian, R., Busso, C. (2018) Predicting Categorical Emotions by Jointly Learning Primary and Secondary Emotions through Multitask Learning. Proc. Interspeech 2018, 951-955, DOI: 10.21437/Interspeech.2018-2464.


BibTeX Entry:

@inproceedings{Lotfian2018,
  author={Reza Lotfian and Carlos Busso},
  title={Predicting Categorical Emotions by Jointly Learning Primary and Secondary Emotions through Multitask Learning},
  year=2018,
  booktitle={Proc. Interspeech 2018},
  pages={951--955},
  doi={10.21437/Interspeech.2018-2464},
  url={http://dx.doi.org/10.21437/Interspeech.2018-2464}
}