Multimodal Affect Classification Using Deep Neural Networks
Authors: Friedhelm Schwenker
Abstract:
Research activities in human-computer interaction have increasingly addressed the integration of emotional intelligence into the overall system, and the recognition of human emotions therefore becomes important in such applications. Human emotions are expressed through various modalities such as voice, facial expressions, hand/body gestures, and bio-physiological patterns; the classification of human emotions should therefore be treated as a multimodal pattern recognition and machine learning problem. In this paper we propose artificial neural networks for the task of information fusion in multimodal affect classification.
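To make the fusion idea concrete, the following is a minimal sketch of decision-level (late) fusion of two modalities with small neural networks: each modality-specific network outputs class posteriors, and the posteriors are averaged. All feature dimensions, the label set, and the (untrained) weights are illustrative assumptions, not the paper's actual architecture or data.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CLASSES = 4  # e.g. angry, happy, neutral, sad (assumed label set)

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mlp_forward(x, w1, b1, w2, b2):
    """One hidden layer with tanh activation, softmax output."""
    h = np.tanh(x @ w1 + b1)
    return softmax(h @ w2 + b2)

def make_params(d_in, d_hidden, d_out):
    """Random (untrained) parameters for this sketch."""
    return (rng.normal(0, 0.1, (d_in, d_hidden)),
            np.zeros(d_hidden),
            rng.normal(0, 0.1, (d_hidden, d_out)),
            np.zeros(d_out))

# Assumed per-modality feature dimensions (hypothetical).
audio_dim, face_dim = 20, 30
audio_params = make_params(audio_dim, 16, N_CLASSES)
face_params = make_params(face_dim, 16, N_CLASSES)

def fuse(audio_feat, face_feat):
    """Average the per-modality class posteriors (late fusion)."""
    p_audio = mlp_forward(audio_feat, *audio_params)
    p_face = mlp_forward(face_feat, *face_params)
    return (p_audio + p_face) / 2.0

# Toy batch of 5 samples with random features.
audio_batch = rng.normal(size=(5, audio_dim))
face_batch = rng.normal(size=(5, face_dim))
probs = fuse(audio_batch, face_batch)
```

The averaging step could equally be replaced by a trained fusion network that takes the concatenated posteriors (or raw features, for early fusion) as input; the choice of fusion level is exactly the design question the paper addresses.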