Abstract: The curse of dimensionality is a well-established phenomenon. However, the properties of high-dimensional data are often poorly understood and overlooked during the process of data modelling and analysis. Similarly, how to optimally fuse different modalities remains an open research question. In this paper, we address these challenges by proposing a novel two-level brain-inspired compression-based optimised multimodal fusion framework for emotion recognition. In the first level, the framework extracts compressed and optimised multimodal features by applying a deep convolutional neural network (CNN) based compression to each modality (i.e. audio, text, and visuals). The second level simply concatenates the extracted optimised and compressed features for classification. The performance of the proposed approach with two different compression levels (i.e. 78% and 98%) is compared with late fusion (class level: 1 dimension; class-probability level: 4 dimensions) and early fusion (feature level: 72,000 dimensions). The simulation results and critical analysis demonstrate up to 10% and 5% performance improvement as compared to the state-of-the-art support vector machine (SVM) and long short-term memory (LSTM) based multimodal emotion recognition systems, respectively. We hypothesise that there exists an optimal level of compression at which optimised multimodal features can be extracted from each modality, leading to a significant performance improvement.
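The two-level pipeline in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: a random linear projection stands in for the CNN-based compression of each modality, and the per-modality feature dimensions are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def compress(features, ratio):
    """Reduce a modality's feature vector to (1 - ratio) of its original size.
    A random linear projection stands in here for the paper's CNN-based encoder."""
    out_dim = max(1, int(round(features.shape[0] * (1.0 - ratio))))
    W = rng.standard_normal((out_dim, features.shape[0])) / np.sqrt(features.shape[0])
    return W @ features

# Hypothetical per-modality feature vectors (dimensions are illustrative only)
audio = rng.standard_normal(1000)
text = rng.standard_normal(500)
visual = rng.standard_normal(2000)

# Level 1: compress each modality independently (the paper's 98% level shown)
compressed = [compress(m, 0.98) for m in (audio, text, visual)]

# Level 2: concatenate the compressed features as input to a classifier
fused = np.concatenate(compressed)
print(fused.shape)  # (70,) -- far smaller than the 3500-dim early-fusion vector
```

At 98% compression the fused vector keeps only 2% of each modality's dimensions, which is the contrast the paper draws against high-dimensional early fusion.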

PDF: https://mandargogate.github.io/papers/SSCI2017-Multimodal-Fusion-Emotion-Recognition.pdf


@inproceedings{gogate2017multimodal,
  author    = {Mandar Gogate and
               Ahsan Adeel and
               Amir Hussain},
  title     = {A novel brain-inspired compression-based optimised multimodal fusion
               for emotion recognition},
  booktitle = {2017 {IEEE} Symposium Series on Computational Intelligence, {SSCI}
               2017, Honolulu, HI, USA, November 27 - Dec. 1, 2017},
  pages     = {1--7},
  year      = {2017},
  url       = {https://doi.org/10.1109/SSCI.2017.8285377},
  doi       = {10.1109/SSCI.2017.8285377},
}
