A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction

Motivation
Despite the many studies investigating emotional and cognitive states, their measurement still poses challenging issues, especially with respect to multimodal, mobile, and transtemporal acquisition. Moreover, most studies limit the validation of their experimental induction to a single subjective modality, and previous work typically restricts induction to either cognitive or emotional elicitation, rarely combining both states in a single dataset. uulmMAC is a database for affective computing and pattern recognition research on emotional and cognitive states, based on the systematic induction of cognitive load (Overload, Underload) and emotional states (Interest, Frustration) as well as neutral states (Normal, Easy). The uulmMAC dataset comprises a total of 95 recording sessions from 57 subjects and is in particular:

  • designed and acquired in a mobile interactive human-computer setting
  • based on multimodal sensor data
  • involving transtemporal data acquisition including different recording times
  • validated via three different subjective modalities.

Combining these challenging aspects of mobile, interactive, multimodal, transtemporal, and validated acquisition into one large dataset covering both cognitive and emotional states is the main contribution of this work. Finally, given the relevance of emotional Frustration and cognitive Overload to the emergence of stress, we believe that our uulmMAC database of emotional and cognitive load states can also be used for affective computing and machine learning applications in stress recognition research.

Distribution

The uulmMAC database is available for non-commercial research. Part A, Part B, or Part A+B can be obtained on request. To get access to the database, you or your supervisor (with a permanent position at a university or non-commercial research institute) must accept the end-user license agreement (EULA).

uulmMAC-EULA (download)

Please download the form, sign and stamp it, and send it to: Dr. Dilana Hazer-Rau, Section Medical Psychology, Ulm University (dilana.hazer-rau "at" uni-ulm.de) or Dr. Friedhelm Schwenker, Institute of Neural Information Processing, Ulm University (friedhelm.schwenker "at" uni-ulm.de). Once you have accepted the EULA, a download link will be made available within 2-3 working days.

News

  • 2020-04: uulmMAC Database paper [7] accepted and published in Sensors (Sensors 2020, 20(8), 2308)
  • 2019-05: Doctoral Thesis by Sascha Meudt [6] published
  • 2017-09: Another preliminary detailed study on physiological signals [5]
  • 2016-12: Two preliminary studies on speech and biopotentials [4] and on multimodal fusion [3]
  • 2016-11: Second study on gestures and postural behaviour [2]

References

  • [7] Hazer-Rau, D., Meudt, S., Daucher, A., Spohrs, J., Hoffmann, H., Schwenker, F., & Traue, H. C. (2020, April). The uulmMAC Database – A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction. Sensors, 20(8), 2308. doi:10.3390/s20082308
  • [6] Meudt, S. (2019). Maschinelle Emotionserkennung in der Mensch-Maschine Interaktion (Doctoral dissertation, Universität Ulm).
  • [5] Daucher, A., Gruss, S., Jerg-Bretzke, L., Walter, S., Hazer-Rau, D. (2017, September). Preliminary classification of cognitive load states in a human machine interaction scenario. In Proceedings of the International Conference on Companion Technology ICCT’17 (pp. 1–5).
  • [4] Kindsvater, D., Meudt, S., & Schwenker, F. (2016, December). Fusion architectures for multimodal cognitive load recognition. In IAPR Workshop on Multimodal Pattern Recognition of Social Signals in Human-Computer Interaction (pp. 36-47). Springer, Cham.
  • [3] Held, D., Meudt, S., & Schwenker, F. (2016, December). Bimodal Recognition of Cognitive Load Based on Speech and Physiological Changes. In IAPR Workshop on Multimodal Pattern Recognition of Social Signals in Human-Computer Interaction (pp. 12-23). Springer, Cham.
  • [2] Hihn, H., Meudt, S., & Schwenker, F. (2016, November). Inferring mental overload based on postural behavior and gestures. In Proceedings of the 2nd workshop on Emotion Representations and Modelling for Companion Systems (pp. 1-4).
  • [1] Hihn, H., Meudt, S., & Schwenker, F. (2016, September). On gestures and postural behavior as a modality in ensemble methods. In IAPR Workshop on Artificial Neural Networks in Pattern Recognition (pp. 312-323). Springer, Cham.