
Biosignal Datasets for Emotion Recognition

June 8, 2016 by Gustavo Tondello

Written by Mike Schaekermann.

At the HCI Games Group, we love looking at emotion as a core driver of gameplay experience. One common technique for finding out how players experience a game prototype, and which affective responses in-game interaction triggers, is to ask players how they feel after playing the game. For this purpose, different affective dimensions like arousal (i.e., the level of excitement), valence (i.e., good or bad), or dominance (i.e., how much the player felt in control) are often used to quantify subjective phenomena. As you can imagine, these types of self-reports are extremely valuable for iteratively improving a game prototype.

Self-Assessment Manikin used to quantify affective dimensions (valence and arousal)
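
The datasets summarized below all collect such ratings on discrete 1-to-9 scales. As an illustrative sketch (the class and field names are our own invention, not taken from any of the datasets), a SAM-style self-report could be represented like this in Python:

    # Illustrative only: a SAM-style self-report on the 1-9 scales
    # used by the datasets discussed in this post.
    from dataclasses import dataclass

    @dataclass
    class SelfReport:
        arousal: int    # 1 (calm) .. 9 (excited)
        valence: int    # 1 (negative) .. 9 (positive)
        dominance: int  # 1 (controlled) .. 9 (in control)

        def __post_init__(self) -> None:
            # Reject ratings outside the discrete 1-9 range.
            for name in ("arousal", "valence", "dominance"):
                value = getattr(self, name)
                if not 1 <= value <= 9:
                    raise ValueError(f"{name} must be in 1..9, got {value}")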

However, one drawback of post-hoc questionnaires is that certain types of emotions are temporary bursts of experience that may fade over time. This becomes a problem if the goal is to investigate affective responses in real time. To work around this problem, the literature suggests using biosignals such as brain activity (e.g., measured through electroencephalography or magnetoencephalography), heart rate (e.g., through electrocardiography), skin conductance level, skin temperature, or muscular activity (through electromyography) [1]. A major focus of the HCI Games Group is in this area.

Repidly is a collaborative tool for biosignal-based gameplay analysis, developed in the HCI Games Group
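
To make the biosignal idea concrete, here is a minimal sketch of extracting one of the measures mentioned above, heart rate, from a raw ECG trace. The filtering choices, thresholds, and the 256 Hz sampling rate (borrowed from MAHNOB-HCI below) are our own assumptions, not a prescribed pipeline:

    # A minimal sketch: mean heart rate (bpm) from a raw ECG signal.
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    def heart_rate_bpm(ecg: np.ndarray, fs: int = 256) -> float:
        # Band-pass around the QRS complex (5-15 Hz) to suppress
        # baseline wander and high-frequency noise.
        b, a = butter(3, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, ecg)
        # Detect R-peaks: at least 0.4 s apart (i.e., at most 150 bpm)
        # and higher than one standard deviation of the signal.
        peaks, _ = find_peaks(filtered, distance=int(0.4 * fs),
                              height=filtered.std())
        rr = np.diff(peaks) / fs        # seconds between consecutive beats
        return 60.0 / rr.mean()         # beats per minute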

The main challenge of this biometric approach lies in finding a reliable mapping from physiological patterns to the affective states experienced by the player. The goal is to quantify certain affective dimensions in real time, without interrupting the game flow to interview the player about their emotional state. It is therefore helpful to create automated methods that estimate the amplitudes of affective dimensions from physiological measurements on the fly. However, datasets containing all the information needed to create such automated methods are costly, hard to find online, and tedious to acquire yourself.
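
As a sketch of what such an automated method might look like, the following trains a simple classifier that maps per-trial physiological features to binary self-reported arousal. The synthetic features, the arousal threshold, and the choice of an SVM are all our own assumptions for illustration, not a method prescribed by any of the datasets:

    # A minimal sketch: classify high vs. low arousal from features.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 12))    # e.g., HR, SCL, EEG band powers per trial
    y = rng.integers(0, 2, size=200)  # e.g., self-reported arousal >= 5

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores = cross_val_score(model, X, y, cv=5)
    print(f"mean accuracy: {scores.mean():.2f}")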

In this blog post, we hope to provide a usable synopsis of freely available datasets for anyone interested in doing research in this area. The four datasets we summarize are MAHNOB-HCI (Soleymani et al.), EMDB (Carvalho et al.), DEAP (Koelstra et al.), and DECAF (Abadi et al.). All of these datasets were acquired by presenting multimedia content (i.e., images or videos) to participants and recording their physiological responses to this content. To allow for a mapping from physiological to affective responses, all of the datasets also contain subjective self-reports on affective dimensions like arousal, valence, and dominance.


MAHNOB-HCI (Multimodal Database for Affect Recognition and Implicit Tagging)

Authors/Institutions: Soleymani et al. (Imperial College London, Intelligent Behaviour Understanding Group)
Year: 2011
Website: MAHNOB Databases
Publication: A Multimodal Database for Affect Recognition and Implicit Tagging [2]
Participants: 27 (11 male, 16 female)
Stimuli: fragments of movies and pictures
Recorded Signals: 6 face and body cameras (60 fps), head-worn microphone (44.1 kHz), eye gaze tracker (60 Hz), electrocardiogram (ECG), electroencephalogram (32 channels), skin temperature and respiration amplitude (all biosignals at 256 Hz)
Subjective scores: arousal, valence, dominance, and predictability (all on a scale from 1 to 9)
How to access: Follow the instructions on the request a user account page. In order to receive access to the dataset, you will need to sign an end user license agreement.


EMDB (Emotional Movie Database)

Authors: Carvalho et al.
Year: 2012
Publication: The Emotional Movie Database (EMDB): A Self-Report and Psychophysiological Study [3]
Participants: 113 for self-report ratings of movie clips and 32 for biometrics
Stimuli: 52 movie clips without auditory content from different emotional categories
Recorded Signals: skin conductance level (SCL) and heart rate (HR)
Subjective scores: arousal, valence and dominance (all on a scale from 1 to 9)
How to access: Send a request for the database to EMDB@psi.uminho.pt and follow the instructions in the response. When submitting a request to use the database, you will be asked to confirm that it will be used solely for non-profit scientific research and that the database will not be reproduced or broadcast in violation of international copyright laws.


DEAP (Database for Emotion Analysis using Physiological Signals)

Authors/Institutions: Koelstra et al. (peta media, Queen Mary University of London, University of Twente, University of Geneva, EPFL)
Year: 2012
Website: DEAP Dataset
Publication: DEAP: A Database for Emotion Analysis; Using Physiological Signals [4]
Participants: 32
Stimuli: 40 one-minute excerpts from music videos from last.fm (retrieved using affective tags, video highlight detection, and an online assessment tool)
Recorded Signals: electroencephalogram (32 channels at 512 Hz), skin conductance level (SCL), respiration amplitude, skin temperature, electrocardiogram, blood volume by plethysmograph, electromyograms of Zygomaticus and Trapezius muscles (EMGs), electrooculogram (EOG), face video (for 22 participants)
Subjective scores: arousal, valence, like/dislike, dominance (all on a scale from 1 to 9), familiarity (on a scale from 1 to 5)
How to access: Download, print, and sign the end user license agreement and send a signed copy to the authors. You will then receive a password to be entered when you click on the download links, as described here.
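
For orientation, here is a minimal loading sketch assuming the preprocessed Python release of DEAP (files s01.dat through s32.dat), whose documented layout is 40 trials x 40 channels x 8064 samples (63 s at 128 Hz) plus per-trial ratings of valence, arousal, dominance, and liking; the file path is a placeholder:

    # A minimal sketch: load one participant file from the
    # preprocessed Python release of DEAP.
    import pickle

    with open("data_preprocessed_python/s01.dat", "rb") as f:
        # The files are Python 2 pickles; latin1 decoding is needed in Python 3.
        subject = pickle.load(f, encoding="latin1")

    data = subject["data"]      # shape (40, 40, 8064): trial x channel x time
    labels = subject["labels"]  # shape (40, 4): valence, arousal, dominance, liking
    print(data.shape, labels.shape)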


DECAF (Multimodal Dataset for Decoding Affective Physiological Responses)

Authors/Institutions: Abadi et al. (University of Trento, FBK, ADSC, CiMeC, Semantics & Knowledge Innovation Lab, Telecom Italia)
Year: 2015
Website: MHUG
Publication: DECAF: MEG-Based Multimodal Database for Decoding Affective Physiological Responses [5]
Participants: 30
Stimuli: 40 one-minute excerpts from music videos (same as in the DEAP dataset) and 36 movie clips
Recorded Signals: magnetoencephalogram (MEG), horizontal electrooculogram (hEOG), electrocardiogram (ECG), electromyogram of the Trapezius muscle (tEMG), near-infrared face video
Subjective scores: arousal, valence, dominance (all on a scale from 1 to 9), time-continuous emotion annotations for movie clips (from 7 experts)
How to access: Download, print, and sign the end user license agreement and send a signed copy to the authors. You will then receive a password to be entered when you click on the download links, as described here.


As you can see, not all of the datasets use the same stimuli or collect the same types of physiological and self-reported data. It is interesting to note that the common denominator for self-reported data is the three affective scores (arousal, valence, and dominance), collected on a discrete scale from 1 to 9. As for physiological recordings, heart activity is the only measure found in all four datasets. EEG, skin temperature, and respiration amplitude are collected in MAHNOB-HCI and DEAP. EMG of the Trapezius muscle and EOG are collected in both DEAP and DECAF. Skin conductance levels are available in DEAP and EMDB. Face videos are recorded for MAHNOB-HCI, DEAP, and DECAF. Finally, DECAF is the only dataset providing MEG brain activity data. Luckily, however, because DECAF uses the same music video stimuli as DEAP, correlations between the EEG responses found in DEAP and the MEG responses found in DECAF can be investigated.
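
As a compact restatement of these overlaps, derived solely from the summaries in this post (signal names are normalized by us, e.g., ECG and heart rate are both labelled "cardiac"):

    # Dataset -> recorded signal families, per the summaries above.
    SIGNALS = {
        "MAHNOB-HCI": {"cardiac", "EEG", "skin temperature", "respiration", "face video"},
        "EMDB":       {"cardiac", "skin conductance"},
        "DEAP":       {"cardiac", "EEG", "skin temperature", "respiration",
                       "skin conductance", "EMG", "EOG", "face video"},
        "DECAF":      {"cardiac", "MEG", "EMG", "EOG", "face video"},
    }

    # Signals shared by every dataset: only cardiac activity.
    print(set.intersection(*SIGNALS.values()))  # {'cardiac'}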

We hope to have provided a fair and useful overview of these four datasets, and we look forward to seeing more research in the area of biosignal-based affect recognition in the future, as we are really excited about this topic.


Contact

If you are interested and have more questions, I would love for you to get in touch, either via Twitter (@HardyShakerman) or email (mikeschaekermann@gmail.com). Cheers, Mike Schaekermann


Acknowledgements

We would like to thank the creators of these wonderful datasets for investing an incredible amount of time and work into data collection and for making these invaluable datasets freely available for research.



References

[1] Nacke, L., & Lindley, C. A. (2008). Flow and immersion in first-person shooters: measuring the player’s gameplay experience. In Proceedings of the 2008 Conference on Future Play: Research, Play, Share (pp. 81–88). New York, NY, USA: ACM. http://doi.org/10.1145/1496984.1496998

[2] Soleymani, M., Lichtenauer, J., Pun, T., & Pantic, M. (2012). A Multimodal Database for Affect Recognition and Implicit Tagging. IEEE Transactions on Affective Computing, 3(1), 42–55. http://doi.org/10.1109/T-AFFC.2011.25

[3] Carvalho, S., Leite, J., Galdo-Álvarez, S., & Gonçalves, Ó. F. (2012). The Emotional Movie Database (EMDB): A Self-Report and Psychophysiological Study. Applied Psychophysiology and Biofeedback, 37(4), 279–294. http://doi.org/10.1007/s10484-012-9201-6

[4] Koelstra, S., Mühl, C., Soleymani, M., Lee, J.-S., Yazdani, A., Ebrahimi, T., Pun, T., Nijholt, A., & Patras, I. (2012). DEAP: A Database for Emotion Analysis; Using Physiological Signals. IEEE Transactions on Affective Computing, 3(1), 18–31. http://doi.org/10.1109/T-AFFC.2011.15

[5] Abadi, M. K., Subramanian, R., Kia, S. M., Avesani, P., Patras, I., & Sebe, N. (2015). DECAF: MEG-Based Multimodal Database for Decoding Affective Physiological Responses. IEEE Transactions on Affective Computing, 6(3), 209–222. http://doi.org/10.1109/TAFFC.2015.2392932
