EEG Dataset for the Recognition of Different Emotions Induced in Voice-User Interaction

Ga Young Choi, Jong Gyu Shin, Ji Yoon Lee, Jun Seok Lee, In Seok Heo, Ha Yeong Yoon, Wansu Lim, Jin Woo Jeong, Sang Ho Kim, Han Jeong Hwang

Research output: Contribution to journal · Article · peer-review

Abstract

Electroencephalography (EEG)-based open-access datasets are available for emotion recognition studies, in which external auditory/visual stimuli are used to artificially evoke pre-defined emotions. In this study, we provide a novel EEG dataset containing the emotional information induced during realistic human-computer interaction (HCI) with a voice user interface system that mimics natural human-to-human communication. To validate our dataset via neurophysiological investigation and binary emotion classification, we applied a series of signal processing and machine learning methods to the EEG data. The maximum classification accuracy ranged from 43.3% to 90.8% across the 38 subjects, and the classification features could be interpreted neurophysiologically. Because our EEG data were acquired in a natural HCI environment, they could be used to develop a reliable HCI system. In addition, the auxiliary physiological data measured simultaneously with the EEG, i.e., electrocardiogram, photoplethysmogram, galvanic skin response, and facial images, also showed plausible results; these signals could be utilized for automatic emotion discrimination independently of the EEG data, as well as together with them via the fusion of multi-modal physiological datasets.
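
The abstract does not detail the specific signal processing and machine learning methods applied for binary emotion classification, so the sketch below is only a minimal illustration of one conventional pipeline of this kind: band-pass filtering, log band-power features, and a linear discriminant classifier evaluated with cross-validation. The sampling rate FS, the frequency bands, the epoch dimensions, and the randomly generated data and labels are hypothetical placeholders, not values taken from the dataset.

import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 250  # assumed sampling rate (Hz); check the dataset description for the actual value
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def bandpass(x, lo, hi, fs=FS, order=4):
    # Zero-phase Butterworth band-pass filter along the sample axis.
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def band_power_features(epochs, fs=FS):
    # epochs: (n_trials, n_channels, n_samples) -> log band-power features
    # via Welch power spectral density, averaged within each frequency band.
    f, psd = welch(epochs, fs=fs, nperseg=fs, axis=-1)
    feats = [psd[..., (f >= lo) & (f < hi)].mean(axis=-1) for lo, hi in BANDS.values()]
    return np.log(np.concatenate(feats, axis=-1))

# Placeholder data standing in for epoched EEG: 100 trials, 32 channels,
# 4-s epochs, with random binary emotion labels (e.g., two emotion classes).
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((100, 32, 4 * FS))
y = rng.integers(0, 2, size=100)

X = band_power_features(bandpass(X_raw, 1.0, 45.0))
clf = LinearDiscriminantAnalysis()
print("5-fold CV accuracy: %.3f" % cross_val_score(clf, X, y, cv=5).mean())

With random placeholder data this pipeline should score near chance level (about 50%); on real epoched EEG from the dataset, the same feature-plus-classifier structure is a common starting point for the per-subject binary classification the abstract reports.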

Original language: English
Article number: 1084
Journal: Scientific Data
Volume: 11
Issue number: 1
DOIs
State: Published - Dec 2024
