EEG-Based Emotion Recognition Datasets for Virtual Environments: A Survey
DOI: https://doi.org/10.70705/ppp.ltcs.2024.v02.i01.pp18-31

Keywords: virtual environments (VEs), electroencephalogram (EEG), biosignal sensor

Abstract
One of the major challenges in virtual environments (VEs) is that users struggle to operate increasingly complex
systems. If computers could recognize human emotions, the quality and consistency of human-machine interaction
would improve substantially. A system that detects and responds to human emotions may be realized by modeling the
human emotional state and employing an electroencephalogram (EEG) device as a biosignal sensor. This paper presents
a comprehensive overview of methods for EEG-based emotion recognition, categorizing them by time-domain analysis,
frequency-domain analysis, and feature extraction. The review focuses on datasets used in recent studies of
EEG-based emotion classification and discusses the challenges associated with this field. Two families of
artificial-intelligence algorithms, machine learning and deep learning, have become prominent in emotion
recognition; the data used to train and evaluate such models must include emotional ratings or labels. Building a
rigorous experimental setting and a scientifically valid experimental user model requires specialized knowledge of
psychology, which can be challenging for some researchers, particularly those with a computer-science background.
As a result, many researchers developing emotion-recognition models prefer to validate their approaches and
benchmark them against comparable work on established labeled datasets. This survey is therefore offered in the
hope of laying the groundwork for future efforts to improve the virtual interaction experience through modeling of
human affect.
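To make the frequency-domain category of methods mentioned in the abstract concrete, the sketch below computes a common class of EEG features: per-band spectral power (delta, theta, alpha, beta, gamma) estimated with Welch's method. This is an illustrative example only, not a method from any specific surveyed study; the function name `band_powers` and the exact band boundaries are our assumptions (boundaries vary slightly across the literature).

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

# Commonly used EEG frequency bands in Hz (exact boundaries vary by study).
BANDS = {
    "delta": (1, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta": (13, 30),
    "gamma": (30, 45),
}

def band_powers(signal, fs):
    """Return the power in each EEG band from a single-channel signal.

    Uses Welch's method to estimate the power spectral density (PSD),
    then integrates the PSD over each band's frequency range.
    """
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * fs))
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = trapezoid(psd[mask], freqs[mask])
    return powers

# Usage: a synthetic 10 Hz oscillation (inside the alpha band) at 128 Hz.
fs = 128
t = np.arange(0, 4, 1 / fs)
alpha_wave = np.sin(2 * np.pi * 10 * t)
features = band_powers(alpha_wave, fs)
print(max(features, key=features.get))  # the alpha band dominates
```

In a real pipeline, such band powers would be computed per channel and per trial, then fed to a machine-learning or deep-learning classifier together with the emotional labels discussed above.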

