Article Information
Abstract
English
People have long conveyed messages through non-verbal communicative means such as facial expressions and voice tones. Recently, many researchers have attempted to recognize emotions conveyed through intrinsic bio-signals for human-computer interaction or human-robot interaction. If applications of bio-signals, which have largely been confined to treatment and rehabilitation of the disabled, are extended to entertainment areas such as video games, artificial life, and interactive theater, they can overcome the drawbacks of traditional interfaces and allow a more natural experience of human-oriented content. In this paper, we present a mechanism for analyzing electroencephalogram (EEG) signals elicited by emotional stimuli and classifying the corresponding emotions. We also validate the methods to assess their usability for brain-computer interfaces. The partial derivatives of the EEG signals are taken as features for a training data set obtained through facial expression mirroring. Four emotions (neutral, anger, happiness, and surprise) are classified using support vector machines. The experimental results suggest innovative potential in areas such as games, virtual reality, agents, and other forms of entertainment.
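The abstract outlines the pipeline at a high level: derivative-based EEG features fed to a multi-class support vector machine over four emotion classes. The following is a minimal sketch of that kind of pipeline, assuming scikit-learn; the channel count, window length, feature statistics, and synthetic data are illustrative assumptions, not the authors' actual setup.

```python
# Hedged sketch: finite-difference (derivative) features of EEG epochs + SVM classifier.
# Shapes, sampling rate, and data below are placeholders, not the paper's recordings.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

EMOTIONS = ["neutral", "anger", "happiness", "surprise"]

def derivative_features(epoch, fs=256):
    """epoch: (n_channels, n_samples) EEG window.
    Returns per-channel statistics of the first-order time derivative."""
    d = np.diff(epoch, axis=1) * fs          # finite-difference approximation of d/dt
    return np.concatenate([d.mean(axis=1), d.std(axis=1)])

# Synthetic placeholder data: 200 epochs, 8 channels, 1-second windows at 256 Hz.
rng = np.random.default_rng(0)
X = np.stack([derivative_features(rng.standard_normal((8, 256))) for _ in range(200)])
y = rng.integers(0, len(EMOTIONS), size=200)   # labels would come from facial expression mirroring

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale")  # multi-class SVM (one-vs-one by default)
clf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

On real recordings, each epoch would be extracted from preprocessed EEG around an emotional stimulus, and the predicted label index would map back into EMOTIONS.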
Table of Contents
1. Introduction
2. Data Acquisition
2.1 The Experimental Setup
2.2 The Procedure of the Experiments
2.3 The Preprocessing
3. Feature Extractions
4. Expression Classification
5. Experiments and Discussion
5.1 EEG Classifier
5.2 Performance Test
6. Conclusions
Acknowledgements
References