Original Text Information
Abstract
English
Analysis of human emotion plays an important role in human-machine interaction. The most expressive way to extract and understand human emotion is through facial expression analysis. This paper proposes a novel method for recognizing multiple emotions from facial expressions in mobile environments. In particular, we formulate a classification model for ambiguous facial emotions using the variance of the estimated facial feature points. First, we extract 65 landmark points from the input stream using an active appearance model, and we then analyze the changes in the feature point values and recognize the facial emotion with a fuzzy k-NN classifier. Finally, five types of emotions are recognized and classified from the facial expression. To evaluate the proposed approach, we assess the success rate on iPhone camera views, achieving a best accuracy of 93% in the experiments. The results show that the proposed method performs well in recognizing facial emotions in mobile environments, and the implemented system serves as an example of augmented reality, displaying a combination of real face video and a virtual animation of the user’s avatar.
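The abstract describes classifying emotions from AAM landmark features with a fuzzy k-NN rule. The following is a minimal sketch of that classification step only, assuming the standard fuzzy k-NN formulation (Keller et al.); the landmark-variance feature extraction, the five-class label set, and the fuzziness parameter m = 2 are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Assumed five-emotion label set; the paper does not list the classes here.
EMOTIONS = ["neutral", "happy", "sad", "surprise", "angry"]

def fuzzy_knn(train_x, train_u, query, k=5, m=2.0):
    """Return class membership degrees of `query` under fuzzy k-NN.

    train_x : (N, D) array of landmark-based feature vectors
    train_u : (N, C) array of fuzzy class memberships of training samples
    query   : (D,)   feature vector computed from the current frame
    """
    dists = np.linalg.norm(train_x - query, axis=1)
    nn = np.argsort(dists)[:k]                  # k nearest training samples
    d = np.maximum(dists[nn], 1e-12)            # guard against zero distance
    w = d ** (-2.0 / (m - 1.0))                 # inverse-distance weights
    u = (train_u[nn] * w[:, None]).sum(axis=0) / w.sum()
    return u                                    # memberships sum to 1

# Usage: the predicted emotion is the class with the highest membership.
# u = fuzzy_knn(train_x, train_u, frame_features)
# emotion = EMOTIONS[int(np.argmax(u))]
```

The membership output (rather than a hard label) is what makes the rule suited to the ambiguous emotions mentioned in the abstract, since a frame can carry partial degrees of several emotions at once.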
Table of Contents
1. Introduction
2. Related Works
3. Active Appearance Model
3.1. Shape Model
3.2. Appearance Model
3.3. AAM Fitting
4. Proposed Emotion Recognition Approach
4.1. Emotion Classification Method
4.2. Classifier based on Fuzzy k-Nearest Neighbor
5. Experimental Results
6. Conclusion
Acknowledgements
References
