Article Information
Abstract
This paper presents an alternative method for driving MetaHuman facial animations with MediaPipe, offering a versatile substitute for the iPhone-based Live Link system. Our approach captures facial expressions with a range of camera devices, including webcams, laptop cameras, and Android phones; processes the video for facial landmark detection; and applies the resulting landmarks in Unreal Engine Blueprints to animate MetaHuman characters in real time. Techniques such as the Eye Aspect Ratio (EAR) for blink detection and the One Euro Filter for data smoothing ensure accurate and responsive animations. Experimental results demonstrate that the system is a cost-effective and flexible alternative for users without an iPhone, broadening access to advanced facial-capture technology for digital media and interactive environments. This research offers a practical and adaptable method for real-time facial animation, with future work aimed at integrating more sophisticated emotion-detection features.
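The two signal-processing techniques named in the abstract are both well documented in the literature, and a minimal sketch of each may help orient the reader. The code below is illustrative only, not the paper's implementation: the eye-landmark ordering follows Soukupová and Čech's EAR definition (p1/p4 are the horizontal eye corners, (p2, p6) and (p3, p5) are vertical pairs), and the filter parameters (`min_cutoff`, `beta`, `d_cutoff`) are assumed defaults, not values from the paper.

```python
import math


def ear(eye):
    """Eye Aspect Ratio from six (x, y) eye landmarks ordered p1..p6.

    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|): high when the eye is
    open, near zero when it blinks, so a simple threshold detects blinks.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))


class OneEuroFilter:
    """Minimal One Euro Filter (Casiez et al.) for one scalar channel.

    A low-pass filter whose cutoff adapts to signal speed: heavy
    smoothing (low jitter) when the landmark is still, light smoothing
    (low lag) when it moves fast. Parameter defaults are assumptions.
    """

    def __init__(self, min_cutoff=1.0, beta=0.007, d_cutoff=1.0):
        self.min_cutoff = min_cutoff
        self.beta = beta
        self.d_cutoff = d_cutoff
        self.x_prev = None
        self.dx_prev = 0.0

    @staticmethod
    def _alpha(cutoff, dt):
        # Smoothing factor of an exponential filter at this cutoff/rate.
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / dt)

    def __call__(self, x, dt):
        if self.x_prev is None:
            self.x_prev = x
            return x
        # Estimate and smooth the derivative, then adapt the cutoff.
        dx = (x - self.x_prev) / dt
        a_d = self._alpha(self.d_cutoff, dt)
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff, dt)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev = x_hat, dx_hat
        return x_hat
```

In a pipeline like the one the abstract describes, each landmark coordinate (or the EAR value itself) would pass through its own `OneEuroFilter` instance per frame before being sent to the Unreal Engine Blueprint.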
Table of Contents
1. Introduction
2. Background
3. Experiments and Process
3.1. Initial Setup and Programming
3.2. Eye Aspect Ratio (EAR) Calculation
3.3. Smoothing Data with One Euro Filter
3.4. Integrating Data into Unreal Engine
4. Results and Discussion
5. Conclusion
References
