Abstract
This paper proposes an algorithm for 3D hand tracking in a learned hierarchical latent variable space. A Hierarchical Gaussian Process Latent Variable Model (HGPLVM) is employed to learn the hierarchical latent space of hand motion and, simultaneously, the nonlinear mapping from that latent space to the pose space. Nonlinear mappings from the hierarchical latent space to the space of hand images are constructed with radial basis function (RBF) interpolation, so that particles can be projected into hand images and measured directly in the image space. A particle filter with fewer particles then tracks the hand in the learned low-dimensional hierarchical space. Continuous hand gestures are modeled with a Hierarchical Conditional Random Field, which captures extrinsic class dynamics and simultaneously learns the relationship between the motions of hand parts and different hand gestures. Experimental results show that the proposed method tracks the articulated hand robustly, and satisfactory recognition performance is achieved on a user-defined hand gesture dataset.
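The abstract names two computational ingredients: an RBF interpolation that maps points in the learned latent space to the observation space, and a particle filter that runs in that low-dimensional space. The Python/NumPy sketch below illustrates the general idea only; the Gaussian kernel width `gamma`, the random-walk latent dynamics, and the image-likelihood callback `observe` are illustrative assumptions and do not reproduce the paper's HGPLVM or its measurement model.

```python
import numpy as np

def fit_rbf_mapping(Z, Y, gamma=1.0):
    """Fit a Gaussian-RBF interpolant from latent points Z (N x d)
    to observation vectors Y (N x D); returns the weight matrix W."""
    d2 = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)  # pairwise squared distances
    K = np.exp(-gamma * d2)                                     # N x N kernel matrix
    # A small ridge term keeps the linear solve numerically stable
    return np.linalg.solve(K + 1e-6 * np.eye(len(Z)), Y)

def rbf_map(z, Z, W, gamma=1.0):
    """Map one latent point z to the observation space via the fitted weights."""
    k = np.exp(-gamma * np.sum((Z - z) ** 2, axis=1))           # kernel vector of length N
    return k @ W

def particle_filter_step(particles, observe, sigma=0.05, rng=None):
    """One predict / weight / resample step carried out directly in the
    low-dimensional latent space; observe(z) returns an image-space likelihood."""
    rng = rng or np.random.default_rng()
    particles = particles + sigma * rng.standard_normal(particles.shape)  # predict
    weights = np.array([observe(z) for z in particles], dtype=float)      # measure
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)      # resample
    return particles[idx]
```

In use, `rbf_map` plays the role of projecting each latent-space particle into a hand image so it can be scored against the observation, and the state estimate is taken from the resampled particle set.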
Table of Contents
1. Introduction
2. Human Hand Tracking Method
2.1. Appearance-Based Method
2.2. Model-Based Method
3. Hierarchical Latent Variable Space of Human Hand Movement
4. 3D Human Hand Tracking
4.1. Non-linear Mapping from Hierarchical Latent Variable Space to Image Space
4.2. Tracking Algorithm
5. Experiment Design and Discussion
5.1. Experimental Design
5.2. Experimental Results and Analysis
6. Conclusion
References
