
Human-computer Interaction using Pointing Gesture based on an Adaptive Virtual Touch Screen

Abstract

Among the various hand gestures, the pointing gesture is highly intuitive and requires no a priori assumptions. The major difficulties in pointing gesture recognition are tracking the pointing fingertip and the unreliability of direction estimation. A novel real-time method is developed for pointing gesture recognition using Kinect-based depth images and skeletal point tracking. Instead of estimating the pointing direction, an adaptive virtual touch screen is constructed. When a user stands at a certain distance from a large screen and performs pointing gestures, he interacts with the virtual touch screen as if it were right in front of him. The proposed method is suitable for both large and small pointing gestures, and it is not sensitive to user characteristics or environmental changes. Comparative experiments show that the proposed approach is robust and efficient for realizing human-computer interaction based on pointing gesture recognition.
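
To make the mapping concrete, the following is a minimal Python sketch of one way such a virtual touch screen could work: a Kinect-tracked fingertip is located within a plane placed in front of the user and converted to display coordinates. The plane size, the shoulder-centred placement, the touch threshold, and names such as fingertip_to_display are illustrative assumptions, not details taken from the paper (the actual construction is described in Section 3).

```python
import numpy as np

# Minimal sketch (not the paper's method): map a Kinect-tracked fingertip onto
# an assumed virtual touch-screen plane in front of the user, then to display
# pixels. All sizes, thresholds, and names below are illustrative assumptions.

DISPLAY_W, DISPLAY_H = 1920, 1080   # target display resolution in pixels
PLANE_W, PLANE_H = 0.60, 0.34       # assumed virtual screen size in metres
TOUCH_DEPTH = 0.45                  # assumed forward reach counted as a "touch" (metres)

def fingertip_to_display(fingertip, shoulder):
    """Map a fingertip position (Kinect camera space, metres) to display pixels.

    The virtual screen is modelled as a PLANE_W x PLANE_H rectangle centred on
    the user's shoulder joint; a fingertip extended at least TOUCH_DEPTH metres
    in front of the shoulder is treated as touching it.
    """
    f = np.asarray(fingertip, dtype=float)
    s = np.asarray(shoulder, dtype=float)

    # Kinect depth (z) grows away from the camera, so "in front of the user"
    # means the fingertip z is smaller than the shoulder z.
    if (s[2] - f[2]) < TOUCH_DEPTH:
        return None                 # hand not extended far enough: no touch

    # Normalise the fingertip offset inside the virtual rectangle.
    u = (f[0] - s[0]) / PLANE_W + 0.5
    v = (s[1] - f[1]) / PLANE_H + 0.5   # camera y grows up, display y grows down
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return None                 # fingertip outside the virtual screen

    return int(u * (DISPLAY_W - 1)), int(v * (DISPLAY_H - 1))

# Example: fingertip to the upper-right of the shoulder, half a metre forward.
print(fingertip_to_display(fingertip=(0.15, 0.10, 1.50), shoulder=(0.0, 0.0, 2.0)))
```

Adapting PLANE_W and PLANE_H to each user's reach would be one way to make such a screen adaptive to user characteristics, as the abstract claims for the paper's construction.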

Table of Contents

Abstract
 1. Introduction
 2. Pointing Gesture Recognition
  2.1. Pointing Hand Segmentation
  2.2. Pointing Fingertip Detection
  2.3. Pointing Fingertip Tracking
  2.4. Pointing Gestures Recognition
 3. Virtual Touch Screen Construction
  3.1. Kinect Coordinate System
  3.2. Virtual Touch Screen
 4. Experimental Results and Analysis
  4.1. Experimental Environment
  4.2. Recognition of Pointing Gestures
  4.3. Target Selection
 5. Conclusions
 Acknowledgements
 References

Author Information

  • Pan Jing, School of Communication and Information Engineering, Shanghai University
  • Guan Ye-peng, School of Communication and Information Engineering, Shanghai University; Key Laboratory of Advanced Displays and System Application, Ministry of Education
