earticle


Visually Gesture Recognition for an Interactive Robot Grasping Application

Abstract (English)

The gesture-based natural human-robot interaction paradigm has few physical requirements and can therefore be deployed in many restrictive and challenging environments. In this paper, we propose a robot-vision-based approach to recognizing intentional human arm-pointing gestures for an object grasping application. To overcome the limitations of the robot's onboard vision quality and background clutter in natural indoor environments, a multi-cue human detection method is proposed. The human body is detected and verified by merging appearance and color features with robust head-shoulder shape matching, reducing the false detection rate. Intentional dynamic arm-pointing gestures are then identified using the Dynamic Time Warping (DTW) technique, while unconscious arm and head motions are rejected. Implementation of a gesture-guided robot grasping task in an indoor environment demonstrates the approach, achieving fast and reliable pointing gesture recognition.
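As a minimal sketch of the DTW matching step the abstract describes, the following compares an observed arm trajectory against a gesture template; the 2-D point sequences and the specific distance measure are illustrative assumptions, not the authors' actual features or parameters.

```python
# Minimal Dynamic Time Warping (DTW) sketch: the cumulative cost of the best
# alignment between two sequences, tolerating speed/timing variation.
# Trajectories below are hypothetical, not the paper's real gesture data.
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Classic DTW: minimal cumulative alignment cost between two sequences."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(np.asarray(seq_a[i - 1]) - np.asarray(seq_b[j - 1]))
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Hypothetical 2-D arm-tip trajectories: same pointing path, different timing.
template = [(0, 0), (1, 1), (2, 2), (3, 3)]
observed = [(0, 0), (1, 1), (1, 1), (2, 2), (3, 3)]
print(dtw_distance(template, observed))  # → 0.0 despite the timing difference
```

In a recognition system, the observed trajectory would be matched against each gesture template and accepted when the smallest DTW distance falls below a chosen threshold, which is how time-warped pointing motions can be distinguished from unconscious movements.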

Contents

Abstract
 1. Introduction
 2. Human Body Detection and Verification
  2.1. Robust Color Detection Using Color Probability Density Map
  2.2. Human Target Verifications
 3. Pointing Posture Recognition
 4. Experiments and Results
 5. Conclusion
 Acknowledgements
 References

Authors

  • Kun Qian School of Automation, Southeast University, No.2 Sipailou, Nanjing, 210096, China
  • Chunhua Hu School of Information Science and Technology, Nanjing Forestry University, No.159 Longpan Road, Nanjing, 210037, China
