
Aerial Object Detection and Tracking based on Fusion of Vision and Lidar Sensors using Kalman Filter for UAV

Abstract (English)

In this paper, we study an aerial object detection and position estimation algorithm for the safety of UAVs flying beyond visual line of sight (BVLOS). We use a vision sensor and a LiDAR to detect objects. A CNN-based YOLOv2 architecture detects objects in the 2D image, and a clustering method detects objects in the point cloud data acquired from the LiDAR. With a single sensor, the detection rate can degrade in specific situations depending on the characteristics of that sensor, and its output may be missing or false. To compensate for the limitations of single-sensor detection, we apply a Kalman filter and fuse the results of the individual sensors to improve detection accuracy. We estimate the 3D position of the object from its pixel position and the distance measured by the LiDAR. We verified the performance of the proposed fusion algorithm through simulations using the Gazebo simulator.
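The abstract describes two steps that lend themselves to a short illustration: recovering a 3D object position from an image pixel location plus a LiDAR range, and fusing the per-sensor detections with a Kalman filter. Below is a minimal Python sketch of that idea, assuming a pinhole camera model with hypothetical intrinsics (FX, FY, CX, CY) and a constant-velocity Kalman filter; the names, noise values, and numeric inputs are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

# Hypothetical pinhole-camera intrinsics (illustrative, not from the paper).
FX, FY = 600.0, 600.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point

def pixel_and_range_to_3d(u, v, r):
    """Back-project pixel (u, v) with LiDAR range r to a 3D point in the
    camera frame, assuming the range is measured along the pixel's ray."""
    ray = np.array([(u - CX) / FX, (v - CY) / FY, 1.0])
    ray /= np.linalg.norm(ray)
    return r * ray

class ConstantVelocityKF:
    """Constant-velocity Kalman filter over the 3D object position.
    Detections from the camera branch and the LiDAR branch are both
    treated as noisy observations of the same position."""
    def __init__(self, dt=0.05):
        self.x = np.zeros(6)                 # [px, py, pz, vx, vy, vz]
        self.P = np.eye(6) * 10.0            # initial uncertainty (assumed)
        self.F = np.eye(6)
        self.F[:3, 3:] = np.eye(3) * dt      # position += velocity * dt
        self.Q = np.eye(6) * 0.01            # process noise (assumed)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, meas_var):
        """Fuse one 3D position measurement; meas_var encodes how much
        the corresponding sensor is trusted."""
        R = np.eye(3) * meas_var
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P

# Example: fuse a vision detection (pixel + range) and a LiDAR cluster centroid.
kf = ConstantVelocityKF(dt=0.05)
kf.predict()
kf.update(pixel_and_range_to_3d(350, 200, 12.0), meas_var=0.5)  # vision branch
kf.update(np.array([11.8, 0.6, -0.9]), meas_var=0.2)            # LiDAR branch
print("fused position:", kf.x[:3])
```

When one sensor misses a detection, the filter simply skips that branch's update for the frame and coasts on the prediction, which is the complementary behavior the abstract attributes to the fusion step.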

Table of Contents

Abstract
1. Introduction
2. Aerial Object Detection and Position Estimation
2.1 Aerial object detection using vision sensor
2.2 Aerial object detection using LiDAR
2.3 Sensor fusion using Kalman filter
2.4 Position estimation for object tracking
3. Simulation Results
4. Conclusions
References

Author Information

  • Cheonman Park Master’s student, Department of Aeronautical Systems Engineering, Hanseo University, Korea
  • Seongbong Lee Undergraduate student, Department of Unmanned Aircraft Systems, Hanseo University, Korea
  • Hyeji Kim Ph.D. student, Department of Aeronautical Systems Engineering, Hanseo University, Korea
  • Dongjin Lee Associate professor, Department of Unmanned Aircraft Systems, Hanseo University, Korea

