An Integrated Navigation Method for UAV Autonomous Landing Based on Inertial and Vision Sensors.

CICAI (2) (2022)

Abstract
In the process of autonomous landing of unmanned aerial vehicles (UAVs), the vision sensor is restricted by its field of view and by UAV maneuvering, which can make the acquired relative position/attitude parameters unstable or even singular (not unique). Moreover, there is a 'blind area' of vision measurement during the rollout stage, in which navigation capability is lost, seriously affecting landing safety. This paper proposes an autonomous landing navigation method based on inertial/vision sensor information fusion. When the UAV is far from the airport and the runway is fully imaged, the landing navigation parameters are determined by the vision sensor from the object-image conjugate relationship of the runway sidelines and fused with inertial information to improve measurement performance. When the UAV is close to the airport and the runway is only partially imaged, the vision measurements become singular, and the landing navigation parameters are instead estimated from inertial information aided by vision. During rollout, the vision sensor enters the 'blind area'; the UAV's motion state is judged from the imaging features of two adjacent frames, and the inertial sensor error is suppressed using motion-state constraints, maintaining the landing navigation parameters with high precision. Flight tests show that the lateral relative position error is less than 10 m when a low-accuracy inertial sensor and a vision sensor are used, which meets the requirement for safe UAV landing.
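The abstract describes switching among three fusion modes depending on how much of the runway the camera observes. A minimal sketch of that mode-switching logic is given below; it is purely illustrative and not the paper's implementation — the phase names, the visible-corner heuristic, and the blending weight `alpha` are all assumptions introduced here for clarity.

```python
# Illustrative sketch (assumed, not the paper's algorithm): a mode-switching
# fusion rule for the three landing phases described in the abstract.
from enum import Enum, auto

class Phase(Enum):
    FULL_RUNWAY = auto()     # runway fully imaged: vision-led fusion
    PARTIAL_RUNWAY = auto()  # runway partially imaged: inertial-led, vision-aided
    ROLLOUT = auto()         # vision 'blind area': inertial under motion constraints

def select_phase(runway_corners_visible: int, on_ground: bool) -> Phase:
    """Choose the fusion mode from simple observability cues (hypothetical heuristic)."""
    if on_ground:
        return Phase.ROLLOUT
    if runway_corners_visible >= 4:
        return Phase.FULL_RUNWAY
    return Phase.PARTIAL_RUNWAY

def fuse(phase: Phase, vision_pos: float, inertial_pos: float,
         alpha: float = 0.7) -> float:
    """Blend lateral-position estimates; the weights are illustrative only."""
    if phase is Phase.FULL_RUNWAY:
        # Vision dominates when the runway sidelines yield a unique solution.
        return alpha * vision_pos + (1 - alpha) * inertial_pos
    if phase is Phase.PARTIAL_RUNWAY:
        # Inertial dominates; vision only damps drift.
        return (1 - alpha) * vision_pos + alpha * inertial_pos
    # ROLLOUT: no usable vision measurement; hold the inertial estimate,
    # which the paper constrains via the detected motion state.
    return inertial_pos
```

In the actual method, the blending would be performed by a filter (e.g. a Kalman filter with phase-dependent measurement models) rather than a fixed weight, but the structure of the phase switch is the same.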
Keywords
UAV autonomous landing, integrated navigation method, vision sensors, inertial