
Visual and Inertial Data-Based Virtual Localization for Urban Combat

Smart Innovation, Systems and Technologies: Developments and Advances in Defense and Security (2020)

Abstract
This work presents a position and orientation estimation system based on computer vision algorithms and inertial data from the inertial measurement unit built into a smartphone. The implemented system estimates position and orientation in real time. An Android application was developed to capture images of the environment and run the vision algorithms. During implementation, the Harris, Shi-Tomasi, FAST, and SIFT feature point detectors were evaluated to find the detector that yields a system efficient enough to run on the processor of an embedded device such as a smartphone. The displacement of the camera attached to the mobile agent is computed with an optical flow method, and gyroscope data from the smartphone are used to estimate the agent's orientation. The system also includes a simulation of the estimated motion within a three-dimensional environment that runs on a computer; position and orientation data are sent from the smartphone to the computer wirelessly over a Wi-Fi connection. The three-dimensional environment is a digital model of the central block of the Universidad de las Fuerzas Armadas ESPE, where the tests of the implemented system were carried out.
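
For a concrete picture of the pipeline the abstract describes, the following is a minimal sketch in Python with OpenCV (not the authors' Android implementation): Shi-Tomasi feature detection combined with pyramidal Lucas-Kanade optical flow to estimate per-frame image displacement, plus simple gyroscope-rate integration for heading. All function names and parameter values here are illustrative assumptions rather than code from the paper.

# Minimal sketch, assuming OpenCV is available; not the authors' code.
import cv2
import numpy as np

def estimate_displacement(prev_gray, curr_gray):
    """Return the mean 2D pixel displacement between two grayscale frames."""
    # Shi-Tomasi corner detector (one of the detectors compared in the paper).
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=7)
    if pts_prev is None:
        return np.zeros(2)
    # Pyramidal Lucas-Kanade optical flow tracks the corners into the next frame.
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   pts_prev, None)
    good = status.ravel() == 1
    if not good.any():
        return np.zeros(2)
    flow = (pts_curr[good] - pts_prev[good]).reshape(-1, 2)
    # Average optical-flow vector (dx, dy) in pixels.
    return flow.mean(axis=0)

def integrate_yaw(yaw_prev, gyro_rate_z, dt):
    """Integrate the gyroscope z-axis angular rate (rad/s) to update heading."""
    return yaw_prev + gyro_rate_z * dt

In a complete system the average flow vector would be converted from pixels to metric units using the camera calibration, rotated by the estimated heading, and accumulated into a 2D position that is then streamed to the computer over a Wi-Fi socket, as the abstract indicates; the pixel-to-metre scale factor is not specified here and depends on the actual setup.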
Key words
virtual localization, combat, urban, data-based