Formulation of micro-rover autonomy software for lunar exploration

Varsha Kumar, Shyam S. Sai, Srinivas Vijayarangan, David Wettergreen, Heather Jones, Patrick Callaghan, Haidar Jamal, William L. Whittaker

Semantic Scholar (2020)

Abstract
Micro-rovers offer the advantages of low mass, low cost, and frequent flight opportunities. Because of their low mass, current micro-rovers cannot be isotope-heated. They therefore cannot survive the extended planetary night and must achieve their exploration goals in a single daylight period. Their small size, mass, and power preclude a radio for direct communication with Earth. As a result, they can receive and relay data only while in proximity to their lander, and they cannot be constantly supervised or teleoperated from Earth like larger rovers with greater power and communication capability. To explore beyond lander communication range, micro-rovers must operate autonomously. Micro-rover autonomy software must achieve communication-denied, high-cadence, kilometer-scale exploration treks. This paper formulates a software architecture and component-wise design for achieving the required autonomous micro-rover exploration.

This technology will be integral to the MoonRanger micro-rover, which will fly to the lunar pole in December 2022 as a Lunar Surface Instrument and Technology Payload (LSITP) aboard the Masten XL1 lander. MoonRanger will conduct long treks from and to the lander to explore for lunar polar ice. The software will incorporate perception, planning, navigation, and execution, and will log data. Upon return to its lander, the rover will transfer the data, images, and scientific information that result from mission-relevant autonomy. It will do so at a leap of performance beyond that achieved in prior planetary roving, but within the power, sensing, and size constraints of micro-roving.

MoonRanger hosts a two-computer system consisting of a space-hardened embedded processor and a higher-performance, less-hardened computer. Autonomy software and image processing run on a Linux-based OS on the higher-performance computer, while motor control, sensor data collection, and low-level functionality run on a real-time OS aboard the embedded processor. A design prototype of the higher-performance computer's software is depicted in Figure 1. As shown in the figure, this software is organized into two categories: the navigation pipeline and the execution nodes. The navigation pipeline performs perception, rover pose estimation, and planning, while the execution nodes handle executive control, data management and transfer, health monitoring, and telemetry management.

Figure 1: Software Architecture. The navigation pipeline is depicted in violet. Execution nodes are depicted in blue. The Global Planner will run on ground software. The lander, cameras, embedded processor, and disk are external to the design.
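To make the two-category organization concrete, the following is a minimal, illustrative sketch of a navigation pipeline (perception, pose estimation, planning) feeding a set of execution-side functions (health monitoring, data logging). All class, method, and variable names here are hypothetical; the abstract describes the MoonRanger software only at the architectural level, and a real implementation would run these as separate nodes on the Linux-based computer rather than a single loop.

    # Illustrative sketch only: hypothetical names, not MoonRanger flight code.
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class Pose:
        x: float = 0.0
        y: float = 0.0
        heading: float = 0.0

    @dataclass
    class NavigationPipeline:
        """Perception, rover pose estimation, and planning in sequence."""
        pose: Pose = field(default_factory=Pose)

        def perceive(self) -> List[Tuple[float, float]]:
            # Placeholder for terrain/obstacle processing from camera data.
            return []

        def estimate_pose(self) -> Pose:
            # Placeholder for pose estimation (e.g., odometry fusion).
            return self.pose

        def plan(self, goal: Tuple[float, float]) -> List[Tuple[float, float]]:
            # Placeholder for a local planner producing a short path to goal.
            return [(self.pose.x, self.pose.y), goal]

    @dataclass
    class ExecutionNodes:
        """Executive control, data logging, health monitoring, telemetry."""
        log: List[str] = field(default_factory=list)

        def check_health(self) -> bool:
            # Placeholder for battery/thermal/fault checks.
            return True

        def record(self, message: str) -> None:
            # Data is logged locally and transferred on return to the lander.
            self.log.append(message)

    def autonomy_step(nav: NavigationPipeline, exe: ExecutionNodes,
                      goal: Tuple[float, float]) -> None:
        """One cycle of the communication-denied autonomy loop."""
        if not exe.check_health():
            exe.record("fault detected; halting trek")
            return
        obstacles = nav.perceive()
        pose = nav.estimate_pose()
        path = nav.plan(goal)
        exe.record(f"pose={pose} obstacles={len(obstacles)} waypoints={len(path)}")

    if __name__ == "__main__":
        nav, exe = NavigationPipeline(), ExecutionNodes()
        autonomy_step(nav, exe, goal=(10.0, 0.0))
        print("\n".join(exe.log))

The single-loop structure above is only to show how perception, pose estimation, and planning feed the executive; in the architecture described in the paper, these responsibilities are split between the higher-performance Linux computer and the real-time embedded processor.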