ESA GNC Conference Papers Repository
Title:
Towards Validation and Verification of Autonomous Vision-Based Navigation for Interplanetary Spacecraft
Abstract:
The number of spacecraft launched per year has increased dramatically over the last decades, granting access to space to both private companies and public actors. Space assets have become crucial for disaster monitoring, precision agriculture, and global network interconnection. This trend is not limited to Earth-observation applications but extends beyond Earth's orbit to support space exploration and exploitation. The current paradigm for operating interplanetary spacecraft relies heavily on the Deep-Space Network (DSN), which communicates with the spacecraft to obtain range and range-rate measurements. These data are then processed by large teams of engineers on the ground to solve the orbit determination problem and compute the required maneuvers. Although this process is extremely precise and has been used since the beginning of space exploration, the increasing number of spacecraft and the riskier operations needed to support compelling science are making it outdated. First, the DSN has a limited number of communication slots, which limits the number of spacecraft that can be operated. Second, the process is extremely costly, as large teams of individuals are involved. Finally, the communication delay between the spacecraft and the ground station makes some operations, such as landing or sampling, infeasible, as the spacecraft lacks the needed reactivity. For these reasons, autonomous navigation is becoming a crucial technology for present and future missions. Among all navigation sensors, cameras are generally preferred because they are lightweight, compact, and inexpensive. For this reason, Vision-Based Navigation (VBN), i.e., the combination of a camera and image processing (IP) algorithms, is generally employed as an autonomous solution to the navigation problem. When a spacecraft is on an interplanetary cruise, it can determine its position by observing planets whose positions within the Solar System are known.
When planet line-of-sight (LoS) measurements are available, the spacecraft can triangulate its position in the inertial reference frame by exploiting the planet ephemerides. This can be performed statically [1, 2], when more than one planet is observable, or dynamically, by providing the LoS measurement history to a navigation filter [3, 4]. The planet LoS can be determined by extracting the planet location from images, performing attitude determination, and using the planet ephemerides [5, 6]. This is a fully autonomous solution, as the spacecraft does not require any information from the ground. The proposed solution is thus composed of an IP pipeline, which autonomously determines the spacecraft attitude and extracts the planet LoSes, and a navigation filter, which estimates the spacecraft state while taking light aberrations into account [7]. An important step is the algorithm validation process, which is generally performed by increasing the complexity of the simulation framework and by including hardware-in-the-loop (HIL) components. Andreis et al. [4] develop and analyze the navigation filtering strategy to be deployed on board, assuming a behavioral model of the IP, while Andreis et al. [6] and Andreis et al. [5] develop the IP pipeline and test it on synthetic images from a custom-designed rendering engine [8]. Andreis et al. [7] further develop the VBN algorithm by proposing an integrated solution to compensate for light aberrations. Finally, Panicucci et al. [9] assess the IP performance on images acquired with RETINA, a HIL optical navigation test bench in which a high-resolution screen stimulates a camera to acquire images as they would be taken in orbit. Building on this previous work, this paper presents the validation of the VBN algorithm in a HIL simulation. First, a series of images is acquired on RETINA by simulating the reference trajectory and attitude profile of the spacecraft.
These images are then processed sequentially by the VBN algorithm, and the spacecraft state estimates are compared against the true values to assess the navigation accuracy.

Acknowledgments:
This research is part of EXTREMA, a project that has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (Grant Agreement No. 864697).

References:
[1] V. Franzese and F. Topputo. Optimal Beacons Selection for Deep-Space Optical Navigation. The Journal of the Astronautical Sciences, 67(4):1775-1792, 2020. doi: 10.1007/s40295-020-00242-z.
[2] S. B. Broschart, N. Bradley, and S. Bhaskaran. Kinematic Approximation of Position Accuracy Achieved Using Optical Observations of Distant Asteroids. Journal of Spacecraft and Rockets, 56(5):1383-1392, 2019. doi: 10.2514/1.A34354.
[3] R. R. Karimi and D. Mortari. Interplanetary Autonomous Navigation Using Visible Planets. Journal of Guidance, Control, and Dynamics, 38(6):1151-1156, 2015. doi: 10.2514/1.G000575.
[4] E. Andreis, V. Franzese, and F. Topputo. Onboard Orbit Determination for Deep-Space CubeSats. Journal of Guidance, Control, and Dynamics, pages 1-14, 2022. doi: 10.2514/1.G006294.
[5] E. Andreis, P. Panicucci, and F. Topputo. An Image Processing Pipeline for Autonomous Deep-Space Optical Navigation. Journal of Spacecraft and Rockets, under review.
[6] E. Andreis, P. Panicucci, V. Franzese, and F. Topputo. A Robust Image Processing Pipeline for Planets Line-of-Sight Extraction for Deep-Space Autonomous CubeSats Navigation. In 44th AAS Guidance, Navigation and Control Conference, pages 1-19, 2022.
[7] E. Andreis, P. Panicucci, V. Franzese, and F. Topputo. A Vision-Based Navigation Algorithm for Autonomous Deep-Space Cruise. In 3rd Space Imaging Workshop, 2022.
[8] S. Bella, E. Andreis, V. Franzese, P. Panicucci, and F. Topputo. Line-of-Sight Extraction Algorithm for Deep-Space Autonomous Navigation. In 2021 AAS/AIAA Astrodynamics Specialist Conference, pages 1-18, 2021.
[9] P. Panicucci, E. Andreis, V. Franzese, and F. Topputo. An Overview of the EXTREMA Deep-Space Optical Navigation Experiment. In 3rd Space Imaging Workshop, 2022.
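As an illustration of the static triangulation concept mentioned in the abstract: each planet LoS constrains the spacecraft to lie on the line through that planet's (known) ephemeris position along the measured direction, so the position can be recovered as the least-squares intersection of the lines. A minimal sketch follows; function and variable names are illustrative rather than taken from the paper, and measurement noise and light aberrations are ignored.

```python
import numpy as np

def triangulate_position(planet_positions, los_directions):
    """Least-squares spacecraft position from planet lines of sight.

    planet_positions: (N, 3) inertial planet positions from the ephemerides.
    los_directions:   (N, 3) unit vectors from the spacecraft toward each planet.

    Each LoS gives the constraint (I - u u^T)(p - r) = 0, i.e. the
    planet-to-spacecraft vector is parallel to the measured direction.
    Stacking the constraints yields the normal equations (sum P_i) r = sum P_i p_i.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, u in zip(planet_positions, los_directions):
        u = u / np.linalg.norm(u)
        P = np.eye(3) - np.outer(u, u)  # projector orthogonal to the LoS
        A += P
        b += P @ p
    return np.linalg.solve(A, b)
```

With at least two non-parallel lines of sight the 3x3 normal matrix is invertible and the static fix is unique; the dynamic approach cited in the abstract instead feeds the LoS history to a navigation filter, which also handles the single-planet case.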