Synthesis and Validation of Vision Based Spacecraft Navigation

Alessandro Salvatore Massaro

    Research output: Book/Report › Ph.D. thesis

    Abstract

    This dissertation targets spacecraft navigation by means of vision based sensors. The goal is to achieve autonomous, robust and efficient navigation through a multidisciplinary research and development effort, covering the fields of computer vision, electronics, optics and mechanics.
    The attention of space organizations worldwide, both public and private, is once again directed at our natural satellite. The Moon offers a rich reservoir of resources exposed on its surface, a prime example being Helium-3. Furthermore, its distance from Earth's electromagnetic interference and its lack of atmosphere make it a naturally optimal location for scientific observation of Earth and outer space. Finally, it is an ideal location for establishing outposts for deeper Solar System exploration. Despite the successful endeavours of the past century, direct or remote manned operation of vehicles on the Moon's surface is still prohibitively expensive and not ideal for missions such as cargo delivery. The first part of this book focuses on a lunar landing scenario as a case study and discusses software and hardware components for an optimal vision based sensor for precision planetary landing. Computer vision techniques are applied to the problems of horizontal velocity estimation and hazard detection. Experimental implementations are presented, and the results show their potential for integration on a space qualified processing unit. The study concludes with recommendations for key physical parameters of the camera system.
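    The abstract does not reproduce any of the thesis source code; purely as an illustrative sketch of the kind of technique involved, the Python fragment below estimates horizontal velocity from sparse optical flow between two consecutive descent images, assuming a nadir-pointing camera with known altitude, focal length and frame interval. All function names and parameter values here are hypothetical and do not come from the dissertation.

        # Illustrative sketch only, not the dissertation's implementation.
        # Assumes a nadir-pointing camera with known altitude h (m),
        # focal length f (px) and frame interval dt (s) -- all assumed values.
        import numpy as np
        import cv2

        def horizontal_velocity(prev_gray, curr_gray, h=100.0, f=1000.0, dt=0.1):
            # Select trackable features in the previous frame.
            pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                           qualityLevel=0.01, minDistance=10)
            if pts0 is None:
                return None
            # Track the features into the current frame (pyramidal Lucas-Kanade).
            pts1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts0, None)
            good0 = pts0[status.flatten() == 1].reshape(-1, 2)
            good1 = pts1[status.flatten() == 1].reshape(-1, 2)
            if len(good0) == 0:
                return None
            # Median image-plane displacement in pixels, robust to outliers.
            dp = np.median(good1 - good0, axis=0)
            # Pinhole model: ground displacement = (h / f) * pixel displacement.
            return (h / f) * dp / dt  # (vx, vy) in m/s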
    In connection with the PRISMA experimental mission for rendezvous and docking and formation flight, DTU Space has implemented, flown and validated the Vision Based Sensor (VBS). This sensor has required the development of novel techniques for calibration of the target optical model and custom hardware verification tools, both described in this book. One such tool, developed personally by the author, is Pharos, an electro-opto-mechanical stimulator that physically interfaces with the camera to simulate the conditions of far range rendezvous in space. Pharos is now also being used by the Department to verify algorithms for asteroid detection installed on the Juno spacecraft on its way to Jupiter.
    Another important outcome of the R&D effort of this project has been the integration of a calibration and validation facility for the vision based sensors developed at DTU Space. The author's work has covered all phases from concept to design and construction of the laboratory, which is equipped with precise manipulators and a controlled lighting setup in order to simulate the kinematics and optical conditions under which the sensors will operate. Testing of sensors and algorithms for the upcoming ESA PROBA-3 mission is currently under way. The laboratory also includes a physical analog terrain for verification of planetary landing algorithms.
    The general methods of autonomous navigation investigated and described in this book have also been applied to two external projects. A research stay at the NASA Ames Research Center's Intelligent Robotics Group (IRG) resulted in the successful implementation of an infrastructure-free global localization algorithm for surface robotic navigation. The algorithm is now integrated with other rover navigation routines developed by IRG. Finally, collaboration with DTU Automation culminated in the development of a novel terrain mapping and obstacle detection technique based on Gaussian processes. These results were published in a peer-reviewed conference paper at the 2011 IEEE International Conference on Machine Learning and Applications.
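    The published formulation is not reproduced in this abstract; as a minimal sketch of Gaussian process terrain mapping under assumed parameters, the Python fragment below fits a GP to scattered elevation samples, predicts a dense height grid, and flags cells whose local slope exceeds a traversability threshold. The kernel and the 0.3 slope limit are illustrative choices, not values from the paper.

        # Minimal sketch, not the published method: GP regression over
        # scattered (x, y) -> z elevation samples, then slope-based
        # obstacle flagging on the predicted grid. All values are assumed.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)
        xy = rng.uniform(0.0, 10.0, size=(200, 2))               # sample locations (m)
        z = np.sin(xy[:, 0]) + 0.05 * rng.standard_normal(200)   # noisy heights (m)

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)
                                      + WhiteKernel(noise_level=0.01))
        gp.fit(xy, z)

        # Predict elevation on a regular grid covering the sampled area.
        gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
        grid = np.column_stack([gx.ravel(), gy.ravel()])
        zhat = gp.predict(grid).reshape(gx.shape)

        # Flag cells whose terrain gradient magnitude exceeds the limit.
        dzdy, dzdx = np.gradient(zhat, gy[:, 0], gx[0, :])
        obstacle = np.hypot(dzdx, dzdy) > 0.3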
    Original language: English
    Place of publication: Kgs. Lyngby
    Publisher: Technical University of Denmark
    Number of pages: 177
    Publication status: Published - 2013
