Multi-sensor Data Fusion for Spacecraft Navigation

Lukas Alexander Mads Christensen

Research output: Book/Report › Ph.D. thesis › Research



This project investigates applications of sensor fusion for spacecraft navigation in non-cooperative scenarios. The general focus is on vision-based techniques. Specifically, four distinct scenarios are considered and analyzed.

The masses of asteroids or other small bodies in the solar system may be estimated by measuring the disturbances their gravity causes on the trajectory of a nearby spacecraft performing a flyby. However, for low-mass bodies this requires the spacecraft to be in close proximity to the target, which can put the spacecraft in danger. A method that allows for accurate mass estimates without endangering the flyby spacecraft is therefore investigated. The specific approach involves ejecting a number of probes prior to the encounter and tracking these from the host spacecraft as they pass by the target. Visual observations are combined with radiometric tracking to simultaneously estimate the trajectories of the target, the spacecraft, and the probes, as well as the mass of the target. It is found that the mass can be extracted with better accuracy than is achievable using conventional methods, with the best precision being obtained when the probes are fitted with internal radio beacons.
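
The underlying physics can be illustrated with a minimal sketch (not the thesis's full joint estimator): for a hyperbolic flyby, the deflection angle θ of a probe's trajectory satisfies tan(θ/2) = GM / (b·v²), where b is the impact parameter and v the flyby speed, so a measured deflection yields the target mass directly. All numbers below are illustrative assumptions.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def mass_from_deflection(theta_rad: float, b_m: float, v_ms: float) -> float:
    """Mass (kg) implied by a measured hyperbolic-flyby deflection angle."""
    return b_m * v_ms**2 * math.tan(theta_rad / 2.0) / G

# Example: a probe passing 50 km from a ~1e15 kg body at 1 km/s.
M_true = 1e15
b, v = 50e3, 1e3
theta = 2.0 * math.atan(G * M_true / (b * v**2))  # forward model
M_est = mass_from_deflection(theta, b, v)          # recovers ~1e15 kg
```

In practice the deflection is not observed directly; it is inferred from the fused visual and radiometric tracking data, which is where the estimation problem becomes non-trivial.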

When a spacecraft is operating in close proximity to a celestial body, its position may be determined using a combination of Earth-based tracking and observations of the target. In the unfortunate event that the spacecraft suddenly loses communication with the ground, it will need a fast and reliable way of keeping track of its location until normal operation can be resumed. As a potential solution, a method for spacecraft positioning based on observations of the horizon and terminator of the target object is investigated. By fitting observations to predictions based on various surface models and combining them with attitude information from a star tracker, it is found that fairly accurate positioning is achievable, especially when multiple cameras are employed. More importantly, the method is fast and reliable.
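
The core geometric idea can be sketched under a simple spherical-body assumption (the thesis fits richer surface models): the apparent angular radius ρ of the limb fixes the camera-to-centre distance as d = R / sin(ρ), and the star-tracker attitude turns that range along the observed limb direction into a position relative to the body.

```python
import math

def range_from_limb(body_radius_m: float, angular_radius_rad: float) -> float:
    """Distance from the body's centre implied by the limb's angular radius,
    assuming a spherical body of known radius."""
    return body_radius_m / math.sin(angular_radius_rad)

# Example: the Moon (R = 1737.4 km) subtending a 30-degree angular radius
# gives d = R / sin(30 deg) = 2 * R, i.e. about 3474.8 km from the centre.
d = range_from_limb(1737.4e3, math.radians(30.0))
```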

To perform pinpoint landings on the surface of the Moon or other bodies in the solar system, accurate autonomous positioning is needed. One way to achieve this is by using visual tracking of surface features, potentially combined with data from other types of sensors. For this reason, an efficient method for crater detection and tracking is developed. After a crater has been detected, data from an inertial measurement unit is used to predict its shape and location in subsequent images. This is then used as the initial estimate for a refinement process that fits the crater features to actual image data. The approach removes the need for explicit feature matching and reduces the computational load associated with detecting craters in each image. The resulting algorithm is therefore suited for real-time terrain-relative navigation using space-grade processors.
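
The prediction step can be sketched as follows, assuming a simple pinhole camera model (the names and values are illustrative, not the thesis's implementation): a detected crater's 3D position is reprojected using the IMU-propagated camera pose, giving an initial pixel estimate that seeds the local refinement instead of a full re-detection.

```python
import numpy as np

def project(K: np.ndarray, R: np.ndarray, t: np.ndarray, X: np.ndarray) -> np.ndarray:
    """Pinhole projection of a world point X into pixel coordinates."""
    x_cam = R @ X + t       # world frame -> camera frame
    uvw = K @ x_cam         # camera frame -> homogeneous pixels
    return uvw[:2] / uvw[2]

K = np.array([[1000.0, 0.0, 512.0],   # illustrative camera intrinsics
              [0.0, 1000.0, 512.0],
              [0.0, 0.0, 1.0]])
X_crater = np.array([0.0, 0.0, 100.0])  # crater centre, 100 m ahead

# Pose at detection time, and the IMU-predicted pose a moment later
R0, t0 = np.eye(3), np.zeros(3)
R1, t1 = np.eye(3), np.array([1.0, 0.0, 0.0])  # IMU: moved 1 m along x

uv_detected = project(K, R0, t0, X_crater)   # (512, 512): initial detection
uv_predicted = project(K, R1, t1, X_crater)  # (522, 512): seed for refinement
```

The predicted pixel (and, in the full method, the predicted ellipse shape) bounds the search region, which is what keeps the per-frame cost low enough for space-grade processors.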

By tracking features on the surface of a target, a spacecraft may determine its relative change in pose from observation to observation. However, if the distance to the target is undetermined, the position estimates will have an unknown scale factor. Therefore, a conceptual rangefinder based on a CCD camera and a steerable laser is considered. By using a specific timing sequence, several exposures are stacked in the transfer register of the CCD before being read out. This allows for high signal-to-noise ratio observations of laser pulses without the need for external shuttering. To demonstrate the concept, a prototype is constructed which is capable of using both triangulation and time-of-flight measurements to perform ranging. The performance and noise characteristics of the device are then evaluated in order to determine the feasibility of the approach, with the results indicating that good precision can be achieved over a wide range of distances.
Original language: English
Place of Publication: Kgs. Lyngby
Publisher: Technical University of Denmark
Number of pages: 158
Publication status: Published - 2020


