Differential Geometric Approaches to Machine Learning

Alison Marie Sandrine Pouplin

Research output: Book/Report › Ph.D. thesis


Abstract

Differential geometry is the branch of mathematics that studies manifolds and their properties using tools from calculus and algebra. In machine learning, many high-dimensional spaces, including the data space and the parameter space, can be effectively treated as manifolds. Gaining insight into the structure of these manifolds is crucial for addressing numerous machine learning challenges. This thesis demonstrates how differential geometry can be applied to tackle machine learning problems. It is divided into two parts: the first provides a general introduction to three subfields of differential geometry, and the second presents the research conducted during the thesis. The first chapter delves into Riemannian geometry, offering an intuitive understanding of Riemannian objects and notions of curvature. This chapter lays the foundation for a further investigation of the curvature of loss landscapes in the context of neural network generalization. The second chapter sheds light on Finsler geometry, drawing comparisons with Riemannian geometry; Finsler geometry proves valuable for comparing expected lengths of curves on a stochastic manifold. The third chapter introduces the Fisher-Rao metric from information geometry, which is employed to explore the latent space of Variational Auto-Encoders that decode to various distributions.
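
As a concrete illustration of the latent-space geometry mentioned in the abstract (a minimal sketch of the standard information-geometric construction, not a formula quoted from the thesis): if a decoder maps a latent code z to a distribution p(x | θ(z)) with parameters θ(z), the Fisher-Rao metric on the space of distributions can be pulled back to the latent space via the decoder Jacobian, giving

\[
G(z) \;=\; J_\theta(z)^\top \,\mathcal{I}\big(\theta(z)\big)\, J_\theta(z),
\qquad
\mathcal{I}(\theta)_{ij} \;=\; \mathbb{E}_{x \sim p(x\mid\theta)}\!\left[
\frac{\partial \log p(x\mid\theta)}{\partial \theta_i}\,
\frac{\partial \log p(x\mid\theta)}{\partial \theta_j}
\right],
\]

where the symbols G, J_\theta, and \mathcal{I} (the Fisher information matrix) are introduced here only for illustration. Curve lengths and geodesics in the latent space are then measured with respect to G(z) rather than the Euclidean metric.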
Original language: English
Publisher: Technical University of Denmark
Number of pages: 156
Publication status: Published - 2023

