Probabilistic Spatial Transformer Networks

Pola Elisabeth Schwöbel, Frederik Rahbæk Warburg, Martin Jørgensen, Kristoffer Hougaard Madsen, Søren Hauberg

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

Spatial Transformer Networks (STNs) estimate image transformations that can improve downstream tasks by ‘zooming in’ on relevant regions in an image. However, STNs are hard to train and sensitive to mis-predictions of transformations. To circumvent these limitations, we propose a probabilistic extension that estimates a stochastic transformation rather than a deterministic one. Marginalizing transformations allows us to consider each image at multiple poses, which makes the localization task easier and the training more robust. As an additional benefit, the stochastic transformations act as a localized, learned data augmentation that improves the downstream tasks. We show across standard imaging benchmarks and on a challenging real-world dataset that these two properties lead to improved classification performance, robustness and model calibration. We further demonstrate that the approach generalizes to non-visual domains by improving model performance on time-series data.
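
The abstract describes the method in prose; the following is a minimal sketch of the core idea, assuming a Gaussian distribution over the six affine transformation parameters and simple Monte Carlo averaging of class probabilities. The layer sizes, the single-layer classifier, and the 28×28 single-channel input are placeholder assumptions for illustration, not the authors' architecture.

```python
# Sketch of a probabilistic STN: the localization network predicts a
# distribution over affine parameters rather than a point estimate, and
# predictions are marginalized over sampled transformations.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProbabilisticSTN(nn.Module):
    def __init__(self, num_classes=10, num_samples=8):
        super().__init__()
        self.num_samples = num_samples
        # Placeholder localization network (architecture is an assumption).
        self.loc = nn.Sequential(
            nn.Conv2d(1, 8, 5), nn.ReLU(), nn.AdaptiveAvgPool2d(4), nn.Flatten()
        )
        self.mu = nn.Linear(8 * 4 * 4, 6)         # mean of theta
        self.log_sigma = nn.Linear(8 * 4 * 4, 6)  # log std of theta
        # Initialize the mean head at the identity transform and start
        # with low transformation noise.
        self.mu.weight.data.zero_()
        self.mu.bias.data.copy_(torch.tensor([1., 0., 0., 0., 1., 0.]))
        self.log_sigma.bias.data.fill_(-3.0)
        # Placeholder classifier for 28x28 single-channel inputs.
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(28 * 28, num_classes)
        )

    def forward(self, x):
        h = self.loc(x)
        mu, sigma = self.mu(h), self.log_sigma(h).exp()
        probs = 0.0
        for _ in range(self.num_samples):
            # Reparameterized sample of the affine parameters.
            theta = (mu + sigma * torch.randn_like(sigma)).view(-1, 2, 3)
            grid = F.affine_grid(theta, x.size(), align_corners=False)
            x_t = F.grid_sample(x, grid, align_corners=False)
            probs = probs + F.softmax(self.classifier(x_t), dim=-1)
        # Monte Carlo marginalization over transformations: each image is
        # considered at multiple sampled poses.
        return probs / self.num_samples
```

A forward pass, e.g. `ProbabilisticSTN()(torch.randn(2, 1, 28, 28))`, returns class probabilities averaged over sampled poses; increasing `num_samples` trades compute for a lower-variance estimate of the marginal prediction.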
Original language: English
Title of host publication: Proceedings of the 38th Conference on Uncertainty in Artificial Intelligence
Publication date: 2022
Pages: 1749-1759
Publication status: Published - 2022
Event: 38th Conference on Uncertainty in Artificial Intelligence, Eindhoven University of Technology, Eindhoven, Netherlands
Duration: 1 Aug 2022 – 5 Aug 2022

Conference

Conference: 38th Conference on Uncertainty in Artificial Intelligence
Location: Eindhoven University of Technology
Country/Territory: Netherlands
City: Eindhoven
Period: 01/08/2022 – 05/08/2022
Series: Proceedings of Machine Learning Research
Volume: 180
ISSN: 2640-3498

