Hierarchical Few-Shot Generative Models

Giorgio Giannone, Ole Winther

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

A few-shot generative model should be able to generate data from a distribution after observing only a limited set of examples. In few-shot learning, the model is trained on data from many sets drawn from different distributions that share some underlying properties, such as sets of characters from different alphabets or sets of images of different types of objects. We study a latent variable approach that extends the Neural Statistician [8] to a fully hierarchical model with attention-based point-to-set-level aggregation. We extend the previous work with iterative data sampling, likelihood-based model comparison, and adaptation-free out-of-distribution generalization. Our results show that the hierarchical formulation better captures the intrinsic variability within the sets in the small-data regime. With this work we generalize deep latent variable approaches to few-shot learning, taking a step towards large-scale few-shot generation with a formulation that can readily work with current state-of-the-art deep generative models.
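The attention-based point-to-set aggregation mentioned above can be illustrated with a minimal sketch: dot-product attention with a learnable set-level query pools a variable-size set of point embeddings into one permutation-invariant summary. All names, shapes, and the single-query design here are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_set_aggregation(points, query, w_k, w_v):
    """Pool a set of point embeddings into a set-level vector (hypothetical sketch).

    points : (n, d)   point-level features, n may vary per set
    query  : (d_k,)   learnable set-level query
    w_k    : (d, d_k) key projection
    w_v    : (d, d_v) value projection
    """
    keys = points @ w_k                               # (n, d_k)
    values = points @ w_v                             # (n, d_v)
    scores = keys @ query / np.sqrt(query.shape[0])   # (n,) scaled dot products
    weights = softmax(scores)                         # attention over set elements
    return weights @ values                           # (d_v,) order-invariant summary

# Usage: the output is identical under any permutation of the set.
rng = np.random.default_rng(0)
pts = rng.normal(size=(5, 8))
q = rng.normal(size=(4,))
wk = rng.normal(size=(8, 4))
wv = rng.normal(size=(8, 6))
summary = attention_set_aggregation(pts, q, wk, wv)
```

Because the softmax weights depend only on each element's own key, reordering the set reorders the weights identically, which is the property a set-level aggregator needs.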
Original language: English
Title of host publication: Proceedings of 5th Workshop on Meta-Learning at NeurIPS 2021
Number of pages: 24
Publication date: 2022
Publication status: Published - 2022
Event: 5th Workshop on Meta-Learning at NeurIPS 2021 - Virtual Event
Duration: 13 Dec 2021 - 13 Dec 2021
Conference number: 5

