Abstract
A few-shot generative model should be able to generate data from a distribution after observing only a limited set of examples. In few-shot learning, the model is trained on data from many sets drawn from different distributions that share underlying properties, such as sets of characters from different alphabets or sets of images of objects of different types. We study a latent-variable approach that extends the Neural Statistician [8] to a fully hierarchical model with attention-based point-to-set-level aggregation. We extend the previous work with iterative data sampling, likelihood-based model comparison, and adaptation-free out-of-distribution generalization. Our results show that the hierarchical formulation better captures the intrinsic variability within sets in the small-data regime. With this work we generalize deep latent-variable approaches to few-shot learning, taking a step towards large-scale few-shot generation with a formulation that can readily work with current state-of-the-art deep generative models.
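A minimal sketch of the attention-based point-to-set aggregation mentioned above, assuming a PyTorch implementation: per-point encodings of a set are pooled with a learnable query via multi-head attention, and the pooled vector parameterizes a set-level latent. The module and names (AttentiveSetAggregator, feat_dim, latent_dim) are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class AttentiveSetAggregator(nn.Module):
    """Hypothetical sketch: pools a set of per-point features into a
    single set-level latent via attention with a learnable query."""

    def __init__(self, feat_dim: int, latent_dim: int, num_heads: int = 4):
        super().__init__()
        self.query = nn.Parameter(torch.randn(1, 1, feat_dim))
        self.attn = nn.MultiheadAttention(feat_dim, num_heads, batch_first=True)
        self.to_mu = nn.Linear(feat_dim, latent_dim)
        self.to_logvar = nn.Linear(feat_dim, latent_dim)

    def forward(self, h):
        # h: (batch, set_size, feat_dim) -- per-point encodings of each set
        q = self.query.expand(h.size(0), -1, -1)
        pooled, _ = self.attn(q, h, h)   # attend over the set: (batch, 1, feat_dim)
        pooled = pooled.squeeze(1)
        mu, logvar = self.to_mu(pooled), self.to_logvar(pooled)
        # Reparameterization trick: sample the set-level latent c
        c = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        return c, mu, logvar

# Usage: aggregate a batch of 2 sets, each with 5 points of 64-dim features
agg = AttentiveSetAggregator(feat_dim=64, latent_dim=32)
c, mu, logvar = agg(torch.randn(2, 5, 64))
print(c.shape)  # torch.Size([2, 32])
```

Because attention pools across the set dimension, the aggregation is permutation-invariant in the set elements, a property any point-to-set aggregator of this kind needs.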
Original language | English |
---|---|
Title of host publication | Proceedings of 5th Workshop on Meta-Learning at NeurIPS 2021 |
Number of pages | 24 |
Publication date | 2022 |
Publication status | Published - 2022 |
Event | 5th Workshop on Meta-Learning at NeurIPS 2021, Virtual Event, 13 Dec 2021 (Conference number 5) |
Workshop
Workshop | 5th Workshop on Meta-Learning at NeurIPS 2021 |
---|---|
Number | 5 |
Location | Virtual Event |
Period | 13/12/2021 → 13/12/2021 |