Bayesian Averaging is Well-Temperated

Lars Kai Hansen; S. Solla et al. (Editors)

    Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


    Abstract

    Bayesian predictions are stochastic, just like the predictions of any other inference scheme that generalizes from a finite sample. While a simple variational argument shows that Bayes averaging is generalization optimal when the prior matches the teacher parameter distribution, the situation is less clear if the teacher distribution is unknown. I define a class of averaging procedures, the temperated likelihoods, which includes both Bayes averaging with a uniform prior and maximum likelihood estimation as special cases. I show that Bayes is generalization optimal in this family for any teacher distribution on two analytically tractable learning problems: learning the mean of a Gaussian, and the asymptotics of smooth learners.
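
    As an illustration, here is a minimal sketch in Python (my own, not code from the paper) of the temperated-likelihood family on the first tractable problem above, learning the mean of a Gaussian, assuming a uniform prior and a known noise level sigma. Under these assumptions the tempered posterior p_T(theta | D) ∝ p(D | theta)^(1/T) is N(xbar, sigma^2 T / n), so T = 1 is Bayes averaging with a uniform prior and T → 0 is maximum likelihood; averaging the Gaussian likelihood over this posterior gives the predictive N(xbar, sigma^2 (1 + T/n)).

    import numpy as np

    def tempered_gen_error(T, n, sigma=1.0):
        # Expected negative log predictive density at temperature T.
        # The tempered predictive is N(xbar, sigma^2 (1 + T/n)); since
        # E[(x_test - xbar)^2] = sigma^2 (1 + 1/n) for any teacher mean,
        # the resulting curve is teacher-independent.
        pred_var = sigma**2 * (1.0 + T / n)    # tempered predictive variance
        test_var = sigma**2 * (1.0 + 1.0 / n)  # spread of x_test around xbar
        return 0.5 * np.log(2.0 * np.pi * pred_var) + test_var / (2.0 * pred_var)

    n = 10
    grid = np.linspace(0.0, 4.0, 401)
    errors = np.array([tempered_gen_error(T, n) for T in grid])
    for T in (0.0, 0.5, 1.0, 2.0, 4.0):
        print(f"T = {T:3.1f}  gen. error = {tempered_gen_error(T, n):.5f}")
    print(f"minimizer on grid: T = {grid[errors.argmin()]:.2f}  (Bayes: T = 1)")

    Consistent with the abstract's claim, the error is smallest at T = 1 for any teacher mean; T = 0 (the maximum-likelihood limit) underestimates the predictive spread, while large T overestimates it.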
    Original language: English
    Title of host publication: Advances in Neural Information Processing Systems 1999
    Publisher: MIT Press
    Publication date: 2000
    Pages: 265-271
    Publication status: Published - 2000
    Event: Advances in Neural Information Processing Systems 1999
    Duration: 1 Jan 2000 → …

    Conference

    Conference: Advances in Neural Information Processing Systems 1999
    Period: 01/01/2000 → …
