Probably Almost Bayes Decisions

S. Anoulova, Paul Fischer, S. Poelt, H.- U. Simon

    Research output: Contribution to journal › Journal article › peer-review


    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth-order Bahadur-Lazarsfeld expansions and kth-order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials in the relevant parameters and which match the lower bounds known for these classes. Moreover, the learning algorithms are efficient.
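    To illustrate the classical plug-in approach the abstract refers to, here is a minimal sketch (not the paper's algorithm): empirical estimation of the Bayesian discriminant for Boolean feature vectors in the simplest, first-order case, where features are treated as independent within each class (i.e., naive Bayes with Laplace-smoothed estimates). All function names and the smoothing choice are illustrative assumptions.

    ```python
    import numpy as np

    def fit(X, y):
        """Estimate class priors and per-feature Bernoulli parameters
        from examples (empirical plug-in estimates, Laplace-smoothed)."""
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        classes = np.unique(y)
        priors, thetas = [], []
        for c in classes:
            Xc = X[y == c]
            priors.append(len(Xc) / len(X))
            # smoothed estimate of P(x_i = 1 | class c)
            thetas.append((Xc.sum(axis=0) + 1.0) / (len(Xc) + 2.0))
        return classes, np.array(priors), np.array(thetas)

    def predict(model, X):
        """Choose, for each row of X, the class maximizing the empirical
        log-discriminant log P(c) + sum_i log P(x_i | c)."""
        classes, priors, thetas = model
        X = np.asarray(X, dtype=float)
        logp = (np.log(priors)
                + X @ np.log(thetas).T
                + (1.0 - X) @ np.log(1.0 - thetas).T)
        return classes[np.argmax(logp, axis=1)]
    ```

    The first-order case corresponds to truncating the Bahadur-Lazarsfeld expansion after the independent terms; higher-order expansions would add correlation terms between features.
    
    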
    Original language: English
    Journal: Information and Computation
    Issue number: 1
    Pages (from-to): 63-71
    Publication status: Published - 1996
