On the use of a pruning prior for neural networks

Cyril Goutte

    Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


    Abstract

    We address the problem of using a regularization prior that prunes unnecessary weights in a neural network architecture. This prior provides a convenient alternative to traditional weight decay. Two examples are studied to support this method and illustrate its use. First we use the sunspots benchmark problem as an example of time series processing. Then we address the problem of system identification on a small artificial system.
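    The abstract contrasts a pruning prior with traditional weight decay but does not reproduce the prior's form. As a hedged illustration only (not the paper's actual prior), the sketch below compares a Gaussian prior (standard L2 weight decay, which only shrinks weights) with a Laplace/L1 prior, a common pruning-inducing choice whose proximal update drives irrelevant weights exactly to zero. The toy regression problem and all parameter values are assumptions for the demonstration.

    ```python
    import numpy as np

    # Assumed toy problem: only the first 2 of 10 inputs carry signal,
    # so a pruning prior should zero out the remaining 8 weights.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    true_w = np.zeros(10)
    true_w[:2] = [2.0, -3.0]
    y = X @ true_w + 0.1 * rng.normal(size=200)

    def fit(penalty, lam=0.1, lr=0.01, steps=5000):
        """Gradient descent on squared error plus a regularization prior.

        penalty="l2": Gaussian prior (weight decay), gradient term lam * w.
        penalty="l1": Laplace prior, applied via a soft-thresholding
        (proximal) step that can set weights exactly to zero.
        """
        w = np.zeros(10)
        for _ in range(steps):
            grad = X.T @ (X @ w - y) / len(y)
            if penalty == "l2":
                w -= lr * (grad + lam * w)
            else:
                w -= lr * grad
                w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
        return w

    w_l2 = fit("l2")
    w_l1 = fit("l1")

    # Weight decay shrinks the 8 irrelevant weights toward zero but leaves
    # them nonzero; the L1/pruning-style prior removes them exactly.
    print("exact zeros with weight decay:", int(np.sum(w_l2 == 0.0)))
    print("exact zeros with pruning prior:", int(np.sum(w_l1 == 0.0)))
    ```

    The design point mirrors the abstract's claim: a prior whose penalty is non-differentiable at zero yields hard pruning as a by-product of training, whereas weight decay requires a separate thresholding pass to remove weights.
    
    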
    Original language: English
    Title of host publication: Proceedings of the IEEE Signal Processing Society Workshop Neural Networks for Signal Processing
    Publisher: IEEE
    Publication date: 1996
    Pages: 52-61
    ISBN (Print): 0-7803-3550-3
    Publication status: Published - 1996
    Event: 1996 IEEE Workshop on Neural Networks for Signal Processing VI - Kyoto, Japan
    Duration: 4 Sept 1996 - 6 Sept 1996
    Conference number: 6
    http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=3974

    Workshop

    Workshop: 1996 IEEE Workshop on Neural Networks for Signal Processing VI
    Number: 6
    Country/Territory: Japan
    City: Kyoto
    Period: 04/09/1996 - 06/09/1996
    Internet address: http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=3974

    Bibliographical note

    Copyright: 1996 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes, for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
