On design and evaluation of tapped-delay neural network architectures

Claus Svarer, Lars Kai Hansen, Jan Larsen

    Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


    Abstract

    Pruning and evaluation of tapped-delay neural networks for the sunspot benchmark series are addressed. It is shown that the generalization ability of the networks can be improved by pruning with the optimal brain damage method of Le Cun, Denker and Solla. A stop criterion for the pruning algorithm is formulated using a modified version of Akaike's final prediction error estimate. With the proposed stop criterion, the pruning scheme is shown to produce successful architectures with a high yield.
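
    The two ingredients named in the abstract can be illustrated concretely. Optimal brain damage ranks weights by a saliency computed from a diagonal Hessian approximation, and Akaike's final prediction error (FPE) penalizes training error by the number of parameters. The sketch below is a minimal, generic illustration of those two formulas, not the authors' implementation; the paper uses a modified FPE whose exact form is not given here, and all function names are hypothetical.

    ```python
    import numpy as np

    def obd_saliencies(weights, hessian_diag):
        # OBD saliency (Le Cun, Denker & Solla): s_i = 0.5 * H_ii * w_i^2,
        # the estimated increase in training error from deleting weight i.
        return 0.5 * hessian_diag * weights**2

    def fpe(train_error, n_samples, n_params):
        # Akaike's final prediction error: FPE = E * (N + p) / (N - p),
        # an estimate of generalization error from training error alone.
        return train_error * (n_samples + n_params) / (n_samples - n_params)

    # Toy usage: prune the weight with the smallest saliency first.
    w = np.array([0.8, -0.05, 1.2, 0.01])
    h = np.array([2.0, 1.5, 0.5, 3.0])   # diagonal Hessian estimate
    prune_idx = int(np.argmin(obd_saliencies(w, h)))  # index 3 here
    ```

    In a pruning loop one would delete the lowest-saliency weight, retrain, recompute the FPE estimate, and stop when the estimate stops decreasing, which is the role of the stop criterion proposed in the paper.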
    Original language: English
    Title of host publication: IEEE International Conference on Neural Networks
    Volume: 1
    Publisher: IEEE
    Publication date: 1993
    Pages: 46-51
    ISBN (Print): 0-7803-0999-5
    Publication status: Published - 1993
    Event: 1993 IEEE International Conference on Neural Networks, San Francisco, CA, United States
    Duration: 28 Mar 1993 - 1 Apr 1993
    http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=1059


    Bibliographical note

    Copyright: 1993 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes, for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
