Abstract
Architecture optimization is a fundamental problem in neural network modeling. The optimal architecture is defined as the one that minimizes the generalization error. This paper addresses the estimation of the generalization performance of regularized, complete neural network models. Regularization normally improves generalization performance by restricting model complexity. A formula for the optimal weight decay regularizer is derived. A regularized model may be characterized by an effective number of weights (parameters); however, it is demonstrated that no simple definition is possible. A novel estimator of the average generalization error (called FPER) is suggested and compared to the final prediction error (FPE) and generalized prediction error (GPE) estimators. In addition, comparative numerical studies demonstrate the qualities of the suggested estimator.
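For orientation, the estimators the abstract compares against have well-known forms in the broader literature. The sketch below restates Akaike's FPE and Moody's GPE as they are conventionally written, not as quoted from this paper, together with the ridge-style expression for the effective number of parameters under weight decay, which is the kind of "simple definition" the abstract argues is inadequate. The symbols $E_{\text{train}}$, $\hat\sigma^2$, $\kappa$, and $\lambda_i$ are notational assumptions for this sketch.

```latex
% Background sketch (assumed standard forms, not reproduced from the paper):
% Akaike's final prediction error (FPE), Moody's generalized prediction
% error (GPE), and the conventional ridge-regression expression for the
% effective number of parameters under weight decay.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
For a network with $p$ weights trained on $N$ examples, let
$E_{\text{train}}$ denote the average training error and
$\hat\sigma^2$ an estimate of the noise variance. Then
\begin{align}
  \widehat{E}_{\text{FPE}} &= \frac{N+p}{N-p}\, E_{\text{train}}, &
  \widehat{E}_{\text{GPE}} &= E_{\text{train}}
    + \frac{2\hat\sigma^2\, p_{\text{eff}}}{N},
\end{align}
where, for a weight decay regularizer of strength $\kappa$ and
eigenvalues $\lambda_i$ of the Hessian of the unregularized cost,
the effective number of parameters is conventionally taken to be
\begin{equation}
  p_{\text{eff}} = \sum_{i=1}^{p} \frac{\lambda_i}{\lambda_i + \kappa}.
\end{equation}
\end{document}
```

FPER is presented in the paper as an FPE-type estimator adapted to regularized models; its exact expression is given in the paper itself and is not reproduced here.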
Original language | English |
---|---|
Title of host publication | Proceedings of the 4th IEEE Workshop Neural Networks for Signal Processing |
Publisher | IEEE |
Publication date | 1994 |
Pages | 42-51 |
ISBN (Print) | 0-7803-2026-3 |
Publication status | Published - 1994 |
Event | 1994 IEEE Workshop on Neural Networks for Signal Processing - Ermioni, Greece. Duration: 6 Sept 1994 → 8 Sept 1994. Conference number: 4 |
Conference
Conference | 1994 IEEE Workshop on Neural Networks for Signal Processing |
---|---|
Number | 4 |
Country/Territory | Greece |
City | Ermioni |
Period | 06/09/1994 → 08/09/1994 |
Internet address | https://ieeexplore.ieee.org/xpl/conhome/2959/proceeding |