Abstract
In this paper we propose a method for constructing feed-forward neural classifiers based on regularization and adaptive architectures. Using a penalized maximum likelihood scheme, we derive a modified form of the entropic error measure and an algebraic estimate of the test error. In conjunction with optimal brain damage pruning, the test error estimate is used to select the network architecture. The scheme is evaluated on four classification problems.
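The pruning step mentioned in the abstract can be illustrated with a minimal sketch of optimal brain damage (OBD): each weight is ranked by the saliency \(s_i = \tfrac{1}{2} H_{ii} w_i^2\) computed from a diagonal Hessian approximation, and the least salient weights are removed. This is only an assumed illustration of the standard OBD saliency rule, not the paper's full scheme (which couples pruning with the derived test error estimate); the function name and numbers below are hypothetical.

```python
import numpy as np

def obd_prune(weights, hessian_diag, frac=0.1):
    """Zero out the fraction `frac` of weights with the lowest
    OBD saliency s_i = 0.5 * H_ii * w_i**2 (hypothetical helper)."""
    saliency = 0.5 * hessian_diag * weights ** 2
    n_prune = int(frac * weights.size)
    # Indices of the n_prune least salient weights.
    idx = np.argsort(saliency.ravel())[:n_prune]
    pruned = weights.copy()
    pruned.ravel()[idx] = 0.0
    return pruned

# Toy example: small weights with moderate curvature are pruned first.
w = np.array([0.5, -0.01, 1.2, 0.03, -0.9])
h = np.array([1.0, 2.0, 0.5, 1.5, 1.0])
print(obd_prune(w, h, frac=0.4))
```

In the paper's scheme, pruning like this would be iterated, with the algebraic test error estimate deciding when to stop removing weights.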
| Original language | English |
|---|---|
| Journal | Neural Networks |
| Volume | 11 |
| Issue number | 9 |
| Pages (from-to) | 1659-1670 |
| ISSN | 0893-6080 |
| Publication status | Published - 1998 |