## Abstract

Architecture optimization is a fundamental problem in neural network modeling; the optimal architecture is the one that minimizes the generalization error. This paper addresses estimation of the generalization performance of regularized, complete neural network models. Regularization normally improves generalization by restricting model complexity. A formula for the optimal weight-decay regularizer is derived. A regularized model may be characterized by an effective number of weights (parameters); however, it is demonstrated that no simple definition is possible. A novel estimator of the average generalization error, called FPER, is suggested and compared with the final prediction error (FPE) and generalized prediction error (GPE) estimators. In addition, comparative numerical studies demonstrate the qualities of the suggested estimator.
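For context, the classical FPE estimator that the paper's FPER builds on is Akaike's formula, which inflates the training error by a factor depending on the sample size N and the number of parameters p. A minimal sketch of the classical FPE only (the paper's FPER variant and its effective-parameter correction are not reproduced here):

```python
import numpy as np

def fpe(residuals, n_params):
    """Akaike's final prediction error: FPE = MSE * (N + p) / (N - p).

    residuals : training residuals of the fitted model
    n_params  : number of adjustable parameters p (must satisfy p < N)
    """
    n = len(residuals)
    mse = np.mean(np.square(residuals))
    return mse * (n + n_params) / (n - n_params)

# Toy usage: residuals from a hypothetical model with 3 parameters.
res = np.array([0.1, -0.2, 0.05, 0.15, -0.1, 0.0, 0.2, -0.05])
estimate = fpe(res, 3)
```

The GPE and FPER estimators discussed in the paper modify this idea for regularized models, where the raw parameter count p is replaced by an effective number of parameters.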

Original language | English
---|---
Title of host publication | Proceedings of the 4th IEEE Workshop Neural Networks for Signal Processing
Publisher | IEEE
Publication date | 1994
Pages | 42-51
ISBN (Print) | 0-7803-2026-3
Publication status | Published - 1994
Event | IEEE Workshop on Neural Networks for Signal Processing IV - Ermioni, Greece. Duration: 6 Sep 1994 → 8 Sep 1994. Conference number: 4th. http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=2959

### Workshop

Workshop | IEEE Workshop on Neural Networks for Signal Processing IV
---|---
Number | 4th
Country | Greece
City | Ermioni
Period | 06/09/1994 → 08/09/1994