Abstract
This paper addresses the assessment of generalization performance of neural network models by use of empirical techniques. We suggest using the cross-validation scheme combined with a resampling technique to obtain an estimate of the generalization performance distribution of a specific model. This enables the formulation of a range of new generalization performance measures. Numerical results demonstrate the viability of the approach compared to the standard technique of using algebraic estimates like the FPE. Moreover, we consider the problem of comparing the generalization performance of different competing models. Since all models are trained on the same data, a key issue is to take this dependency into account. The optimal split of the data set of size N into a cross-validation set of size Nγ and a training set of size N(1-γ) is discussed. Asymptotically (large data sets), γ_opt → 1, so that a relatively larger fraction of the data is left for validation.
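The core idea of the abstract, estimating the distribution of validation error over repeated random splits of the data, can be illustrated with a short sketch. The code below is not the authors' implementation: it uses a linear least-squares model as a stand-in for a neural network, synthetic data, and an arbitrary split fraction γ, purely to show how a resampled hold-out error distribution of size Nγ validation / N(1-γ) training examples can be collected and summarized.

```python
# Minimal sketch (assumptions: synthetic data, least-squares model in place of a
# neural network) of estimating the generalization-performance distribution by
# resampled train/validation splits.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data; any supervised data set would do.
N, d = 200, 5
X = rng.normal(size=(N, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=N)

gamma = 0.5          # fraction of the data held out for validation (the split parameter γ)
n_resamples = 500    # number of random splits used to build the empirical distribution
n_val = int(round(gamma * N))

val_errors = np.empty(n_resamples)
for r in range(n_resamples):
    perm = rng.permutation(N)
    val_idx, train_idx = perm[:n_val], perm[n_val:]
    # Train on the N(1-γ) training examples (least squares stands in for NN training).
    w_hat, *_ = np.linalg.lstsq(X[train_idx], y[train_idx], rcond=None)
    # Validation mean squared error on the Nγ held-out examples.
    resid = y[val_idx] - X[val_idx] @ w_hat
    val_errors[r] = np.mean(resid ** 2)

# Summaries of the empirical generalization-performance distribution.
print(f"mean={val_errors.mean():.4f}  std={val_errors.std():.4f}  "
      f"median={np.median(val_errors):.4f}  95%={np.quantile(val_errors, 0.95):.4f}")
```

Because the whole error distribution is available, measures beyond the mean (spread, quantiles, tail risk) can be read off directly, which is the kind of flexibility a single algebraic estimate such as the FPE does not offer.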
Original language | English |
---|---|
Title of host publication | Proceedings of the 1995 IEEE Workshop on Neural Networks for Signal Processing |
Publisher | IEEE |
Publication date | 1995 |
Pages | 30-39 |
ISBN (Print) | 0-7803-2739-X |
Publication status | Published - 1995 |
Event | 1995 IEEE Workshop on Neural Networks for Signal Processing, Cambridge, United States. Duration: 31 Aug 1995 → 2 Sept 1995. Conference number: 5. https://ieeexplore.ieee.org/xpl/conhome/3947/proceeding http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=3947 |
Conference
Conference | 1995 IEEE Workshop on Neural Networks for Signal Processing |
---|---|
Number | 5 |
Country/Territory | United States |
City | Cambridge |
Period | 31/08/1995 → 02/09/1995 |
Internet address | https://ieeexplore.ieee.org/xpl/conhome/3947/proceeding |