Abstract
Modeling with flexible models, such as neural networks, requires careful control of model complexity and generalization ability, a trade-off expressed in the ubiquitous bias-variance dilemma. Regularization is a tool for optimizing the model structure, reducing variance at the expense of introducing extra bias. The overall objective of adaptive regularization is to tune the amount of regularization so as to ensure minimal generalization error. Regularization supplements direct model selection techniques such as step-wise selection, and a hybrid scheme is often preferable; a sufficiently flexible regularization may, however, remove the need for explicit selection procedures. This paper investigates recently suggested adaptive regularization schemes. Some methods directly minimize an estimate of the generalization error (either algebraic or empirical), whereas others start from different criteria, e.g., the Bayesian evidence. The evidence essentially expresses the probability of the model, which is conceptually different from the generalization error; for large training sets, however, the two criteria converge asymptotically. First, the basic model definition, training, and generalization are presented. Next, different adaptive regularization schemes are reviewed and extended. Finally, the experimental section presents a comparative study of linear models for regression/time series problems.
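As a rough illustration of the general idea (not the specific schemes compared in the paper), the sketch below tunes a ridge (weight-decay) parameter for a linear regression model by minimizing a closed-form leave-one-out estimate of the generalization error; the function names, the grid of candidate values, and the toy data are illustrative assumptions.

```python
import numpy as np

def loo_error(X, y, lam):
    """Leave-one-out squared error for ridge regression (closed form).

    Uses the hat matrix H = X (X'X + lam*I)^{-1} X'; the LOO residual
    for sample i equals (y_i - yhat_i) / (1 - H_ii).
    """
    n, d = X.shape
    A = X.T @ X + lam * np.eye(d)
    H = X @ np.linalg.solve(A, X.T)        # hat matrix for this lambda
    residuals = y - H @ y                  # ordinary training residuals
    loo_res = residuals / (1.0 - np.diag(H))
    return np.mean(loo_res ** 2)

def adapt_regularization(X, y, lambdas):
    """Pick the regularization strength minimizing the LOO error estimate."""
    errors = [loo_error(X, y, lam) for lam in lambdas]
    return lambdas[int(np.argmin(errors))]

# Toy usage: noisy linear regression data (purely illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
y = X @ rng.normal(size=10) + 0.5 * rng.normal(size=50)
best_lam = adapt_regularization(X, y, np.logspace(-4, 2, 25))
print("selected regularization:", best_lam)
```

A simple grid search over the empirical estimate is used here for clarity; the schemes reviewed in the paper instead adapt the regularization parameters directly, e.g. by gradient-based minimization of a generalization-error estimate or by maximizing the Bayesian evidence.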
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the 2000 IEEE Signal Processing Society Workshop |
| Volume | 1 |
| Place of publication | Sydney, NSW |
| Publisher | IEEE |
| Publication date | 2000 |
| Pages | 221-230 |
| ISBN (Print) | 0-7803-6278-0 |
| DOIs | |
| Publication status | Published - 2000 |
| Event | Neural Networks for Signal Processing X, Sydney, Australia, 11 Dec 2000 → 13 Dec 2000 (Conference number: 10) |
Conference

| Conference | Neural Networks for Signal Processing X |
| --- | --- |
| Number | 10 |
| Country/Territory | Australia |
| City | Sydney |
| Period | 11/12/2000 → 13/12/2000 |