Behaviour in 0 of the Neural Networks Training Cost

Cyril Goutte

    Research output: Contribution to journal › Journal article › Research › peer-review


    We study the behaviour at zero of the derivatives of the cost function used when training non-linear neural networks. It is shown that a fair number of first, second and higher order derivatives vanish at zero, validating the belief that 0 is a peculiar and potentially harmful location. These calculations are related to practical and theoretical aspects of neural network training.
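The vanishing of low-order derivatives at the origin of weight space can be illustrated with a small numerical sketch. The example below is hypothetical (not from the paper): a one-hidden-layer tanh network without biases under a squared-error cost. At the all-zero weight configuration, the hidden activations are tanh(0) = 0 and the output weights are 0, so both first-order gradient blocks vanish exactly, making the origin a stationary point of the training cost.

```python
import numpy as np

# Hypothetical illustration: one-hidden-layer tanh network, no biases,
# squared-error cost C = 0.5 * ||Y - T||^2. All names below are our own.

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))   # 10 training inputs, 3 features
T = rng.normal(size=(10, 1))   # 10 targets

W = np.zeros((3, 4))           # input -> hidden weights, set to zero
V = np.zeros((4, 1))           # hidden -> output weights, set to zero

H = np.tanh(X @ W)             # hidden activations: tanh(0) = 0
Y = H @ V                      # network outputs (all zero here)
E = Y - T                      # residuals

grad_V = H.T @ E                          # dC/dV: zero because H = 0
grad_W = X.T @ ((E @ V.T) * (1 - H**2))   # dC/dW: zero because V = 0

print(np.allclose(grad_V, 0), np.allclose(grad_W, 0))  # True True
```

Both gradient blocks are exactly zero even though the residuals are not, which is the sense in which 0 is a "peculiar and potentially harmful location" for gradient-based training.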
    Original language: English
    Journal: Neural Processing Letters
    Issue number: 2
    Pages (from-to): 107-116
    Publication status: Published - 1998