Non-Linear Back-propagation: Doing Back-Propagation without Derivatives of the Activation Function

John Hertz, Anders Stærmose Krogh, Benny Lautrup, Torsten Lehmann

    Research output: Contribution to journal › Journal article › Research › peer-review


    The conventional linear back-propagation algorithm is replaced by a non-linear version, which avoids the necessity for calculating the derivative of the activation function. This may be exploited in hardware realizations of neural processors. In this paper we derive the non-linear back-propagation algorithms in the framework of recurrent back-propagation and present some numerical simulations of feed-forward networks on the NetTalk problem. A discussion of implementation in analog VLSI electronics concludes the paper.
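The core idea described in the abstract can be illustrated with a small sketch. This is a hedged, first-order illustration rather than the paper's exact derivation: conventional back-propagation scales a back-propagated error e by the derivative f'(h) of the activation function, while a derivative-free variant can propagate the error through the activation itself, replacing f'(h)·e with f(h + e) − f(h), which agrees with the conventional term to first order in e and never evaluates f'. All variable names below are illustrative assumptions.

```python
import numpy as np

def f(h):
    # Activation function (tanh chosen for illustration).
    return np.tanh(h)

def f_prime(h):
    # Its analytic derivative -- only the conventional path needs this.
    return 1.0 - np.tanh(h) ** 2

# Hypothetical pre-activations at a hidden layer and small
# back-propagated errors arriving from the layer above.
h = np.array([0.3, -1.2, 0.7])
e = np.array([0.01, -0.02, 0.015])

# Conventional (linear) back-propagation: multiply by the derivative.
delta_conventional = f_prime(h) * e

# Derivative-free (non-linear) variant: pass the perturbed input
# through the activation and take the difference. For small e this
# equals f'(h) * e up to O(e^2), so no derivative circuit is needed --
# the motivation for analog hardware mentioned in the abstract.
delta_nonlinear = f(h + e) - f(h)

print(np.max(np.abs(delta_conventional - delta_nonlinear)))
```

For small errors the two deltas agree closely, which is why the substitution is attractive in analog VLSI, where an extra circuit computing f' would otherwise be required alongside the one computing f.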
    Original language: English
    Journal: IEEE Transactions on Neural Networks
    Issue number: 6
    Pages (from-to): 1321-1327
    Publication status: Published - 1997