# Newton step methods for AD of an objective defined using implicit functions

Research output: Contribution to journal › Journal article › Research › peer-review

### Abstract

We consider the problem of computing derivatives of an objective that is defined using implicit functions; i.e., implicit variables are computed by solving equations that are often nonlinear and solved by an iterative process. If one were to apply Algorithmic Differentiation (AD) directly, one would differentiate the iterative process. In this paper we present the Newton step methods for computing derivatives of the objective. These methods make it easy to take advantage of sparsity, forward mode, reverse mode, and other AD techniques. We prove that the partial Newton step method works if the number of steps is equal to the order of the derivatives. The full Newton step method obtains two derivative orders for each step except for the first step. There are alternative methods that avoid differentiating the iterative process; e.g., the method implemented in ADOL-C. An optimal control example demonstrates the advantage of the Newton step methods when computing both gradients and Hessians. We also discuss the Laplace approximation method for nonlinear mixed effects models as an example application.
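The core idea the abstract describes can be illustrated with a minimal sketch that is not taken from the paper: an implicit variable y(x) is defined by F(x, y) = 0 and computed by Newton iteration, but its derivative is obtained from the converged solution via the implicit function theorem (dy/dx = -F_x / F_y) rather than by differentiating every iteration of the solver. The scalar example F(x, y) = y² - x below is a hypothetical illustration chosen so the result can be checked analytically.

```python
import math

def solve_y(x, n_iter=50):
    """Solve F(x, y) = y**2 - x = 0 for y by Newton iteration."""
    y = 1.0
    for _ in range(n_iter):
        y -= (y * y - x) / (2.0 * y)  # Newton step: y - F / F_y
    return y

def dy_dx(x):
    """Implicit-function-theorem derivative: dy/dx = -F_x / F_y.

    Here F_x = -1 and F_y = 2*y, both evaluated at the converged y,
    so only the solution is differentiated, not the iterative process.
    """
    y = solve_y(x)
    return 1.0 / (2.0 * y)

x = 2.0
print(dy_dx(x))                    # ≈ 1 / (2 * sqrt(2)), since y(x) = sqrt(x)
print(1.0 / (2.0 * math.sqrt(x)))  # analytic check
```

Differentiating `solve_y` directly with an AD tool would propagate derivatives through all 50 Newton steps; the implicit-function route needs only the converged y, which is what makes the Newton step methods attractive for higher-order derivatives.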
• Original language: English
• Journal: Optimization Methods and Software
• Volume: 33
• Issue number: 4-6
• Pages: 907-923
• ISSN: 1055-6788
• DOI: https://doi.org/10.1080/10556788.2017.1406936
• Publication status: Published - 2018

### Keywords

• Implicit functions
• Automatic derivatives
• Higher order derivatives
• Newton step
• Optimal control
• Nonlinear mixed effects

### Cite this

@article{4a0f7f9b7eae49e683de768ae78b9fd1,
title = "Newton step methods for AD of an objective defined using implicit functions",
abstract = "We consider the problem of computing derivatives of an objective that is defined using implicit functions; i.e., implicit variables are computed by solving equations that are often nonlinear and solved by an iterative process. If one were to apply Algorithmic Differentiation (AD) directly, one would differentiate the iterative process. In this paper we present the Newton step methods for computing derivatives of the objective. These methods make it easy to take advantage of sparsity, forward mode, reverse mode, and other AD techniques. We prove that the partial Newton step method works if the number of steps is equal to the order of the derivatives. The full Newton step method obtains two derivative orders for each step except for the first step. There are alternative methods that avoid differentiating the iterative process; e.g., the method implemented in ADOL-C. An optimal control example demonstrates the advantage of the Newton step methods when computing both gradients and Hessians. We also discuss the Laplace approximation method for nonlinear mixed effects models as an example application.",
keywords = "Implicit functions, Automatic derivatives, Higher order derivatives, Newton step, Optimal control, Nonlinear mixed effects",
author = "Bell, {Bradley M.} and Kasper Kristensen",
year = "2018",
doi = "10.1080/10556788.2017.1406936",
language = "English",
volume = "33",
pages = "907--923",
journal = "Optimization Methods and Software",
issn = "1055-6788",
publisher = "CRC Press/Balkema",
number = "4-6",
}

In: Optimization Methods and Software, Vol. 33, No. 4-6, 2018, p. 907-923.


TY - JOUR

T1 - Newton step methods for AD of an objective defined using implicit functions

AU - Bell, Bradley M.

AU - Kristensen, Kasper

PY - 2018

Y1 - 2018

N2 - We consider the problem of computing derivatives of an objective that is defined using implicit functions; i.e., implicit variables are computed by solving equations that are often nonlinear and solved by an iterative process. If one were to apply Algorithmic Differentiation (AD) directly, one would differentiate the iterative process. In this paper we present the Newton step methods for computing derivatives of the objective. These methods make it easy to take advantage of sparsity, forward mode, reverse mode, and other AD techniques. We prove that the partial Newton step method works if the number of steps is equal to the order of the derivatives. The full Newton step method obtains two derivative orders for each step except for the first step. There are alternative methods that avoid differentiating the iterative process; e.g., the method implemented in ADOL-C. An optimal control example demonstrates the advantage of the Newton step methods when computing both gradients and Hessians. We also discuss the Laplace approximation method for nonlinear mixed effects models as an example application.

AB - We consider the problem of computing derivatives of an objective that is defined using implicit functions; i.e., implicit variables are computed by solving equations that are often nonlinear and solved by an iterative process. If one were to apply Algorithmic Differentiation (AD) directly, one would differentiate the iterative process. In this paper we present the Newton step methods for computing derivatives of the objective. These methods make it easy to take advantage of sparsity, forward mode, reverse mode, and other AD techniques. We prove that the partial Newton step method works if the number of steps is equal to the order of the derivatives. The full Newton step method obtains two derivative orders for each step except for the first step. There are alternative methods that avoid differentiating the iterative process; e.g., the method implemented in ADOL-C. An optimal control example demonstrates the advantage of the Newton step methods when computing both gradients and Hessians. We also discuss the Laplace approximation method for nonlinear mixed effects models as an example application.

KW - Implicit functions

KW - Automatic derivatives

KW - Higher order derivatives

KW - Newton step

KW - Optimal control

KW - Nonlinear mixed effects

U2 - 10.1080/10556788.2017.1406936

DO - 10.1080/10556788.2017.1406936

M3 - Journal article

VL - 33

SP - 907

EP - 923

JO - Optimization Methods and Software

JF - Optimization Methods and Software

SN - 1055-6788

IS - 4-6

ER -