Source file: reverse-ad.fut

# Reverse-mode automatic differentiation

The built-in `vjp` function computes the product of the Jacobian of a function at some point and a *seed vector*. This can be used to compute derivatives of functions.

```
def f x = f64.sqrt(x) * f64.sin(x)
def f' x = vjp f x 1
```

```
> f' 2f64
-0.26703531187166946f64
```
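Since `f x = f64.sqrt(x) * f64.sin(x)`, the product rule gives f'(x) = sin(x)/(2√x) + √x·cos(x). As a sanity check (in plain Python, not Futhark; the helper names are invented for illustration), the analytic derivative and a central finite difference both agree with the value above:

```python
import math

# f(x) = sqrt(x) * sin(x), mirroring the Futhark example.
def f(x):
    return math.sqrt(x) * math.sin(x)

# Analytic derivative by the product rule:
# f'(x) = sin(x) / (2*sqrt(x)) + sqrt(x) * cos(x)
def f_prime(x):
    return math.sin(x) / (2 * math.sqrt(x)) + math.sqrt(x) * math.cos(x)

# Central finite difference as an independent cross-check.
def f_prime_fd(x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

print(f_prime(2.0))     # close to -0.26703531187166946
print(f_prime_fd(2.0))
```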

`vjp` corresponds to differentiation by reverse accumulation, and is most efficient for functions that have more inputs than outputs. A particularly important special case is loss functions, which return just a single scalar. For these we can define a function for finding gradients:

```
def grad f x = vjp f x 1f64
def g (x,y) = f64.cos(x) * f64.sin(y)
def g' x y = grad g (x,y)
```

```
> g' 1f64 2f64
(-0.7651474012342926f64, -0.2248450953661529f64)
```
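Reverse accumulation runs the computation forward, then propagates the seed backward through each operation, producing the partials with respect to *all* inputs in a single backward sweep. A minimal hand-written sketch of that idea for `g` (plain Python with invented names; this is not how Futhark implements `vjp`, just the principle):

```python
import math

# g(x, y) = cos(x) * sin(y), mirroring the Futhark example.
def g_vjp(x, y, seed=1.0):
    # Forward pass: compute and remember the intermediates.
    a = math.cos(x)
    b = math.sin(y)
    # out = a * b  (the value itself is not needed for the gradient)

    # Backward pass, seeded with d(out)/d(out) = seed.
    da = seed * b            # out = a * b, so d out / d a = b
    db = seed * a            # ... and d out / d b = a
    dx = da * -math.sin(x)   # a = cos(x), so d a / d x = -sin(x)
    dy = db * math.cos(y)    # b = sin(y), so d b / d y =  cos(y)
    return (dx, dy)

print(g_vjp(1.0, 2.0))   # close to (-0.7651474012342926, -0.2248450953661529)
```

Note that both partials come out of one backward pass; forward accumulation (`jvp`) would need one pass per input instead.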