dq.gradient.ForwardAutograd
ForwardAutograd()
Forward-mode automatic differentiation.

Enables support for forward-mode automatic differentiation (like jax.jvp or jax.jacfwd).
Note
This method is most efficient when the function has more outputs than inputs. For instance, it is the preferred method when simulating a Lindbladian parameterized by a few values and computing the Jacobian of a function returning the expectation values of many observables (or the same observable at many different times).
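As an illustration of this few-inputs/many-outputs case, here is a minimal sketch of computing the forward-mode Jacobian of time-dependent expectation values with respect to two Hamiltonian parameters. The exact dq.mesolve signature (exp_ops and gradient keyword arguments, result.expects attribute) is assumed here; adapt it to the dynamiqs version you use.

```python
import jax
import jax.numpy as jnp
import dynamiqs as dq

n = 10
a = dq.destroy(n)
psi0 = dq.fock(n, 0)
tsave = jnp.linspace(0.0, 1.0, 51)
jump_ops = [jnp.sqrt(0.1) * a]
exp_ops = [dq.dag(a) @ a]  # one observable at many times -> many outputs

def expvals(params):
    # few inputs: two Hamiltonian parameters
    delta, amp = params
    H = delta * dq.dag(a) @ a + amp * (a + dq.dag(a))
    # assumed signature: exp_ops and gradient keyword arguments
    result = dq.mesolve(
        H, jump_ops, psi0, tsave,
        exp_ops=exp_ops,
        gradient=dq.gradient.ForwardAutograd(),
    )
    return result.expects.real  # assumed shape (len(exp_ops), len(tsave))

params = jnp.array([1.0, 0.5])
jac = jax.jacfwd(expvals)(params)  # forward-mode Jacobian, shape (1, 51, 2)
```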
Warning
This cannot be backward-mode autodifferentiated (e.g. using jax.jacrev). Try using dq.gradient.CheckpointAutograd if that is something you need.
Warning
By default, jax.grad uses reverse mode. Use jax.jacfwd to compute the gradient in forward mode.
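For example, a scalar loss (a stand-in here, not part of this API) can be differentiated in forward mode with jax.jacfwd, which yields the same gradient that jax.grad would compute in reverse mode:

```python
import jax
import jax.numpy as jnp

def loss(params):
    return jnp.sum(params ** 2)  # stand-in for a simulation-based loss

params = jnp.array([1.0, 2.0])
grad_fwd = jax.jacfwd(loss)(params)  # forward mode, compatible with ForwardAutograd
# jax.grad(loss)(params) would use reverse mode, which ForwardAutograd does not support
```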
Note
For Diffrax-based methods, this falls back to the diffrax.ForwardMode option.