Is it now possible with PyTorch 1.8 to compute vectorized Jacobian-vector products? I know it wasn't in the past (e.g., Jacobian functional API batch-respecting Jacobian), but I am wondering whether that has changed in PyTorch 1.8 through some implementation of vmap.

In particular, I am referring to the following two use cases:

- Calculating the Jacobian-vector products Jv_i for i = 1, …, N, where J is the Jacobian of a function f at a point x and the v_i are a set of vectors.

I am wondering if this is how the new vectorized torch.autograd.functional.jacobian/hessian are implemented (with the v_i being the standard basis vectors), and, if so, whether there is a way to do this for a general set of vectors v_i. Related in particular to Add `vectorize` flag to torch.autograd.functional.{jacobian, hessian} by zou3519 · Pull Request #50915 · pytorch/pytorch · GitHub
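Concretely, this is the non-vectorized version of case 1 that I would like to avoid: a Python loop over `torch.autograd.functional.jvp`, one call per vector, even though the Jacobian is the same every time. (The function `f` and all shapes below are just placeholders for illustration.)

```python
import torch

# Toy function and evaluation point; f, x, and vs are placeholders.
def f(x):
    return x ** 2 + x.sum()

x = torch.randn(3)       # single point x
vs = torch.randn(5, 3)   # N = 5 vectors v_i

# Current non-vectorized approach: one jvp call per vector v_i,
# even though the Jacobian J of f at x is identical for every call.
# jvp returns (f(x), J @ v); we keep only the JVP.
jvps = torch.stack([
    torch.autograd.functional.jvp(f, (x,), (v,))[1] for v in vs
])
print(jvps.shape)  # torch.Size([5, 3])
```

The hope is that something vmap-based could replace this loop with a single batched call.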

- Calculating the Jacobian-vector products J_i v_i for i = 1, …, N, where J_i is the Jacobian of a function f at a point x_i (the difference from the first case is that the Jacobian is now also computed over a batch of different inputs x_i).
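Again as a sketch with placeholder function and shapes, the looped version of this second case pairs each input x_i with its own vector v_i:

```python
import torch

# Toy elementwise function; f, xs, and vs are placeholders.
def f(x):
    return torch.tanh(x)

xs = torch.randn(4, 3)  # batch of N = 4 points x_i
vs = torch.randn(4, 3)  # one vector v_i per point x_i

# Current approach: a Python loop computing J_i v_i for each
# (x_i, v_i) pair separately, since each x_i has its own Jacobian J_i.
jvps = torch.stack([
    torch.autograd.functional.jvp(f, (xi,), (vi,))[1]
    for xi, vi in zip(xs, vs)
])
print(jvps.shape)  # torch.Size([4, 3])
```

Ideally the loop over the batch dimension would be handled by vmap rather than Python.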

Thanks!