
I would like to use automatic differentiation to calculate gradients of a function written in numpy.

I've come across a number of packages, but none of them seem to support things like numba and numexpr, which I'd normally use to accelerate my Python code.

What packages do people use for this?

P.S. I know there's also stuff like TensorFlow and PyTorch, but I would like to keep my code in numpy.

user357269

1 Answer


JAX has the features you're looking for: it provides a NumPy-compatible API (jax.numpy), automatic differentiation (jax.grad), and JIT compilation (jax.jit), which covers what you would otherwise reach for numba to do. See https://jax.readthedocs.io/en/latest/notebooks/quickstart.html
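A minimal sketch of the kind of workflow the quickstart covers (the function f below is just an illustration, not taken from the linked page): jax.numpy mirrors the NumPy API, jax.grad produces the gradient function, and jax.jit compiles it.

    import jax
    import jax.numpy as jnp  # drop-in replacement for the numpy API

    def f(x):
        # an ordinary NumPy-style function, written against jax.numpy
        return jnp.sum(jnp.tanh(x) ** 2)

    grad_f = jax.grad(f)           # automatic differentiation: gradient of f w.r.t. x
    fast_grad_f = jax.jit(grad_f)  # JIT-compile the gradient (the role numba usually plays)

    x = jnp.arange(3.0)
    print(grad_f(x))       # gradient evaluated at x
    print(fast_grad_f(x))  # same values, compiled

Because jax.jit compiles through XLA, there is no separate numba or numexpr step, and the same code runs on CPU or GPU.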

Brannon
  • To the downvoter, why the downvote? – Mark L. Stone Oct 28 '21 at 00:14
  • Not the downvoter, but generally answers with just a link are discouraged. You should give some description of what is in the link, even if it's just a quotation. This makes the answer here more useful and avoids a situation where a link eventually dies, leaving no information in the answer. – Tyberius Nov 01 '21 at 14:54