From 2905390a82929a196711251ac79f2b2d15b744fe Mon Sep 17 00:00:00 2001
From: Patrick Kidger <33688385+patrick-kidger@users.noreply.github.com>
Date: Sat, 20 Apr 2024 10:56:47 +0200
Subject: [PATCH] Updated ecosystem list

---
 README.md     | 26 +++++++++++++++-----------
 docs/index.md | 26 +++++++++++++++++++++-----
 2 files changed, 36 insertions(+), 16 deletions(-)

diff --git a/README.md b/README.md
index 05b0e1ae..3974b60e 100644
--- a/README.md
+++ b/README.md
@@ -81,28 +81,32 @@ If you found this library to be useful in academic work, then please cite: ([arX
 
 ## See also: other libraries in the JAX ecosystem
 
-[jaxtyping](https://github.com/google/jaxtyping): type annotations for shape/dtype of arrays.
+#### Always useful
+
+[jaxtyping](https://github.com/patrick-kidger/jaxtyping): type annotations for shape/dtype of arrays.
+
+#### Deep learning
 
 [Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
 
-[Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.
+[Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
 
-[Optimistix](https://github.com/patrick-kidger/optimistix): root finding, minimisation, fixed points, and least squares.
+[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
 
-[Lineax](https://github.com/google/lineax): linear solvers.
+#### Scientific computing
 
-[BlackJAX](https://github.com/blackjax-devs/blackjax): probabilistic+Bayesian sampling.
+[Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.
 
-[Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
+[Optimistix](https://github.com/patrick-kidger/optimistix): root finding, minimisation, fixed points, and least squares.
 
-[sympy2jax](https://github.com/google/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
+[Lineax](https://github.com/patrick-kidger/lineax): linear solvers.
 
-[Eqxvision](https://github.com/paganpasta/eqxvision): computer vision models.
+[BlackJAX](https://github.com/blackjax-devs/blackjax): probabilistic+Bayesian sampling.
 
-[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+[sympy2jax](https://github.com/patrick-kidger/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
 
 [PySR](https://github.com/milesCranmer/PySR): symbolic regression. (Non-JAX honourable mention!)
 
-## Disclaimer
+#### Awesome JAX
 
-Equinox is maintained by Patrick Kidger at Google X, but this is not an official Google product.
+[Awesome JAX](https://github.com/n2cholas/awesome-jax): a longer list of other JAX projects.
diff --git a/docs/index.md b/docs/index.md
index 9a6ab1f8..f1d65cb6 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -70,16 +70,32 @@ If this quick start has got you interested, then have a read of [All of Equinox]
 
 ## See also: other libraries in the JAX ecosystem
 
+#### Always useful
+
+[jaxtyping](https://github.com/patrick-kidger/jaxtyping): type annotations for shape/dtype of arrays.
+
+#### Deep learning
+
 [Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
 
+[Orbax](https://github.com/google/orbax): checkpointing (async/multi-host/multi-device).
+
+[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+
+#### Scientific computing
+
 [Diffrax](https://github.com/patrick-kidger/diffrax): numerical differential equation solvers.
 
-[Lineax](https://github.com/google/lineax): linear solvers and linear least squares.
+[Optimistix](https://github.com/patrick-kidger/optimistix): root finding, minimisation, fixed points, and least squares.
 
-[jaxtyping](https://github.com/google/jaxtyping): type annotations for shape/dtype of arrays.
+[Lineax](https://github.com/patrick-kidger/lineax): linear solvers.
 
-[Eqxvision](https://github.com/paganpasta/eqxvision): computer vision models.
+[BlackJAX](https://github.com/blackjax-devs/blackjax): probabilistic+Bayesian sampling.
 
-[sympy2jax](https://github.com/google/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
+[sympy2jax](https://github.com/patrick-kidger/sympy2jax): SymPy<->JAX conversion; train symbolic expressions via gradient descent.
 
-[Levanter](https://github.com/stanford-crfm/levanter): scalable+reliable training of foundation models (e.g. LLMs).
+[PySR](https://github.com/milesCranmer/PySR): symbolic regression. (Non-JAX honourable mention!)
+
+#### Awesome JAX
+
+[Awesome JAX](https://github.com/n2cholas/awesome-jax): a longer list of other JAX projects.