Releases: nshepperd/flash_attn_jax

v0.4.1

20 Aug 08:19

I think we can just use PyTorch's builder image, which seems trustwor…

v0.3.0

28 Jul 04:59

Merge branch 'cmake'.

Switches the build system to CMake, which is more flexible, and fixes compatibility with jax 0.5.0–0.7.0.

v0.3.0a

27 Jul 07:57

Clean up imports in flash.py, flash_sharding.py, and ring_attention.py.

v0.2.2

24 May 10:57

Build the CUDA 12 version with Hopper support.

v0.2.1

21 May 12:29

Bump the patch version to 0.2.1.

v0.2.0

01 May 23:46

Expanded vmap support for flash_mha. Vmapping q but not k,v reduces t…
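A minimal sketch of the kind of usage this enables, assuming a flash_mha(q, k, v) call that takes [batch, seqlen, heads, head_dim] arrays; the shapes and in_axes values here are illustrative, not taken from the release notes:

```python
# Illustrative sketch: vmap flash_mha over an extra leading axis of q only,
# while broadcasting k and v (in_axes=None for both).
# Assumes flash_mha(q, k, v) accepts [batch, seqlen, heads, head_dim] inputs.
import jax
import jax.numpy as jnp
from flash_attn_jax import flash_mha

n, l, h, d = 2, 128, 8, 64
q = jnp.zeros((4, n, l, h, d), dtype=jnp.float16)  # extra vmapped axis on q
k = jnp.zeros((n, l, h, d), dtype=jnp.float16)
v = jnp.zeros((n, l, h, d), dtype=jnp.float16)

out = jax.vmap(flash_mha, in_axes=(0, None, None))(q, k, v)
print(out.shape)  # expected: (4, n, l, h, d)
```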

v0.1.0a3

18 Mar 14:24

Try cibuildwheel.

v0.1.0a2

18 Mar 14:22

Try cibuildwheel.

v0.1.0a1

17 Mar 14:13

Set CUDA_HOME for the sdist build.

v0.1.0

17 Mar 12:16

Try a release with GitHub Actions and the rebased repo.