Releases · nshepperd/flash_attn_jax
v0.4.1 (20 Aug 08:19)
I think we can just use pytorch's builder image, which seems trustwor…
v0.3.0 (28 Jul 04:59)
Merge branch 'cmake'.
Switches the build system to CMake, which is more flexible, and fixes up compatibility for jax 0.5.0–0.7.0.
v0.3.0a (27 Jul 07:57)
Clean up imports in flash.py, flash_sharding.py, and ring_attention.py.
v0.2.2 (24 May 10:57)
Build cuda12 version with Hopper support.
v0.2.1 (21 May 12:29)
Bump minor version to 0.2.1.
v0.2.0 (01 May 23:46)
Expanded vmap support for flash_mha. Vmapping q but not k,v reduces t…
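The note above concerns vmapping flash_mha over q while holding k and v fixed. A minimal sketch of what that usage looks like, assuming the flash_mha import and the [n, l, h, d] tensor layout described in the project README (shapes and sizes here are illustrative, and running it requires a CUDA GPU):

```python
import jax
import jax.numpy as jnp
from flash_attn_jax import flash_mha  # import path as documented in the README

# README layout: [batch n, sequence length l, heads h, head dim d].
n, l, h, d = 2, 1024, 8, 64
k = jnp.zeros([n, l, h, d], dtype=jnp.float16)
v = jnp.zeros([n, l, h, d], dtype=jnp.float16)

# Extra leading axis on q only; k and v are shared across that axis.
qs = jnp.zeros([4, n, l, h, d], dtype=jnp.float16)

# vmap over q while broadcasting k and v (in_axes=None for the shared arguments).
out = jax.vmap(flash_mha, in_axes=(0, None, None))(qs, k, v)
print(out.shape)  # (4, n, l, h, d)
```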
v0.1.0a3 (18 Mar 14:24)
v0.1.0a2 (18 Mar 14:22)
v0.1.0a1 (17 Mar 14:13)
Set CUDA_HOME for the sdist.
v0.1.0 (17 Mar 12:16)
Try a release with GitHub Actions and the rebased repo.