Releases: clementpoiret/Equimo
v0.3.5 - Fix incomplete PyPI package
Full Changelog: v0.3.3...v0.3.5
v0.3.3 - DyT and bug fixes
What's Changed
- Add support for DyT by @clementpoiret in #11
- feat(vit): allow passing kwargs to blocks by @clementpoiret in #12
Full Changelog: v0.3.0...v0.3.3
v0.3.0 - Experimental text embedding support
This release adds experimental support for text embeddings. It includes a text-compatible transformer and a tokenizer (from `tensorflow_text`), enabling workflows that combine images and text.
The first compatible model is TIPS. SigLIP2 text will come later.
What's Changed
- Experimental text embedding support by @clementpoiret in #10
Full Changelog: v0.2.4...v0.3.0
v0.2.4 - Add support for TIPS pretrained weights
What's Changed
- feat(tips): add pretrained weights by @clementpoiret in #9
Full Changelog: v0.2.3...v0.2.4
v0.2.3 - Support Pretrained SigLIP 2
What's Changed
- Add support for SigLIP 2 without NaFlex by @clementpoiret in #8
- Add support for loading torch models from timm
- Add support for serializing activation functions (required to load some SigLIP models)
- Allow passing kwargs to `load_model` to override saved metadata
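The override behavior can be illustrated with a minimal sketch: kwargs passed explicitly take precedence over the metadata stored alongside the serialized weights. The merge logic and the metadata keys below are illustrative assumptions, not Equimo's actual implementation.

```python
# Sketch of metadata override: explicitly passed kwargs win over the
# metadata saved with the model. Illustrative only; key names
# ("img_size", "act_layer") are hypothetical.
def merge_metadata(saved: dict, **overrides) -> dict:
    """Return saved metadata with any explicitly passed kwargs overriding it."""
    merged = dict(saved)
    merged.update(overrides)
    return merged

saved = {"img_size": 224, "act_layer": "gelu"}
cfg = merge_metadata(saved, img_size=384)
print(cfg)  # → {'img_size': 384, 'act_layer': 'gelu'}
```

Keys not mentioned in the kwargs keep their saved values, so only the fields you want to change need to be passed.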
Full Changelog: v0.2.2...v0.2.3
v0.2.2 - Minor bug fix
v0.2.1 - Fix Dinov2 Giant SwiGLU
Dinov2 Giant can now be loaded correctly using:

```python
from equimo.io import load_model

model = load_model(cls="vit", identifier="dinov2_vitg14_reg")
```
What's Changed
- feat(vit): add support for loading dinov2 giant by @clementpoiret in #5
Full Changelog: v0.2.0...v0.2.1
v0.2.0 - Support pretrained models
Equimo now supports converting and loading weights from PyTorch Hub!
Currently, only dinov2 can be loaded.
What's Changed
- refactor(dropout)!: inference by @clementpoiret in #3
- Add support for weight conversion from PyTorch Hub by @clementpoiret in #4
- Bug fixes
New Contributors
- @clementpoiret made their first contribution in #3
Full Changelog: v0.1.2...v0.2.0
v0.1.2 - VSSD norm fix
Full Changelog: v0.1.1...v0.1.2
v0.1.1 - Uniform `feature()` API across model classes
This release ensures all models consistently have a `feature()` method which doesn't call `self.norm` and `self.head` on `x`.
Full Changelog: v0.1.0...v0.1.1