Latest embeddings techniques (Infinite Context Length?) #14
Open
Labels: enhancement (New feature or request)
Description
Rotary embeddings, key-query embeddings, or perhaps an adapted ALiBi embedding all seem worth implementing to improve LEAP's performance. Ideally this would be exposed as an option that can be passed in the config, like BERT's position_embedding_type (link below); maybe even the same code could be reused!
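To make one of these candidates concrete: ALiBi drops learned position embeddings entirely and instead adds a head-specific linear penalty to the attention scores based on query-key distance. A minimal NumPy sketch of that bias matrix (the function name and shapes here are illustrative, not part of LEAP's codebase):

```python
import numpy as np

def alibi_bias(seq_len: int, n_heads: int) -> np.ndarray:
    """Per-head linear attention bias in the style of ALiBi.

    Returns an array of shape (n_heads, seq_len, seq_len) to be added to
    raw attention scores before softmax (together with a causal mask).
    """
    # Head-specific slopes: a geometric sequence 2^(-8/n), 2^(-16/n), ...
    slopes = 2.0 ** (-8.0 * np.arange(1, n_heads + 1) / n_heads)
    pos = np.arange(seq_len)
    # Relative offset j - i for query position i, key position j;
    # under causal masking (j <= i) this is non-positive, so the bias
    # penalizes attention to distant keys linearly in distance.
    rel = pos[None, :] - pos[:, None]                # (seq, seq)
    return slopes[:, None, None] * rel[None, :, :]   # (heads, seq, seq)
```

Because the bias is computed from positions on the fly, nothing is tied to a trained maximum length, which is what makes ALiBi attractive for length extrapolation.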
This should also enable Infinite Context Length, though constant-memory gradient computation (see #12) would be needed to fully realize it.
https://huggingface.co/docs/transformers/v4.21.1/en/model_doc/bert#transformers.BertConfig
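For the rotary option, the key property is that rotating query/key channels by position-dependent angles makes the q-k dot product depend only on relative position. A self-contained NumPy sketch (names and the split-halves layout are one common convention, not a prescription for LEAP):

```python
import numpy as np

def rotary_embed(x: np.ndarray, base: float = 10000.0) -> np.ndarray:
    """Apply rotary position embeddings to x of shape (seq_len, dim).

    Channels are paired (first half with second half) and each pair is
    rotated by an angle proportional to the token's position.
    """
    seq_len, dim = x.shape
    assert dim % 2 == 0, "rotary embeddings need an even channel count"
    half = dim // 2
    # Per-pair rotation frequencies, decaying geometrically across channels
    inv_freq = 1.0 / (base ** (np.arange(half) / half))
    angles = np.outer(np.arange(seq_len), inv_freq)  # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # 2D rotation applied independently to each (x1, x2) channel pair
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)
```

Since each pair undergoes a pure rotation, norms are preserved, and the dot product between two rotated vectors depends only on their position difference, which is the relative-position behavior the issue is after.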