Which transformer model can be used in spaCy with a token size of more than 512 to train long sentences for an NER + relation extraction problem? #9276
Hi, thanks!
By default the `transformer` component uses overlapping strided spans (see: https://spacy.io/api/transformer#span_getters), so you can train and predict on longer texts without issues on transformer models that have a fixed max length.
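If it helps, the span getter can also be configured explicitly when adding the component in Python. Here's a minimal sketch, assuming `spacy-transformers` is installed; the model name and the `window`/`stride` values are illustrative choices, not settings quoted from the discussion:

```python
# Minimal sketch: set up the transformer component with the strided span
# getter so long texts are split into overlapping windows of tokens.
# Assumes spacy and spacy-transformers are installed; "roberta-base" and
# the window/stride values below are illustrative.
import spacy

nlp = spacy.blank("en")
nlp.add_pipe(
    "transformer",
    config={
        "model": {
            "@architectures": "spacy-transformers.TransformerModel.v3",
            "name": "roberta-base",
            "get_spans": {
                # The strided span getter cuts each Doc into spans of
                # `window` tokens, starting a new span every `stride`
                # tokens, so consecutive spans overlap and texts longer
                # than the model's max length can still be processed.
                "@span_getters": "spacy-transformers.strided_spans.v1",
                "window": 128,
                "stride": 96,
            },
        }
    },
)

# nlp.initialize() would download/load the pretrained weights; after that,
# arbitrarily long Docs are handled via the overlapping spans under the hood.
```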