
Which transformer model can be used in spaCy with a token size of more than 512 to train long sentences for an NER + relation extraction problem? #9276

By default the transformer component uses overlapping strided spans (see: https://spacy.io/api/transformer#span_getters), so you can train and predict on longer texts without issues, even with transformer models that have a fixed maximum input length.
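As a concrete sketch, the strided-span behaviour can be set in the pipeline's `config.cfg` through the registered span getter (the `window`/`stride` values below are the example values from the linked docs, not necessarily what your pipeline uses):

```ini
[components.transformer.model.get_spans]
@span_getters = "spacy-transformers.strided_spans.v1"
# Each span covers up to 128 tokens...
window = 128
# ...and starts 96 tokens after the previous one, so consecutive
# spans overlap by 32 tokens and long docs are covered piecewise.
stride = 96
```

With `stride` smaller than `window`, every token falls inside at least one span, so documents longer than the model's 512-token limit are still processed end to end.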

Answer selected by svlandeg
Labels: feat / transformer