using another pre-trained BERT model #1
And what about the data? Can I use something like this: `sulfate 100130 547 554 B-Chem S-Chem`, or must the data match the CoNLL format exactly?
If you want to use your own model, you must first make sure its parameter format matches the one provided by the official BERT release; otherwise you will run into problems. I provide a url parameter in the LoadCheckpoint class for passing a model you have trained yourself. This parameter must be a link, and the data must use the same compressed format as the official BERT files. To make it easier to implement multiple BERT-based tasks within this framework, I changed the input format to "sentence \t sequence_labels", so the format you show is not supported directly. However, if you want to use a custom format, you can write your own data loading method without changing the model architecture.
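A custom loader for the "sentence \t sequence_labels" format described above could be sketched as follows. This is a minimal illustration, not the repository's actual API; the function name and file layout are assumptions.

```python
def load_tab_separated(path):
    """Read examples from a file where each line is
    'token1 token2 ...<TAB>label1 label2 ...'
    and return a list of (tokens, labels) pairs."""
    examples = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line:
                continue  # skip blank lines
            sentence, labels = line.split("\t")
            tokens = sentence.split()
            tags = labels.split()
            # the format implies one label per whitespace-separated token
            assert len(tokens) == len(tags), "token/label count mismatch"
            examples.append((tokens, tags))
    return examples
```

Replacing the data loading step with a function like this (adapted to your own column layout) lets you keep the model architecture unchanged while feeding it data in a different format.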
I'll check then. Thanks @kyzhouhzau
Hi,
I would like to train an NER model (English) with my own BERT model. Where can I do this?
Thanks