
Training using HF Transformers #156

Open
NiteshMethani opened this issue Sep 7, 2022 · 0 comments
Hi authors,
Thank you for sharing this interesting piece of work.
I have been trying this model on a custom NER dataset to compare it with other BERT variants. To that end, could you provide instructions on how to fine-tune the model on a custom NER dataset, and describe the expected dataset format?

Instructions on pre-training the base model (without any task head) on an unlabeled corpus would also be really useful. I saw some instructions for pre-training with the allennlp library, but there is some friction there. Since Hugging Face Transformers is now a fairly stable and widely popular library, I would appreciate instructions on using LUKE through HF.
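For context, here is roughly the kind of preprocessing I imagine is needed, since LUKE casts NER as entity span classification rather than per-token tagging. This is just my sketch of converting BIO-tagged data into span-level examples; the function names, span-enumeration limit, and format are my assumptions, not the authors' official pipeline:

```python
# Hypothetical sketch (my assumption of the format, not the authors' code):
# turn a BIO-tagged sentence into gold (start, end, label) spans, and
# enumerate candidate spans, since LUKE scores every span up to a max length.

def bio_to_spans(tokens, tags):
    """Extract (start, end, label) spans from BIO tags; end is exclusive."""
    spans, start, label = [], None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if start is not None:          # close the previous entity
                spans.append((start, i, label))
            start, label = i, tag[2:]
        elif tag.startswith("I-") and start is not None and tag[2:] == label:
            continue                        # entity continues
        else:                               # "O" or an inconsistent "I-" tag
            if start is not None:
                spans.append((start, i, label))
            start, label = None, None
    if start is not None:                   # entity runs to end of sentence
        spans.append((start, len(tags), label))
    return spans

def enumerate_candidate_spans(tokens, max_len=8):
    """All candidate token spans up to max_len tokens long."""
    return [(i, j) for i in range(len(tokens))
            for j in range(i + 1, min(i + max_len, len(tokens)) + 1)]

tokens = ["Beyonce", "lives", "in", "Los", "Angeles", "."]
tags = ["B-PER", "O", "O", "B-LOC", "I-LOC", "O"]
print(bio_to_spans(tokens, tags))  # [(0, 1, 'PER'), (3, 5, 'LOC')]
```

Each gold span would then get its entity label, and all other candidate spans a "no entity" label, before feeding them to the model. Confirmation of whether this matches the intended input format would be great.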
