Hello, thanks for the work!
I'm having trouble running your example to reproduce the checkpoint result using this command: python examples/ner/evaluate_transformers_checkpoint.py data/ner_conll/en/test.txt studio-ousia/luke-large-finetuned-conll-2003 --cuda-device 0
It gave me this error:
File "/usr/local/lib/python3.10/dist-packages/seqeval/scheme.py", line 55, in __init__
self.prefix = Prefixes[token[-1]] if suffix else Prefixes[token[0]]
KeyError: 'r'
I wonder what's wrong here? Possibly my version of seqeval doesn't match yours? It's not listed in your requirements.txt.
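For reference, here is a minimal snippet (not your script, just my attempt to narrow it down) that triggers the same KeyError in seqeval's strict mode when a tag's first character is not a recognized prefix, which makes me suspect a label-format issue rather than a version mismatch:

```python
# Hypothetical reproduction, not taken from the repo: seqeval's strict mode
# parses the first character of every tag as an IOB-style prefix, so any
# malformed tag (here the made-up "rg") fails exactly like in my traceback.
from seqeval.metrics import f1_score
from seqeval.scheme import IOB1

y_true = [["I-ORG", "O"]]
y_pred = [["rg", "O"]]  # first character 'r' is not a valid prefix

f1_score(y_true, y_pred, mode="strict", scheme=IOB1)  # raises KeyError: 'r'
```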
Also, could you give a fine-tuning example using Hugging Face? I'm aware of the example they provide here https://github.com/huggingface/transformers/tree/main/examples/research_projects/luke but it performs quite badly on CoNLL-2003 (0.5 F1 score).
Hi ryokan, I see it now, thanks for answering. As for the second part of my question: do you have any idea why the notebook only achieves an F1 of 0.5?
I am not familiar with that implementation, but I think some format issue is causing the bad performance.
A common pitfall is that there are multiple tagging formats for NER, such as BIO, IOB, IOB2...
If the model outputs, the evaluation data, and the evaluation script don't agree on the format, it can lead to unexpected results.
For example, our script assumes the IOB1 format by default.
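As a rough sketch (this is not the code our evaluation script uses, and the label sequences are made up), one way to avoid such a mismatch is to convert IOB1 gold tags to IOB2 before scoring them against IOB2-style predictions:

```python
# Minimal sketch: normalize IOB1 gold tags (as in the raw CoNLL-2003 files)
# to IOB2 before comparing them with IOB2-style model predictions.
from seqeval.metrics import f1_score
from seqeval.scheme import IOB2

def iob1_to_iob2(tags):
    """Rewrite IOB1 tags so that every entity starts with B-."""
    fixed, prev = [], "O"
    for tag in tags:
        # In IOB1 an entity may start with I-; IOB2 marks that position with B-.
        if tag.startswith("I-") and (prev == "O" or prev[2:] != tag[2:]):
            tag = "B-" + tag[2:]
        fixed.append(tag)
        prev = tag
    return fixed

gold_iob1 = ["I-PER", "I-PER", "O", "I-LOC"]   # hypothetical gold labels in IOB1
pred_iob2 = ["B-PER", "I-PER", "O", "B-LOC"]   # hypothetical predictions in IOB2

# Without the conversion, strict IOB2 evaluation finds no gold entities;
# after it, both sides agree and the score should be 1.0.
print(f1_score([iob1_to_iob2(gold_iob1)], [pred_iob2], mode="strict", scheme=IOB2))
```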