
Need help with roberta-large configuration on NER task #80

Open
KalmanWang opened this issue Dec 19, 2024 · 0 comments
I can reproduce the result on CoNLL04 with bert-large-uncased, where the F1 reaches 84.5%. However, it only reaches about 3% when I run the same task with roberta-large, and I can't figure out the cause on my own.
Could you provide any ideas about this unexpected result? Thanks in advance for your help.
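For context, here is a minimal sketch of the kind of setup I'm describing. It assumes the project loads models through Hugging Face transformers; the exact loading code and label count in this repo may differ, so treat the names below as illustrative only:

```python
# Illustrative sketch only (assumes Hugging Face transformers; num_labels is a placeholder).
from transformers import AutoTokenizer, AutoModelForTokenClassification

# RoBERTa uses a byte-level BPE tokenizer: when feeding pre-tokenized NER input,
# add_prefix_space=True is required, otherwise word/sub-word alignment breaks.
tokenizer = AutoTokenizer.from_pretrained("roberta-large", add_prefix_space=True)

# Unlike bert-large-uncased, roberta-large is cased and does not use token_type_ids.
model = AutoModelForTokenClassification.from_pretrained("roberta-large", num_labels=9)
```

If swapping the model name is all I should need to change, then a tokenizer/label-alignment mismatch like the one flagged above is my best guess for the near-zero F1.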
