
heyLinsir/LM-for-CommonsenseReasoning


LM-for-CommonsenseReasoning

Use generation probability of pre-trained language models to model commonsense reasoning.

Unofficial implementation of the paper Pre-training Is (Almost) All You Need: An Application to Commonsense Reasoning.
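The paper scores each COPA candidate by how probable the premise is under a masked LM conditioned on that candidate: each premise token is masked in turn, its log-probability is computed, and the per-token log-probabilities are summed. A minimal sketch of that scoring loop, where `masked_token_logprob` is a hypothetical callable (in practice backed by a masked LM such as RoBERTa-large) and not part of this repository's actual API:

```python
def premise_lm_score(premise_tokens, candidate_tokens, masked_token_logprob):
    """Sum of log P(premise_tokens[i] | candidate, premise with token i masked).

    `masked_token_logprob(candidate_tokens, premise_tokens, i)` is assumed to
    return the masked-LM log-probability of the i-th premise token; in the
    1-gram setting, exactly one token is masked at a time.
    """
    return sum(
        masked_token_logprob(candidate_tokens, premise_tokens, i)
        for i in range(len(premise_tokens))
    )


def pick_answer(premise_tokens, candidates, masked_token_logprob):
    # Choose the candidate under which the premise is most probable.
    scores = [
        premise_lm_score(premise_tokens, c, masked_token_logprob)
        for c in candidates
    ]
    return max(range(len(candidates)), key=lambda i: scores[i])
```

Because the score needs no task-specific head, this selection rule can be applied zero-shot, which is what the "No finetuning" result below measures.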

Accuracy on the COPA test set (1-gram premise-LM, RoBERTa-large):

- No fine-tuning: 75.0
- Fine-tuned with margin loss (margin = 0.5): 91.6
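The fine-tuning objective is a margin (hinge) loss over the candidate scores: the correct candidate's score must exceed the incorrect one's by at least the margin. A minimal sketch of that loss, with `margin_loss` a hypothetical helper name rather than a function from this repository:

```python
def margin_loss(score_correct, score_wrong, margin=0.5):
    """Hinge-style margin loss on a pair of candidate scores.

    Zero when the correct candidate already beats the wrong one by at
    least `margin`; otherwise penalizes the shortfall. margin=0.5
    matches the setting reported for the 91.6 result.
    """
    return max(0.0, margin - score_correct + score_wrong)
```

In training, this loss would be averaged over COPA training pairs and backpropagated through the masked-LM scores.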
