
A Generative Model for Joint Natural Language Understanding and Generation

This repository contains the source code of the paper "A Generative Model for Joint Natural Language Understanding and Generation", published at ACL 2020.

@inproceedings{tseng2020generative,
  title={A Generative Model for Joint Natural Language Understanding and Generation},
  author={Tseng, Bo-Hsiang and Cheng, Jianpeng and Fang, Yimai and Vandyke, David},
  booktitle={Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},
  pages={1795--1807},
  year={2020}
}

Requirements

python 3
torch 1.1.0
numpy 1.13.3
nltk 3.4.5
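
For convenience, the pinned versions above can be collected in a requirements file (a sketch; the repository itself may not ship one):

```
torch==1.1.0
numpy==1.13.3
nltk==3.4.5
```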

Data

The two datasets used in the paper are in the data/ folder, each provided with different amounts of training data.

Training

Before training, please create the folders with the following command:

bash create_folders.sh

To train the model, use the script train.sh with the following command:

bash train.sh $dataset $data_ratio $batch_size $model_dimension $seed
  • dataset: e2e or weather
  • data ratio: 5 / 10 / 25 / 50 / 100
  • batch size: 32 / 64
  • model dimension: 150 / 300
  • seed: random seed
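
For example, the positional arguments could be assembled as follows (the concrete values are one choice from the documented options; the seed of 0 is arbitrary). This snippet prints the resulting command so the argument order can be checked before running it:

```shell
# One possible train.sh invocation: E2E dataset, 100% of the training
# data, batch size 64, model dimension 300, and an arbitrary seed of 0.
dataset=e2e
data_ratio=100
batch_size=64
model_dimension=300
seed=0
echo bash train.sh $dataset $data_ratio $batch_size $model_dimension $seed
```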

Testing

To test the model, use the script test.sh with the following command:

bash test.sh $dataset $data_ratio $batch_size $model_dimension $model_name
  • dataset: e2e or weather
  • data ratio: 5 / 10 / 25 / 50 / 100
  • batch size: 32 / 64
  • model dimension: 150 / 300
  • model name: name of a trained model
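
A test invocation follows the same pattern; here "my_model" is a hypothetical placeholder, and you would substitute the name of a model you actually trained. The snippet prints the assembled command:

```shell
# One possible test.sh invocation mirroring the training options above.
# "my_model" is a placeholder for the name of a trained model.
dataset=e2e
data_ratio=100
batch_size=64
model_dimension=300
model_name=my_model
echo bash test.sh $dataset $data_ratio $batch_size $model_dimension $model_name
```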
