- Using Semantic Similarity as Reward for Reinforcement Learning in Sentence Generation. ACL 2019. [PDF]
- Towards Generating Long and Coherent Text with Multi-Level Latent Variable Models. ACL 2019. [PDF]
- Sentence-Level Content Planning and Style Specification for Neural Text Generation. EMNLP 2019. [PDF]
- Denoising-based Sequence-to-Sequence Pre-training for Text Generation. EMNLP 2019. [PDF]
- A Graph-to-Sequence Model for AMR-to-Text Generation. ACL 2018. [PDF]
- Controlling Global Statistics in Recurrent Neural Network Text Generation. AAAI 2018. [PDF]
- Long Text Generation via Adversarial Training with Leaked Information. AAAI 2018. [PDF]
- Order-Planning Neural Text Generation From Structured Data. AAAI 2018. [PDF]
- Table-to-text Generation by Structure-aware Seq2seq Learning. AAAI 2018. [PDF]
- Diverse and Coherent Paragraph Generation from Images. ECCV 2018. [PDF]
- Differentiated Distribution Recovery for Neural Text Generation. AAAI 2019. [PDF]
- Enhancing Variational Autoencoders with Mutual Information Neural Estimation for Text Generation. ACL 2019. [PDF]
- Data-to-Text Generation with Content Selection and Planning. AAAI 2019. [PDF]
- Hierarchical Encoder with Auxiliary Supervision for Table-to-text Generation: Learning Better Representation for Tables. AAAI 2019. [PDF]
- ParaBank: Monolingual Bitext Generation and Sentential Paraphrasing via Lexically-constrained Neural Machine Translation. AAAI 2019. [PDF]
- A Topic Augmented Text Generation Model: Joint Learning of Semantics and Structural Features. EMNLP 2019. [PDF]
- ARAML: A Stable Adversarial Training Framework for Text Generation. EMNLP 2019. [PDF]
- Deep Copycat Networks for Text-to-Text Generation. EMNLP 2019. [PDF]
- Enhancing AMR-to-Text Generation with Dual Graph Representations. EMNLP 2019. [PDF]
- Enhancing Neural Data-To-Text Generation Models with External Background Knowledge. EMNLP 2019. [PDF]
- Implicit Deep Latent Variable Models for Text Generation. EMNLP 2019. [PDF]
- Long and Diverse Text Generation with Planning-based Hierarchical Variational Model. EMNLP 2019. [PDF]
- Modeling Graph Structure in Transformer for Better AMR-to-Text Generation. EMNLP 2019. [PDF]
- MoverScore: Text Generation Evaluating with Contextualized Embeddings and Earth Mover Distance. EMNLP 2019. [PDF]
- Neural data-to-text generation: A comparison between pipeline and end-to-end architectures. EMNLP 2019. [PDF]
- Select and Attend: Towards Controllable Content Selection in Text Generation. EMNLP 2019. [PDF]
- Table-to-Text Generation with Effective Hierarchical Encoder on Three Dimensions (Row, Column and Time). EMNLP 2019. [PDF]
- Autoregressive Text Generation beyond Feedback Loops. EMNLP 2019. [PDF]
- A Cross-Domain Transferable Neural Coherence Model. ACL 2019. [PDF]
- Sentence Mover's Similarity: Automatic Evaluation for Multi-Sentence Texts. ACL 2019. [PDF] (see the earth-mover sketch after this list)
- Discourse-Aware Neural Rewards for Coherent Text Generation. NAACL 2018. In this paper, we investigate the use of discourse-aware rewards with reinforcement learning to guide a model to generate long, coherent text (see the REINFORCE sketch at the end of this section).
- Neural Text Generation in Stories Using Entity Representations as Context. NAACL 2018. We introduce an approach to neural text generation that explicitly represents entities mentioned in the text.
- A Deep Ensemble Model with Slot Alignment for Sequence-to-Sequence Natural Language Generation. NAACL 2018. We describe an ensemble neural language generator and present several novel methods for data representation and augmentation that yield improved results in our model.
- Natural Answer Generation with Heterogeneous Memory. NAACL 2018. In this work, we propose a novel attention mechanism to encourage the decoder to actively interact with the memory by taking its heterogeneity into account.
- Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation. NAACL 2018. We propose a paraphrase generation model that produces words by querying distributed word representations (i.e., neural word embeddings).
- Zero-Shot Question Generation from Knowledge Graphs for Unseen Predicates and Entity Types. NAACL 2018. We present a neural model for question generation from knowledge graph triples in a "zero-shot" setup, i.e., generating questions for predicates, subject types, or object types that were not seen at training time.
- What's This Movie About? A Joint Neural Network Architecture for Movie Content Analysis. NAACL 2018. We present a novel end-to-end model for overview generation, consisting of a multi-label encoder for identifying screenplay attributes and an LSTM decoder that generates natural language sentences conditioned on the identified attributes. We create a dataset of movie scripts, attribute-value pairs for the movies' aspects, and overviews extracted from an online database.
- Interpretable Charge Predictions for Criminal Cases: Learning to Generate Court Views from Fact Descriptions. NAACL 2018. In this paper, we study the problem of court view generation from the fact description in a criminal case.
- Adversarial Example Generation with Syntactically Controlled Paraphrase Networks. NAACL 2018. We propose syntactically controlled paraphrase networks (SCPNs) and use them to generate adversarial examples.
- Dialog Generation Using Multi-Turn Reasoning Neural Networks. NAACL 2018. We propose a generalizable dialog generation approach that adapts multi-turn reasoning, a recent advance in document comprehension, to generate responses ("answers") by treating the current conversation context as a "document" and the current query as a "question".
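Both MoverScore and Sentence Mover's Similarity above score a candidate text by the cost of transporting its embedded tokens onto a reference's tokens. The snippet below is a minimal sketch of that earth-mover idea using a relaxed word mover's distance; the random vectors and the nearest-neighbor relaxation are illustrative assumptions, not either paper's exact formulation (both operate on contextualized embeddings with additional weighting and pooling).

```python
# Toy illustration of the earth-mover idea behind MoverScore and
# Sentence Mover's Similarity. Random vectors stand in for the
# contextualized token embeddings the papers actually use.
import numpy as np

def relaxed_word_mover_distance(x: np.ndarray, y: np.ndarray) -> float:
    """Relaxed EMD between embedding sets of shape (n, d) and (m, d).

    Each token sends all of its mass to its nearest counterpart;
    taking the max over both directions lower-bounds the exact EMD.
    """
    cost = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)  # (n, m)
    x_to_y = cost.min(axis=1).mean()  # each x token to its cheapest y token
    y_to_x = cost.min(axis=0).mean()  # each y token to its cheapest x token
    return max(x_to_y, y_to_x)

rng = np.random.default_rng(0)
reference = rng.normal(size=(7, 16))                     # 7 tokens, 16 dims
paraphrase = reference + 0.1 * rng.normal(size=(7, 16))  # near-copy
unrelated = rng.normal(size=(5, 16))
print(relaxed_word_mover_distance(reference, paraphrase))  # small distance
print(relaxed_word_mover_distance(reference, unrelated))   # larger distance
```

The relaxation avoids solving the full transport linear program while still rewarding candidates whose tokens all lie near some reference token.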
NAACL 2018 short papers:
- Automatic Dialogue Generation with Expressed Emotions. NAACL 2018. In this research, we address the problem of forcing dialogue generation to express a given emotion.
- Guiding Generation for Abstractive Text Summarization Based on Key Information Guide Network. NAACL 2018. We propose a guiding generation model that combines the extractive method and the abstractive method.
- Natural Language Generation by Hierarchical Decoding with Linguistic Patterns. NAACL 2018. This paper introduces a hierarchical decoding NLG model based on linguistic patterns at different levels and shows that the proposed method outperforms the traditional one with a smaller model size.
- RankME: Reliable Human Ratings for Natural Language Generation. NAACL 2018.
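The discourse-aware rewards entry above optimizes a generator against a sequence-level score rather than per-token likelihood. Below is a minimal REINFORCE sketch of that training pattern, assuming a hypothetical `coherence_reward` function as a stand-in for the paper's learned discourse-aware scorer; the toy GRU generator and the repetition-based reward are illustrative only, not the paper's architecture or reward model.

```python
# Minimal REINFORCE loop: sample a sequence, score it with a
# sequence-level reward, and reinforce the sampled tokens.
import torch

torch.manual_seed(0)
vocab, hidden, max_len = 50, 32, 8
embed = torch.nn.Embedding(vocab, hidden)
rnn = torch.nn.GRUCell(hidden, hidden)
head = torch.nn.Linear(hidden, vocab)
params = [*embed.parameters(), *rnn.parameters(), *head.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)

def coherence_reward(tokens: list[int]) -> float:
    # Hypothetical stand-in for a learned discourse-aware scorer:
    # here we simply penalize immediate token repetition.
    repeats = sum(a == b for a, b in zip(tokens, tokens[1:]))
    return 1.0 - repeats / max(len(tokens) - 1, 1)

def reinforce_step() -> float:
    h = torch.zeros(1, hidden)
    tok = torch.zeros(1, dtype=torch.long)  # fixed start token
    log_probs, tokens = [], []
    for _ in range(max_len):
        h = rnn(embed(tok), h)
        dist = torch.distributions.Categorical(logits=head(h))
        tok = dist.sample()
        log_probs.append(dist.log_prob(tok))
        tokens.append(int(tok))
    reward = coherence_reward(tokens)
    # REINFORCE: weight the sequence log-likelihood by the scalar reward.
    loss = -reward * torch.stack(log_probs).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return reward

for step in range(3):
    print(f"step {step}: reward = {reinforce_step():.3f}")
```

In practice this estimator is high-variance, so such training typically subtracts a baseline from the reward and mixes the RL objective with maximum-likelihood training.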