eaglenlp/Text-Generation

Text Generation

ACL 2019

Using Semantic Similarity as Reward for Reinforcement Learning in Sentence Generation. ACL 2019. [PDF]
Towards Generating Long and Coherent Text with Multi-Level Latent Variable Models. ACL 2019. [PDF]
A Cross-Domain Transferable Neural Coherence Model. ACL 2019. [PDF]
Sentence Mover's Similarity: Automatic Evaluation for Multi-Sentence Texts. ACL 2019. [PDF]
Enhancing Variational Autoencoders with Mutual Information Neural Estimation for Text Generation. ACL 2019. [PDF]

ACL 2018

A Graph-to-Sequence Model for AMR-to-Text Generation. ACL 2018. [PDF]

EMNLP 2019

Sentence-Level Content Planning and Style Specification for Neural Text Generation. EMNLP 2019. [PDF]
Denoising-based Sequence-to-Sequence Pre-training for Text Generation. EMNLP 2019. [PDF]
A Topic Augmented Text Generation Model: Joint Learning of Semantics and Structural Features. EMNLP 2019. [PDF]
ARAML: A Stable Adversarial Training Framework for Text Generation. EMNLP 2019. [PDF]
Deep Copycat Networks for Text-to-Text Generation. EMNLP 2019. [PDF]
Enhancing AMR-to-Text Generation with Dual Graph Representations. EMNLP 2019. [PDF]
Enhancing Neural Data-To-Text Generation Models with External Background Knowledge. EMNLP 2019. [PDF]
Implicit Deep Latent Variable Models for Text Generation. EMNLP 2019. [PDF]
Long and Diverse Text Generation with Planning-based Hierarchical Variational Model. EMNLP 2019. [PDF]
Modeling Graph Structure in Transformer for Better AMR-to-Text Generation. EMNLP 2019. [PDF]
MoverScore: Text Generation Evaluating with Contextualized Embeddings and Earth Mover Distance. EMNLP 2019. [PDF]
Neural data-to-text generation: A comparison between pipeline and end-to-end architectures. EMNLP 2019. [PDF]
Select and Attend: Towards Controllable Content Selection in Text Generation. EMNLP 2019. [PDF]
Table-to-Text Generation with Effective Hierarchical Encoder on Three dimensions (Row, Column and Time). EMNLP 2019. [PDF]
Autoregressive Text Generation beyond Feedback Loops. EMNLP 2019. [PDF]
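Several of the evaluation papers listed above (e.g. MoverScore and Sentence Mover's Similarity) score generated text with an earth-mover-style distance over embeddings. A minimal sketch of the underlying idea, using the relaxed (nearest-neighbor) lower bound on the transport cost and random stand-in vectors; the actual metrics use contextualized embeddings and exact or weighted transport:

```python
import numpy as np

def relaxed_mover_distance(emb_a, emb_b):
    """Relaxed earth mover distance between two sets of word embeddings:
    each word in A moves all its mass to its nearest word in B (a lower
    bound on the exact transport cost), symmetrized over both directions."""
    # Pairwise Euclidean distances between the two embedding sets.
    diff = emb_a[:, None, :] - emb_b[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    # Each word picks its cheapest match in the other sentence.
    a_to_b = dist.min(axis=1).mean()
    b_to_a = dist.min(axis=0).mean()
    return max(a_to_b, b_to_a)

# Random stand-in embeddings for a toy vocabulary (hypothetical).
rng = np.random.default_rng(0)
vocab = {w: rng.normal(size=8) for w in ["the", "cat", "sat", "dog", "ran"]}

def embed(sentence):
    return np.stack([vocab[w] for w in sentence.split()])

same = relaxed_mover_distance(embed("the cat sat"), embed("the cat sat"))
other = relaxed_mover_distance(embed("the cat sat"), embed("the dog ran"))
print(same, other)  # identical sentences score 0; different ones score > 0
```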

AAAI 2018

Controlling Global Statistics in Recurrent Neural Network Text Generation. AAAI 2018. [PDF]
Long Text Generation via Adversarial Training with Leaked Information. AAAI 2018. [PDF]
Order-Planning Neural Text Generation From Structured Data. AAAI 2018. [PDF]
Table-to-text Generation by Structure-aware Seq2seq Learning. AAAI 2018. [PDF]

ECCV 2018

Diverse and Coherent Paragraph Generation from Images. ECCV 2018. [PDF]

AAAI 2019

Differentiated Distribution Recovery for Neural Text Generation. AAAI 2019. [PDF]
Data-to-Text Generation with Content Selection and Planning. AAAI 2019. [PDF]
Hierarchical Encoder with Auxiliary Supervision for Table-to-text Generation: Learning Better Representation for Tables. AAAI 2019. [PDF]

ParaBank: Monolingual Bitext Generation and Sentential Paraphrasing via Lexically-constrained Neural Machine Translation. AAAI 2019. [PDF]

NAACL 2018 Long Papers

Discourse-Aware Neural Rewards for Coherent Text Generation. [PDF]
In this paper, we investigate the use of discourse-aware rewards with reinforcement learning to guide a model to generate long, coherent text.
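Reward-guided setups like the one above typically use a REINFORCE-style policy gradient: sample an output from the model, score it with a sequence-level reward (coherence, similarity, etc.), and scale the log-likelihood gradient by that reward. A toy one-step numpy sketch with a hypothetical reward function (a real system would score full sampled sequences with a learned discourse reward):

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = 4
logits = np.zeros(VOCAB)  # one-step categorical "policy" over 4 tokens

def reward(token):
    # Hypothetical sequence-level reward: token 2 is the "coherent" choice.
    return 1.0 if token == 2 else 0.0

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

baseline = 0.0
for step in range(2000):
    probs = softmax(logits)
    token = rng.choice(VOCAB, p=probs)
    r = reward(token)
    # REINFORCE update with a moving-average baseline to reduce variance:
    # grad of log p(token) w.r.t. logits is one_hot(token) - probs.
    grad = (np.eye(VOCAB)[token] - probs) * (r - baseline)
    logits += 0.1 * grad
    baseline = 0.9 * baseline + 0.1 * r

print(softmax(logits).argmax())  # the policy concentrates on the rewarded token
```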

Neural Text Generation in Stories Using Entity Representations as Context. [PDF]
We introduce an approach to neural text generation that explicitly represents entities mentioned in the text.

A Deep Ensemble Model with Slot Alignment for Sequence-to-Sequence Natural Language Generation. [PDF]
We describe an ensemble neural language generator, and present several novel methods for data representation and augmentation that yield improved results in our model.

Natural Answer Generation with Heterogeneous Memory. [PDF]
In this work, we propose a novel attention mechanism to encourage the decoder to actively interact with the memory by taking its heterogeneity into account.

Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation. [PDF]
We introduce the Word Embedding Attention Network (WEAN), an encoder-decoder model that generates words by querying distributed word representations (i.e., neural word embeddings), aiming to capture the meaning of the generated words.

Zero-Shot Question Generation from Knowledge Graphs for Unseen Predicates and Entity Types. [PDF]
We present a neural model for question generation from knowledge graphs triples in a “Zero-shot” setup, that is generating questions for predicate, subject types or object types that were not seen at training time.

What’s This Movie About? A Joint Neural Network Architecture for Movie Content Analysis. [PDF]
We present a novel end-to-end model for overview generation, consisting of a multi-label encoder for identifying screenplay attributes, and an LSTM decoder to generate natural language sentences conditioned on the identified attributes. We create a dataset that consists of movie scripts, attribute-value pairs for the movies’ aspects, as well as overviews, which we extract from an online database.

Interpretable Charge Predictions for Criminal Cases: Learning to Generate Court Views from Fact Descriptions. [PDF]
In this paper, we propose to study the problem of court view generation from the fact description in a criminal case.

Adversarial Example Generation with Syntactically Controlled Paraphrase Networks. [PDF]
We propose syntactically controlled paraphrase networks (SCPNs) and use them to generate adversarial examples.

Dialog Generation Using Multi-Turn Reasoning Neural Networks. [PDF]
In this paper, we propose a generalizable dialog generation approach that adapts multi-turn reasoning, one recent advancement in the field of document comprehension, to generate responses (“answers”) by taking current conversation session context as a “document” and current query as a “question”.

NAACL 2018 Short Papers

Automatic Dialogue Generation with Expressed Emotions. [PDF]
In this research, we address the problem of forcing the dialogue generation to express emotion.

Guiding Generation for Abstractive Text Summarization Based on Key Information Guide Network. [PDF]
We propose a guiding generation model that combines the extractive method and the abstractive method.

Natural Language Generation by Hierarchical Decoding with Linguistic Patterns. [PDF]
This paper introduces a hierarchical decoding NLG model based on linguistic patterns in different levels, and shows that the proposed method outperforms the traditional one with a smaller model size.

RankME: Reliable Human Ratings for Natural Language Generation. [PDF]
We present a novel rank-based magnitude estimation method (RankME), which combines the use of continuous scales and relative assessments.

Identifying the Most Dominant Event in a News Article by Mining Event Coreference Relations. [PDF]
Identifying the most dominant and central event of a document, which governs and connects other foreground and background events in the document, is useful for many applications, such as text summarization, storyline generation and text segmentation.

Leveraging Context Information for Natural Question Generation. [PDF]
We propose a model that matches the answer with the passage before generating the question.

TypeSQL: Knowledge-Based Type-Aware Neural Text-to-SQL Generation. [PDF]
In this paper, we present a novel approach TypeSQL which formats the problem as a slot filling task in a more reasonable way.
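The slot-filling formulation treats SQL synthesis as predicting a small fixed set of slots (aggregation, SELECT column, WHERE condition) and assembling the query from them, rather than decoding free-form SQL token by token. A minimal illustration with a toy schema, an assumed table name, and naive keyword-based slot predictors (the paper itself fills these slots with type-aware neural models):

```python
# Toy slot-filling view of text-to-SQL: predict fixed slots, then
# assemble the query, instead of decoding it token by token.
SCHEMA = ["name", "age", "city"]  # hypothetical table "people"

AGG_KEYWORDS = {"how many": "COUNT", "average": "AVG", "oldest": "MAX"}

def fill_slots(question, schema):
    q = question.lower()
    # Aggregation slot: matched by surface keywords here.
    agg = next((a for k, a in AGG_KEYWORDS.items() if k in q), "")
    # SELECT-column slot: naive substring match against column names.
    select_col = next((c for c in schema if c in q), schema[0])
    # WHERE slot: a single condition if an "in <value>" pattern appears.
    where = ""
    if " in " in q:
        value = q.split(" in ", 1)[1].strip(" ?")
        where = f" WHERE city = '{value}'"
    col = f"{agg}({select_col})" if agg else select_col
    return f"SELECT {col} FROM people{where}"

print(fill_slots("What is the average age of people in Berlin?", SCHEMA))
# SELECT AVG(age) FROM people WHERE city = 'berlin'
```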

Learning to Generate Wikipedia Summaries for Underserved Languages from Wikidata. [PDF]
In this work, we investigate the generation of open domain Wikipedia summaries in underserved languages using structured data from Wikidata.

Natural Language to Structured Query Generation via Meta-Learning. [PDF]
In this work, we explore a different learning protocol that treats each example as a unique pseudo-task, by reducing the original learning problem to a few-shot meta-learning scenario with the help of a domain-dependent relevance function.
