eaglenlp/Text-Generation

Text Generation

ACL 2019

Using Semantic Similarity as Reward for Reinforcement Learning in Sentence Generation. ACL 2019. [PDF]
Towards Generating Long and Coherent Text with Multi-Level Latent Variable Models. ACL 2019. [PDF]
A Cross-Domain Transferable Neural Coherence Model. ACL 2019. [PDF]
Sentence Mover's Similarity: Automatic Evaluation for Multi-Sentence Texts. ACL 2019. [PDF]
Enhancing Variational Autoencoders with Mutual Information Neural Estimation for Text Generation. ACL 2019. [PDF]

ACL 2018

A Graph-to-Sequence Model for AMR-to-Text Generation. ACL 2018. [PDF]

EMNLP 2019

Sentence-Level Content Planning and Style Specification for Neural Text Generation. EMNLP 2019. [PDF]
Denoising-based Sequence-to-Sequence Pre-training for Text Generation. EMNLP 2019. [PDF]
A Topic Augmented Text Generation Model: Joint Learning of Semantics and Structural Features. EMNLP 2019. [PDF]
ARAML: A Stable Adversarial Training Framework for Text Generation. EMNLP 2019. [PDF]
Deep Copycat Networks for Text-to-Text Generation. EMNLP 2019. [PDF]
Enhancing AMR-to-Text Generation with Dual Graph Representations. EMNLP 2019. [PDF]
Enhancing Neural Data-To-Text Generation Models with External Background Knowledge. EMNLP 2019. [PDF]
Implicit Deep Latent Variable Models for Text Generation. EMNLP 2019. [PDF]
Long and Diverse Text Generation with Planning-based Hierarchical Variational Model. EMNLP 2019. [PDF]
Modeling Graph Structure in Transformer for Better AMR-to-Text Generation. EMNLP 2019. [PDF]
MoverScore: Text Generation Evaluating with Contextualized Embeddings and Earth Mover Distance. EMNLP 2019. [PDF]
Neural data-to-text generation: A comparison between pipeline and end-to-end architectures. EMNLP 2019. [PDF]
Select and Attend: Towards Controllable Content Selection in Text Generation. EMNLP 2019. [PDF]
Table-to-Text Generation with Effective Hierarchical Encoder on Three dimensions (Row, Column and Time). EMNLP 2019. [PDF]
Autoregressive Text Generation beyond Feedback Loops. EMNLP 2019. [PDF]
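The MoverScore entry above evaluates generated text via an earth mover distance over contextualized embeddings. A toy sketch of the earth-mover idea, where random vectors stand in for real contextualized embeddings and a cheapest one-to-one matching stands in for the full transport problem (this is an illustration, not the paper's implementation):

```python
import itertools
import math
import random

def mover_distance(a, b):
    """Earth-mover-style distance between two equal-length lists of vectors,
    computed exactly as the cheapest one-to-one matching (fine for tiny inputs)."""
    def dist(u, v):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(u, v)))
    best = min(
        sum(dist(a[i], b[j]) for i, j in enumerate(perm))
        for perm in itertools.permutations(range(len(b)))
    )
    return best / len(a)

random.seed(0)
ref = [[random.gauss(0, 1) for _ in range(8)] for _ in range(4)]    # reference tokens
close = [[x + 0.01 * random.gauss(0, 1) for x in v] for v in ref]   # near-identical hypothesis
far = [[random.gauss(0, 1) for _ in range(8)] for _ in range(4)]    # unrelated hypothesis

print(mover_distance(ref, close) < mover_distance(ref, far))  # True
```

A hypothesis whose embeddings nearly coincide with the reference gets a much smaller distance than an unrelated one, which is the property the metric exploits.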

AAAI 2020

Attractive or Faithful? Popularity-Reinforced Learning for Inspired Headline Generation. [PDF]
Learning to Compare for Better Training and Evaluation of Open Domain Text Generation Models. [PDF]
An Iterative Polishing Framework based on Quality Aware Masked Language Model for Chinese Poetry Generation. [PDF]
Rank3DGAN: Semantic mesh generation using relative attributes. [PDF]
Label Error Correction and Generation Through Label Relationships. [PDF]
On the Generation of Medical Question-Answer Pairs. [PDF]
Improving Question Generation with Sentence-level Semantic Matching and Answer Position Inferring. [PDF]
Sentence Generation for Entity Description with Content-plan Attention. [PDF]
Recurrent Nested Model for Sequence Generation. [PDF]
Active Learning with Query Generation for Cost-Effective Text Classification. [PDF]
Conclusion-Supplement Answer Generation for Non-Factoid Questions. [PDF]
Learning from Easy to Complex: Adaptive Multi-curricula Learning for Neural Dialogue Generation. [PDF]
Improving Knowledge-aware Dialogue Generation via Knowledge Base Question Answering. [PDF]
MixPoet: Diverse Poetry Generation via Learning Controllable Mixed Latent Space. [PDF]
CatGAN: Category-aware Generative Adversarial Networks with Hierarchical Evolutionary Learning for Category Text Generation. [PDF]
Structure Learning for Headline Generation. [PDF]
Joint Learning of Answer Selection and Answer Summary Generation in Community Question Answering. [PDF]
A Pre-training Based Personalized Dialogue Generation Model with Persona-sparse Data. [PDF]
Automatic Generation of Headlines for Online Math Questions. [PDF]
A Character-Centric Neural Model for Automated Story Generation. [PDF]
A Dataset for Low-Resource Stylized Sequence-to-Sequence Generation. [PDF]
Complementary Auxiliary Classifiers for Label-Conditional Text Generation. [PDF]
TreeGen: A Tree-Based Transformer Architecture for Code Generation. [PDF]
Cross-Lingual Natural Language Generation via Pre-Training. [PDF]
Neural Question Generation with Answer Pivot. [PDF]
Open Domain Event Text Generation. [PDF]
Joint Parsing and Generation for Abstractive Summarization. [PDF]
Capturing Greater Context for Question Generation. [PDF]
A Meta Cooperative Training Paradigm for Improving Adversarial Text Generation. [PDF]
Sequence Generation with Optimal-Transport-Enhanced Reinforcement Learning. [PDF]
MALA: Cross-Domain Dialogue Generation with Action Learning. [PDF]

AAAI 2019

Differentiated Distribution Recovery for Neural Text Generation. AAAI 2019. [PDF]
Data-to-Text Generation with Content Selection and Planning. AAAI 2019. [PDF]
Hierarchical Encoder with Auxiliary Supervision for Table-to-text Generation: Learning Better Representation for Tables. AAAI 2019. [PDF]
ParaBank: Monolingual Bitext Generation and Sentential Paraphrasing via Lexically-constrained Neural Machine Translation. AAAI 2019. [PDF]

AAAI 2018

Controlling Global Statistics in Recurrent Neural Network Text Generation. AAAI 2018. [PDF]
Long Text Generation via Adversarial Training with Leaked Information. AAAI 2018. [PDF]
Order-Planning Neural Text Generation From Structured Data. AAAI 2018. [PDF]
Table-to-text Generation by Structure-aware Seq2seq Learning. AAAI 2018. [PDF]

ECCV 2018

Diverse and Coherent Paragraph Generation from Images. ECCV 2018. [PDF]

NAACL 2019

Topic-Guided Variational Auto-Encoder for Text Generation. [PDF]
We propose a topic-guided variational auto-encoder (TGVAE) model for text generation.

Keyphrase Generation: A Text Summarization Struggle. [PDF]
In this paper, we explore the possibility of considering the keyphrase string as an abstractive summary of the title and the abstract. First, we collect, process and release a large dataset of scientific paper metadata that contains 2.2 million records.

Jointly Optimizing Diversity and Relevance in Neural Response Generation. [PDF]
In this paper, we propose a SpaceFusion model to jointly optimize diversity and relevance that essentially fuses the latent space of a sequence-to-sequence model and that of an autoencoder model by leveraging novel regularization terms.

Improving Human Text Comprehension through Semi-Markov CRF-based Neural Section Title Generation. [PDF]
In particular, we present an extractive pipeline for section title generation by first selecting the most salient sentence and then applying deletion-based compression.

Unifying Human and Statistical Evaluation for Natural Language Generation. [PDF]
In this paper, we propose a unified framework which evaluates both diversity and quality, based on the optimal error rate of predicting whether a sentence is human- or machine-generated.
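The optimal-error-rate idea above can be illustrated on discrete distributions: with equal priors, the best possible classifier for whether a sample came from the human distribution p or the model distribution q errs with probability (1 - TV(p, q)) / 2, where TV is total variation distance. A minimal sketch with toy distributions (not the paper's estimator):

```python
def optimal_error_rate(p, q):
    """Bayes error of deciding whether one sample came from p or q (equal
    priors): the optimal rule picks the source with more mass on the outcome,
    so it is wrong with probability min(p(x), q(x)) / 2 on each outcome x."""
    support = set(p) | set(q)
    return 0.5 * sum(min(p.get(x, 0.0), q.get(x, 0.0)) for x in support)

human = {"a": 0.5, "b": 0.3, "c": 0.2}  # toy "human" sentence distribution
model = {"a": 0.5, "b": 0.1, "c": 0.4}  # toy "machine" sentence distribution

print(round(optimal_error_rate(human, human), 6))  # 0.5: identical -> chance level
print(round(optimal_error_rate(human, model), 6))  # 0.4
```

A model that exactly matches the human distribution drives the discriminator to chance level (0.5), which is why this error rate captures both quality and diversity at once.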

What makes a good conversation? How controllable attributes affect human judgments. [PDF]
In this work, we examine two controllable neural text generation methods, conditional training and weighted decoding, in order to control four important attributes for chit-chat dialogue: repetition, specificity, response-relatedness and question-asking.

Pun Generation with Surprise. [PDF]
In this paper, we propose an unsupervised approach to pun generation based on lots of raw (unhumorous) text and a surprisal principle.

Latent Code and Text-based Generative Adversarial Networks for Soft-text Generation. [PDF]
In this work, we introduce a novel text-based approach called Soft-GAN to effectively exploit GAN setup for text generation.

Neural Text Generation from Rich Semantic Representations. [PDF]
We propose neural models to generate high-quality text from structured representations based on Minimal Recursion Semantics (MRS).

Step-by-Step: Separating Planning from Realization in Neural Data-to-Text Generation. [PDF]
For training a plan-to-text generator, we present a method for matching reference texts to their corresponding text plans.

Evaluating Rewards for Question Generation Models. [PDF]
We therefore optimise directly for various objectives beyond simply replicating the ground truth questions, including a novel approach using an adversarial discriminator that seeks to generate questions that are indistinguishable from real examples.

Text Generation from Knowledge Graphs with Graph Transformers. [PDF]
In this work, we address the problem of generating coherent multi-sentence texts from the output of an information extraction system, and in particular a knowledge graph.

Text Generation with Exemplar-based Adaptive Decoding. [PDF]
We propose a novel conditioned text generation model.

Towards Content Transfer through Grounded Text Generation. [PDF]
This paper introduces the notion of Content Transfer for long-form text generation, where the task is to generate a next sentence in a document that both fits its context and is grounded in a content-rich external textual source such as a news story. As another contribution of this paper, we release a benchmark dataset of 640k Wikipedia referenced sentences paired with the source articles to encourage exploration of this new task.

Semantically-Aligned Equation Generation for Solving and Reasoning Math Word Problems. [PDF]
Motivated by the intuition about how humans generate the equations given the problem texts, this paper presents a neural approach to automatically solve math word problems by operating symbols according to their semantic meanings in texts.

An Integrated Approach for Keyphrase Generation via Exploring the Power of Retrieval and Extraction. [PDF]
In this paper, we present a novel integrated approach for keyphrase generation (KG).

Accelerated Reinforcement Learning for Sentence Generation by Vocabulary Prediction. [PDF]
To improve the efficiency of reinforcement learning, we present a novel approach for reducing the action space based on dynamic vocabulary prediction.
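Reducing the action space as above amounts to sampling only from a small candidate vocabulary instead of the full softmax. A sketch of the masking step (toy logits; predicting the candidate set itself is the paper's contribution and is stubbed out here with a fixed set):

```python
import math

def masked_softmax(logits, allowed):
    """Softmax restricted to a predicted candidate vocabulary: tokens outside
    the allowed set receive zero probability and are never sampled."""
    masked = {t: v for t, v in logits.items() if t in allowed}
    z = sum(math.exp(v) for v in masked.values())
    return {t: math.exp(v) / z for t, v in masked.items()}

logits = {"the": 2.0, "cat": 1.0, "sat": 0.5, "zyzzyva": 3.0}
allowed = {"the", "cat", "sat"}  # stand-in for the dynamically predicted vocabulary
probs = masked_softmax(logits, allowed)

print("zyzzyva" in probs)                      # False: excluded actions get no mass
print(abs(sum(probs.values()) - 1.0) < 1e-9)   # True: still a valid distribution
```

Shrinking the per-step action set from the full vocabulary to a few candidates is what makes the reinforcement learning updates cheaper.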

Corpora Generation for Grammatical Error Correction. [PDF]
We describe two approaches for generating large parallel datasets for GEC using publicly available Wikipedia data.

Structural Neural Encoders for AMR-to-text Generation. [PDF]
We investigate the extent to which reentrancies (nodes with multiple parents) have an impact on AMR-to-text generation by comparing graph encoders to tree encoders, where reentrancies are not preserved.
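The reentrancies mentioned above are simply graph nodes with more than one incoming edge, which a tree encoder cannot represent without duplicating the node. A minimal check over an edge list (toy AMR-like graph with hypothetical labels):

```python
from collections import Counter

def reentrant_nodes(edges):
    """Return nodes with in-degree > 1, i.e. shared nodes a tree must duplicate."""
    indeg = Counter(child for _, child in edges)
    return {node for node, d in indeg.items() if d > 1}

# "The boy wants to go": in AMR the node for "boy" is an argument of both
# "want" and "go", so it has two parents -> a reentrancy.
edges = [("want", "boy"), ("want", "go"), ("go", "boy")]
print(reentrant_nodes(edges))  # {'boy'}
```

A graph encoder keeps the single shared "boy" node; a tree encoder sees two independent copies, which is exactly the difference the paper measures.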

Affect-Driven Dialog Generation. [PDF]
In this paper, we present an affect-driven dialog system, which generates emotional responses in a controlled manner using a continuous representation of emotions.

Pre-trained language model representations for language generation. [PDF]
In this paper, we examine different strategies to integrate pre-trained representations into sequence to sequence models and apply it to neural machine translation and abstractive summarization.

Pragmatically Informative Text Generation. [PDF]
We consider two pragmatic modeling methods for text generation: one where pragmatics is imposed by information preservation, and another where pragmatics is imposed by explicit modeling of distractors.

Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation. [PDF]
In this paper, we propose to use the Wasserstein autoencoder (WAE) for probabilistic sentence generation, where the encoder could be either stochastic or deterministic.

NAACL 2018

Long Papers

Discourse-Aware Neural Rewards for Coherent Text Generation. [PDF]
In this paper, we investigate the use of discourse-aware rewards with reinforcement learning to guide a model to generate long, coherent text.

Neural Text Generation in Stories Using Entity Representations as Context. [PDF]
We introduce an approach to neural text generation that explicitly represents entities mentioned in the text.

A Deep Ensemble Model with Slot Alignment for Sequence-to-Sequence Natural Language Generation. [PDF]
We describe an ensemble neural language generator, and present several novel methods for data representation and augmentation that yield improved results in our model.

Natural Answer Generation with Heterogeneous Memory. [PDF]
In this work, we propose a novel attention mechanism to encourage the decoder to actively interact with the memory by taking its heterogeneity into account.

Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation. [PDF]

Zero-Shot Question Generation from Knowledge Graphs for Unseen Predicates and Entity Types. [PDF]
We present a neural model for question generation from knowledge graphs triples in a “Zero-shot” setup, that is generating questions for predicate, subject types or object types that were not seen at training time.

What’s This Movie About? A Joint Neural Network Architecture for Movie Content Analysis. [PDF]
We present a novel end-to-end model for overview generation, consisting of a multi-label encoder for identifying screenplay attributes, and an LSTM decoder to generate natural language sentences conditioned on the identified attributes. We create a dataset that consists of movie scripts, attribute-value pairs for the movies’ aspects, as well as overviews, which we extract from an online database.

Interpretable Charge Predictions for Criminal Cases: Learning to Generate Court Views from Fact Descriptions. [PDF]
In this paper, we propose to study the problem of court view generation from the fact description in a criminal case.

Adversarial Example Generation with Syntactically Controlled Paraphrase Networks. [PDF]
We propose syntactically controlled paraphrase networks (SCPNs) and use them to generate adversarial examples.

Dialog Generation Using Multi-Turn Reasoning Neural Networks. [PDF]
In this paper, we propose a generalizable dialog generation approach that adapts multi-turn reasoning, one recent advancement in the field of document comprehension, to generate responses (“answers”) by taking current conversation session context as a “document” and current query as a “question”.

Short Papers

Automatic Dialogue Generation with Expressed Emotions. [PDF]
In this research, we address the problem of forcing the dialogue generation to express emotion.

Guiding Generation for Abstractive Text Summarization Based on Key Information Guide Network. [PDF]
We propose a guiding generation model that combines the extractive method and the abstractive method.

Natural Language Generation by Hierarchical Decoding with Linguistic Patterns. [PDF]
This paper introduces a hierarchical decoding NLG model based on linguistic patterns in different levels, and shows that the proposed method outperforms the traditional one with a smaller model size.

RankME: Reliable Human Ratings for Natural Language Generation. [PDF]
We present a novel rank-based magnitude estimation method (RankME), which combines the use of continuous scales and relative assessments.

Identifying the Most Dominant Event in a News Article by Mining Event Coreference Relations. [PDF]
Identifying the most dominant and central event of a document, which governs and connects other foreground and background events in the document, is useful for many applications, such as text summarization, storyline generation and text segmentation.

Leveraging Context Information for Natural Question Generation. [PDF]
We propose a model that matches the answer with the passage before generating the question.

TypeSQL: Knowledge-Based Type-Aware Neural Text-to-SQL Generation. [PDF]
In this paper, we present a novel approach TypeSQL which formats the problem as a slot filling task in a more reasonable way.

Learning to Generate Wikipedia Summaries for Underserved Languages from Wikidata. [PDF]
In this work, we investigate the generation of open domain Wikipedia summaries in underserved languages using structured data from Wikidata.

Natural Language to Structured Query Generation via Meta-Learning. [PDF]
In this work, we explore a different learning protocol that treats each example as a unique pseudo-task, by reducing the original learning problem to a few-shot meta-learning scenario with the help of a domain-dependent relevance function.
