Class conditioned text generation with style attention mechanism for embracing diversity

Naae Kwon, Yuenkyung Yoo, Byunghan Lee

Research output: Contribution to journal › Article › peer-review


Abstract

In the field of artificial intelligence and natural language processing (NLP), natural language generation (NLG) has advanced significantly. Its primary aim is to automatically generate text that resembles human language. Traditional text generation has mainly focused on binary style transfer, limiting its scope to simple transformations between positive and negative tones or between modern and ancient styles. Accommodating the style diversity of real-world scenarios, however, is considerably more complex and in greater demand. Existing methods usually fail to capture the richness of diverse styles, which hinders their utility in practical applications. To address these limitations, we propose a multi-class conditioned text generation model. We overcome previous constraints by using a transformer-based decoder equipped with adversarial networks and a style-attention mechanism to model the various styles in multi-class text. Our experimental results show that the proposed model outperforms the alternatives on multi-class text generation tasks in terms of diversity while preserving fluency. We expect that our study will help researchers not only train their models but also build simulated multi-class text datasets for further research.
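The abstract mentions a transformer-based decoder conditioned on a style class and fused with a style-attention mechanism. The paper's actual architecture is not reproduced on this page, so the sketch below is only an illustration of what such conditioning could look like: every name (StyleAttention, ConditionedDecoder), the dimensions, and the residual fusion choice are assumptions, not the authors' implementation, and the adversarial component is omitted.

```python
# Hypothetical sketch of class-conditioned decoding with a style-attention
# layer. All module names and hyperparameters are illustrative assumptions;
# this is not the architecture from the paper.
import torch
import torch.nn as nn


class StyleAttention(nn.Module):
    """Attend over a learned bank of style embeddings, one per class."""

    def __init__(self, d_model: int, num_styles: int):
        super().__init__()
        self.style_bank = nn.Parameter(torch.randn(num_styles, d_model))
        self.query = nn.Linear(d_model, d_model)
        self.scale = d_model ** 0.5

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, d_model)
        q = self.query(hidden)                                # (B, T, D)
        scores = (q @ self.style_bank.t()) / self.scale       # (B, T, S)
        weights = scores.softmax(dim=-1)                      # attention over styles
        style_ctx = weights @ self.style_bank                 # (B, T, D)
        return hidden + style_ctx                             # residual fusion


class ConditionedDecoder(nn.Module):
    """Causal transformer whose token inputs carry an added class embedding."""

    def __init__(self, vocab_size: int, d_model: int, num_styles: int):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.cls_emb = nn.Embedding(num_styles, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)
        self.style_attn = StyleAttention(d_model, num_styles)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens: torch.Tensor, style_id: torch.Tensor):
        x = self.tok_emb(tokens)                              # (B, T, D)
        x = x + self.cls_emb(style_id).unsqueeze(1)           # class conditioning
        t = x.size(1)
        causal = torch.triu(torch.ones(t, t), diagonal=1).bool()
        h = self.backbone(x, mask=causal)                     # masked self-attention
        h = self.style_attn(h)                                # style-attention fusion
        return self.lm_head(h)                                # next-token logits


model = ConditionedDecoder(vocab_size=1000, d_model=64, num_styles=5)
tokens = torch.randint(0, 1000, (2, 8))
logits = model(tokens, style_id=torch.tensor([0, 3]))
print(logits.shape)  # torch.Size([2, 8, 1000])
```

In this toy setup, switching style_id changes both the additive class embedding and how strongly each hidden state attends to each entry of the style bank; a multi-class model along these lines would typically also add an adversarial style classifier, as the abstract indicates, to push the generator toward the target class.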

Original language: English
Article number: 111893
Journal: Applied Soft Computing
Volume: 163
DOIs
State: Published - Sep 2024

Keywords

  • Multi-class
  • Natural language generation
  • Non-parallel
  • Style attention
  • Text style

