Paper Title
Syntax-driven Iterative Expansion Language Models for Controllable Text Generation
Paper Authors
Paper Abstract
The dominant language modeling paradigm handles text as a sequence of discrete tokens. While that approach can capture the latent structure of the text, it is inherently constrained to sequential dynamics for text generation. We propose a new paradigm for introducing a syntactic inductive bias into neural text generation, where the dependency parse tree is used to drive the Transformer model to generate sentences iteratively. Our experiments show that this paradigm is effective at text generation, with quality between that of LSTMs and Transformers and comparable diversity, while requiring fewer than half their decoding steps; its generation process allows direct control over the syntactic structure of the generated text, enabling the induction of stylistic variations.
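To make the step-count claim concrete, here is a minimal, hypothetical sketch (not the paper's implementation) of the level-order idea behind iterative expansion: if tokens are emitted one dependency-tree level per decoding step, the number of steps equals the tree depth rather than the sentence length. The `expansion_steps` helper and the example tree are illustrative assumptions.

```python
# Hypothetical sketch: group tokens of a dependency tree by depth,
# so each group could be emitted in one parallel decoding step.
from collections import defaultdict

def expansion_steps(heads):
    """heads[i] is the head index of token i (-1 for the root).
    Returns tokens grouped by tree depth; each group corresponds
    to one decoding step under level-order expansion."""
    children = defaultdict(list)
    root = None
    for tok, head in enumerate(heads):
        if head == -1:
            root = tok
        else:
            children[head].append(tok)
    steps, frontier = [], [root]
    while frontier:
        steps.append(frontier)
        frontier = [c for tok in frontier for c in children[tok]]
    return steps

# Toy parse of "the cat sat on the mat":
# the->cat, cat->sat, sat->ROOT, on->sat, the->mat, mat->on
heads = [1, 2, -1, 2, 5, 3]
steps = expansion_steps(heads)
print(len(steps))  # 4 decoding steps (tree depth)
print(len(heads))  # 6 steps for purely sequential decoding
```

On this toy tree, level-order expansion needs 4 steps where left-to-right decoding needs 6, illustrating why step counts scale with tree depth rather than sentence length.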