Journal of Physics: Conference Series

Recurrent Convolution Attention Model (RCAM) for Text Generation based on Title

Abstract

Natural Language Generation (NLG) is one of the most important parts of Natural Language Processing (NLP). Recently, deep learning methods have substantially improved automatic text generation, yet defects remain: the quality of the generated text is often unsatisfactory, and the text may relate only loosely to its title. This paper applies a recurrent convolution attention model with LSTM (Long Short-Term Memory) cells to generate text from a given title. The results show that the model generates sentences consistent with the title and produces more fluent text. Moreover, it takes less time to train than SeqGAN (Sequence Generative Adversarial Networks), and it outperforms other attention mechanisms combined with LSTM models. It is therefore of significance for NLP research.
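
The abstract does not give the model's internals, so the following is a minimal, hypothetical PyTorch sketch of one plausible reading of the architecture it names: a 1-D convolution over the title embeddings supplies features, and an LSTM decoder attends over them at every generation step. Every layer size, name, and wiring choice below is an illustrative assumption, not the authors' implementation.

```python
# Hypothetical sketch only: the paper's exact RCAM wiring is not given in the
# abstract. This illustrates the named ingredients (convolution over the title,
# an LSTM decoder, attention over the convolved title features).
import torch
import torch.nn as nn
import torch.nn.functional as F

class RCAMSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256,
                 n_filters=256, kernel=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # "Convolution": 1-D convolution over the title embeddings.
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel, padding=kernel // 2)
        # "Recurrent": LSTM decoder fed the previous token plus a context vector.
        self.lstm = nn.LSTM(emb_dim + n_filters, hid_dim, batch_first=True)
        # "Attention": additive scoring of each title position against the
        # decoder state.
        self.attn = nn.Linear(hid_dim + n_filters, 1)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, title_ids, prev_ids, state=None):
        # title_ids: (B, T_title) title tokens; prev_ids: (B, T_dec) tokens so far.
        title_feats = self.conv(
            self.embed(title_ids).transpose(1, 2)).transpose(1, 2)
        dec_emb = self.embed(prev_ids)
        logits, h = [], state
        for t in range(dec_emb.size(1)):
            # Query with the last decoder hidden state (zeros at the first step).
            query = h[0][-1] if h is not None else dec_emb.new_zeros(
                dec_emb.size(0), self.out.in_features)
            scores = self.attn(torch.cat(
                [query.unsqueeze(1).expand(-1, title_feats.size(1), -1),
                 title_feats], dim=-1)).squeeze(-1)
            ctx = (F.softmax(scores, dim=-1).unsqueeze(-1) * title_feats).sum(1)
            step_in = torch.cat([dec_emb[:, t], ctx], dim=-1).unsqueeze(1)
            out, h = self.lstm(step_in, h)
            logits.append(self.out(out.squeeze(1)))
        return torch.stack(logits, dim=1), h

# Usage with teacher forcing: feed the gold text shifted right and train with
# cross-entropy. This ordinary maximum-likelihood loop is one reason such a
# model can train faster than SeqGAN's adversarial setup.
model = RCAMSketch(vocab_size=10000)
title = torch.randint(0, 10000, (4, 8))    # 4 titles, 8 tokens each
prev = torch.randint(0, 10000, (4, 20))    # shifted gold text
step_logits, _ = model(title, prev)        # (4, 20, 10000)
```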
