Natural Language Generation (NLG) is one of the most important tasks in Natural Language Processing (NLP). Recently, deep learning methods have substantially improved automatic text generation. However, generated text still suffers from defects such as unsatisfactory quality and weak relevance to the given title. This paper applies a recurrent convolutional attention model with LSTM (Long Short-Term Memory) cells to generate text conditioned on a given title. The results show that the model generates sentences consistent with the title and expresses the text more fluently. Moreover, it requires less training time than SeqGAN (Sequence Generative Adversarial Networks), and it outperforms other attention mechanisms combined with LSTM models. These properties make it a meaningful contribution to NLP research.
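The abstract does not give the model's equations, but the core idea of attending to a title while generating can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration, not the authors' implementation: it shows standard dot-product attention in NumPy, where decoder (LSTM) hidden states are scored against a title embedding and combined into a context vector. All names (`title_attention`, `hidden_states`, `title_vec`) are hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def title_attention(hidden_states, title_vec):
    """Dot-product attention of LSTM states against a title embedding.

    hidden_states: (T, d) array of decoder hidden states
    title_vec:     (d,)   embedding of the given title
    Returns attention weights (T,) and a context vector (d,).
    """
    scores = hidden_states @ title_vec    # relevance of each state to the title
    weights = softmax(scores)             # normalize scores to sum to 1
    context = weights @ hidden_states     # weighted sum of hidden states
    return weights, context

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))   # 5 time steps, hidden size 8
t = rng.standard_normal(8)        # title embedding
w, c = title_attention(H, t)
print(round(float(w.sum()), 6))   # attention weights sum to 1
```

In a full model, the context vector `c` would be concatenated with the decoder state at each step so that every generated word is conditioned on the title.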