Academic Journal of Computing & Information Science, 2019, 2(2); doi: 10.25236/AJCIS.010036.

Research on Intelligent Writing Poetry Model Based on Neural Network


Hanyu Liu

Corresponding Author:
Hanyu Liu

Software College, Jiangxi Normal University, Nanchang 330022, China


Poetry is a special and important part of humanity's cultural heritage. The arrival of the big data era and the development of deep learning have driven innovation in algorithm design across artificial intelligence, and poetry generation has advanced greatly as a result; to a certain extent, generated poems may even be difficult for ordinary readers to distinguish from human work. However, current poetry generation technology still produces poems devoid of thought, feeling, and human spirituality. Work on machine poetry began in the 1970s. Traditional methods of poetry generation rely heavily on domain expertise: experts must design a large number of hand-crafted rules to constrain the rhythm and quality of the generated poems. With the development of deep learning, the study of poetry generation has entered a new stage, and many deep-learning-based methods have been proposed, such as RNN language models, encoder-decoder frameworks, and GAN-based generation. This paper aims to bring machine-generated poetry closer to human writing. Building on the iterative polishing mechanism proposed by Rui Yan et al., it adds a more complete poetry evaluation system and proposes a deep-learning-based intelligent poetry-writing model that can automatically revise its drafts multiple times. Trials and manual evaluation show that the method is effective to a certain extent.
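The iterative polishing idea described above can be sketched as a generate-score-revise loop: a first draft is produced, an evaluation system scores it, and the draft is revised as long as the score improves. The sketch below is illustrative only; `generate_draft`, `revise`, and `score` are hypothetical stand-ins for the neural generator and the learned evaluation system used in the paper, with a toy score that rewards lines of equal length as a crude proxy for metrical regularity.

```python
def generate_draft(theme):
    """Hypothetical first-pass generator: returns a crude, uneven draft."""
    return [(theme + " " + "word " * (i + 1)).strip() for i in range(4)]

def score(poem):
    """Hypothetical evaluator: toy proxy for metrical regularity,
    rewarding poems whose lines have equal length (higher is better)."""
    lengths = [len(line) for line in poem]
    return -(max(lengths) - min(lengths))

def revise(poem):
    """Hypothetical reviser: pads each line toward the longest line."""
    target = max(len(line) for line in poem)
    return [line.ljust(target, ".") for line in poem]

def polish(theme, max_rounds=5):
    """Iterative polishing loop: keep the revision only if it scores better,
    and stop once revising no longer improves the score."""
    poem = generate_draft(theme)
    best = score(poem)
    for _ in range(max_rounds):
        candidate = revise(poem)
        s = score(candidate)
        if s <= best:  # revision no longer helps
            break
        poem, best = candidate, s
    return poem, best

poem, final_score = polish("moon")
```

In the actual model, the reviser would be a neural network conditioned on both the theme and the previous draft, and the evaluation system would combine multiple learned quality criteria rather than a single hand-written rule.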


Keywords: deep learning, poetry generation, NLP

Cite This Paper

Hanyu Liu. Research on Intelligent Writing Poetry Model Based on Neural Network. Academic Journal of Computing & Information Science (2019), Vol. 2, Issue 2: 31-38. https://doi.org/10.25236/AJCIS.010036.


[1] T. Mikolov, M. Karafiát, L. Burget, J. Černocký, and S. Khudanpur. Recurrent neural network based language model. In INTERSPEECH, pages 1045–1048, 2010.
[2] X. Zhang and M. Lapata. Chinese poetry generation with recurrent neural networks. In EMNLP, pages 670–680, 2014.
[3] L. Yu, W. Zhang, J. Wang, and Y. Yu. SeqGAN: Sequence generative adversarial nets with policy gradient. arXiv preprint arXiv:1609.05473, 2016.
[4] R. Yan. i, Poet: Automatic poetry composition through recurrent neural networks with iterative polishing schema. In IJCAI, 2016.
[5] T. Mikolov, K. Chen, G. Corrado, and J. Dean. Efficient estimation of word representations in vector space. arXiv preprint arXiv:1301.3781, 2013.
[6] B. Hu, Z. Lu, H. Li, and Q. Chen. Convolutional neural network architectures for matching natural language sentences. In NIPS, pages 2042–2050, 2014.