Generation of Chinese Tang Dynasty Poetry Based on BERT Model

Zekai He, Jieshun You, Shun-Ping Lin, Ling Chen

2022 · DOI: 10.1145/3579895.3579940
International Conference on Network, Communication and Computing · 0 Citations

TLDR

It is suggested that the BERT model can generate higher-quality poetry in more varied forms, which has reference and application value in the field of poetry generation.

Abstract

From prehistoric times to the present, the creation of poetry has long been considered the exclusive domain of humans. With the development of deep learning, many researchers have begun to address the challenge of generating poetry with algorithms. To capture more contextual continuity and semantically related information in Chinese poetry, this paper applies an improved BERT (Bidirectional Encoder Representations from Transformers) model trained on the full Tang dynasty poem dataset. The model is also used at inference time to generate acrostic poetry and sequel poetry. Under the automatic evaluation metric BLEURT, Tang dynasty poems generated by this model outperform those generated by a Long Short-Term Memory (LSTM) model. Poems generated by the model were also well received by Chinese poets. This paper suggests that the BERT model can generate higher-quality poetry in more varied forms, which has reference and application value in the field of poetry generation.
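The core idea described in the abstract, generating a poem character by character with a language-model scorer, can be illustrated with a toy greedy decoder. The paper's actual method is not detailed here, so the sketch below uses a hand-written bigram score table as a loudly hypothetical stand-in for a fine-tuned BERT scorer (a real system would rank candidate characters with the model instead); this keeps the example self-contained and runnable with no downloads.

```python
# Toy illustration of greedy character-by-character poem decoding.
# BIGRAM is a hypothetical stand-in for a trained scorer such as a
# fine-tuned BERT: it maps the previous character to candidate next
# characters with scores. It is NOT the paper's model or data.
BIGRAM = {
    "床": {"前": 0.9, "上": 0.1},
    "前": {"明": 0.8, "月": 0.2},
    "明": {"月": 0.95, "日": 0.05},
    "月": {"光": 0.9, "色": 0.1},
}

def decode_line(prefix: str, length: int) -> str:
    """Greedily extend `prefix` to `length` characters,
    picking the highest-scoring candidate at each step."""
    line = prefix
    while len(line) < length:
        candidates = BIGRAM.get(line[-1], {})
        if not candidates:  # scorer has no continuation
            break
        line += max(candidates, key=candidates.get)
    return line

print(decode_line("床", 5))  # → 床前明月光
```

Swapping the toy table for per-position scores from a masked-language model (and beam search for the greedy loop) turns this skeleton into the kind of decoder the abstract describes.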