Paper
15 August 2023 Chinese text summarization generation based on transformer and temporal convolutional network
Proceedings Volume 12719, Second International Conference on Electronic Information Technology (EIT 2023); 127191P (2023) https://doi.org/10.1117/12.2685723
Event: Second International Conference on Electronic Information Technology (EIT 2023), 2023, Wuhan, China
Abstract
Recurrent neural network models based on the attention mechanism have achieved good results on the text summarization task, but they suffer from limited parallelism and exposure bias. To address these problems, this paper proposes a two-stage Chinese text summarization method based on the Transformer and a temporal convolutional network. In the first stage, a summary generation model that fuses the Transformer with a temporal convolutional network produces multiple candidate summaries via beam search at the decoder. In the second stage, contrastive learning is introduced: the candidate summaries are scored and ranked with a RoBERTa model to select the final summary. Experiments on the Chinese short-text summarization dataset LCSTS, evaluated with ROUGE, verify the effectiveness of the proposed method for Chinese text summarization.
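The second stage described above can be sketched as a generic re-ranking step: score each beam-search candidate against the source document and keep the highest-scoring one. The sketch below is illustrative only; it substitutes a simple token-overlap scorer for the paper's RoBERTa-based contrastive scoring model, whose details are not given in the abstract.

```python
def overlap_score(candidate: str, document: str) -> float:
    """Stand-in scorer: fraction of candidate tokens that appear in the
    source document. The paper instead scores candidates with a RoBERTa
    model trained via contrastive learning."""
    cand_tokens = set(candidate.split())
    doc_tokens = set(document.split())
    return len(cand_tokens & doc_tokens) / max(len(cand_tokens), 1)

def select_summary(candidates: list[str], document: str, scorer=overlap_score) -> str:
    """Stage two: rank the beam-search candidates by the scorer and
    return the best one as the final summary."""
    return max(candidates, key=lambda c: scorer(c, document))
```

In the paper's pipeline, `candidates` would come from beam search at the decoder of the Transformer/TCN generation model, and `scorer` would be the trained RoBERTa ranking model.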
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Wenming Huang, Yaowei Zhou, Yannan Xiao, Yayuan Wen, and Zhenrong Deng "Chinese text summarization generation based on transformer and temporal convolutional network", Proc. SPIE 12719, Second International Conference on Electronic Information Technology (EIT 2023), 127191P (15 August 2023); https://doi.org/10.1117/12.2685723
KEYWORDS
Transformers; Convolution; Data modeling; Education and training; Head; Matrices; Neural networks