Electric Power ›› 2025, Vol. 58 ›› Issue (11): 156-163. DOI: 10.11930/j.issn.1004-9649.202411021

• New-Type Power Grid •

A Power Text Classification Model Based on Dual-layer Attention Mechanism

WU Tongxin1, JI Xin1,2, YANG Chengyue1, CHEN Yiting1, YANG Zhiwei1

  1. State Grid Corporation of China Big Data Center, Beijing 100031, China
    2. Beijing University of Aeronautics and Astronautics, Beijing 100191, China
  • Received: 2024-11-06 Revised: 2025-06-20 Online: 2025-12-01 Published: 2025-11-28
  • Supported by:
    This work is supported by the Science and Technology Project of SGCC (Research on Power Knowledge Extraction Technology Based on Graph Neural Network and Graph Deep Learning, No. 52999021N005).

Abstract:

There is a large amount of Chinese text data in the electric power field. Traditional text mining methods face problems such as difficulty in word segmentation, limited text feature representation, and poor handling of complex relationships in text, which restrict the deep understanding and classification of power information. This paper proposes a power text classification model that combines a text convolutional neural network (TextCNN) with an attention mechanism. A hierarchical optimization design is carried out for the input layer, TextCNN layer, first attention layer, pooling layer, second attention layer, and output layer, and experiments are conducted to validate the model's performance. The results show that on the power text dataset the proposed TextCNN-Attention model achieves a text classification accuracy of 96.8%, a precision of 86.3%, a recall of 90.3%, and an F1 score (a comprehensive evaluation metric) of 88.2%, demonstrating its superior performance in processing power texts. This study provides valuable experience for the application of deep learning to power text classification.
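For readers who want a concrete picture of the six-stage layer ordering described above (input, TextCNN, first attention, pooling, second attention, output), the PyTorch sketch below wires those stages together in that order. It is a minimal illustration only: the embedding size, kernel widths, filter counts, class count, and the additive-attention formulation are assumptions for the sketch, not the settings used in the paper.

```python
# Minimal sketch of a TextCNN with dual-layer attention, following the layer
# ordering stated in the abstract. All hyperparameters and the attention form
# are illustrative assumptions, not the authors' configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNNAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_filters=100,
                 kernel_sizes=(2, 3, 4), num_classes=8):
        super().__init__()
        # Input layer: token embeddings
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # TextCNN layer: parallel 1-D convolutions with several kernel widths
        self.convs = nn.ModuleList([
            nn.Conv1d(embed_dim, num_filters, k, padding=k // 2)
            for k in kernel_sizes
        ])
        # First attention layer: scores each time step of a feature map
        self.attn1 = nn.Linear(num_filters, 1)
        # Second attention layer: scores the pooled conv-branch vectors
        self.attn2 = nn.Linear(num_filters, 1)
        # Output layer: class logits
        self.fc = nn.Linear(num_filters, num_classes)

    def forward(self, x):                        # x: (batch, seq_len) token ids
        e = self.embedding(x).transpose(1, 2)    # (batch, embed_dim, seq_len)
        branch_vecs = []
        for conv in self.convs:
            h = F.relu(conv(e)).transpose(1, 2)      # (batch, L', num_filters)
            # First attention layer: reweight positions in the feature map
            w = torch.softmax(self.attn1(h), dim=1)  # (batch, L', 1)
            h = w * h
            # Pooling layer: max-pool over time
            branch_vecs.append(h.max(dim=1).values)  # (batch, num_filters)
        b = torch.stack(branch_vecs, dim=1)          # (batch, n_branches, num_filters)
        # Second attention layer: combine the branch representations
        w2 = torch.softmax(self.attn2(b), dim=1)     # (batch, n_branches, 1)
        z = (w2 * b).sum(dim=1)                      # (batch, num_filters)
        return self.fc(z)                            # (batch, num_classes) logits
```

A forward pass on dummy input, e.g. `TextCNNAttention(vocab_size=5000)(torch.randint(0, 5000, (4, 50)))`, returns one logit vector per sentence, which can be trained with a standard cross-entropy loss.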

Key words: deep learning, electricity sector, text classification, convolutional neural networks, attention mechanism