Applied Sciences, Vol. 13, No. 14 (2023)

A Multi-Layer Feature Fusion Model Based on Convolution and Attention Mechanisms for Text Classification

Hua Yang    
Shuxiang Zhang    
Hao Shen    
Gexiang Zhang    
Xingquan Deng    
Jianglin Xiong    
Li Feng    
Junxiong Wang    
Haifeng Zhang and Shenyang Sheng    

Abstract

Text classification is one of the fundamental tasks in natural language processing and is widely applied across domains. CNNs effectively exploit local features, while the Attention mechanism excels at capturing content-based global interactions. In this paper, we propose a multi-layer feature fusion text classification model called CAC, based on the combination of CNN and Attention. The model adopts the idea of first extracting local features and then computing global attention, while drawing inspiration from the interaction process between membranes in membrane computing, to improve text classification performance. Specifically, the CAC model uses the local feature extraction capability of CNN to transform the original semantics into a multi-dimensional feature space. Global attention is then computed in each respective feature space to capture global contextual information within the text. Finally, the locally and globally extracted features are fused for classification. Experimental results on several public datasets demonstrate that the CAC model, which combines CNN and Attention, outperforms models that rely solely on the Attention mechanism. The CAC model also achieves significant improvements in accuracy over other CNN-, RNN-, and Attention-based models.
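
To make the described pipeline concrete, the following PyTorch sketch illustrates the general "local CNN features, then per-branch global self-attention, then feature fusion" flow outlined in the abstract. It is a minimal illustration, not the authors' implementation: the layer sizes, kernel widths, pooling, and concatenation-based fusion (CACSketch, conv_channels, kernel_sizes, etc.) are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of a CAC-style model; hyperparameters and the fusion rule
# are assumptions, not taken from the paper.
import torch
import torch.nn as nn

class CACSketch(nn.Module):
    """Local CNN features -> per-branch global self-attention -> feature fusion."""

    def __init__(self, vocab_size, embed_dim=128, conv_channels=128,
                 kernel_sizes=(3, 5), num_heads=4, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One convolutional branch per kernel size: each maps the embedded text
        # into its own local feature space.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, conv_channels, k, padding=k // 2) for k in kernel_sizes
        )
        # Global self-attention computed inside each local feature space.
        self.attns = nn.ModuleList(
            nn.MultiheadAttention(conv_channels, num_heads, batch_first=True)
            for _ in kernel_sizes
        )
        # Fuse pooled local and global features from every branch, then classify.
        fused_dim = 2 * conv_channels * len(kernel_sizes)
        self.classifier = nn.Linear(fused_dim, num_classes)

    def forward(self, token_ids):                       # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)                   # (batch, seq_len, embed_dim)
        pooled = []
        for conv, attn in zip(self.convs, self.attns):
            local = conv(x.transpose(1, 2)).transpose(1, 2)  # (batch, seq_len, channels)
            local = torch.relu(local)
            global_feat, _ = attn(local, local, local)       # content-based global interactions
            pooled.append(local.mean(dim=1))                 # pooled local features
            pooled.append(global_feat.mean(dim=1))           # pooled global features
        fused = torch.cat(pooled, dim=-1)                    # multi-layer feature fusion
        return self.classifier(fused)

if __name__ == "__main__":
    model = CACSketch(vocab_size=10000)
    logits = model(torch.randint(1, 10000, (8, 50)))    # batch of 8 sequences, length 50
    print(logits.shape)                                  # torch.Size([8, 2])
```

In this sketch, each convolutional branch plays the role of one "feature space" mentioned in the abstract, and mean pooling plus concatenation stand in for whatever fusion scheme the paper actually uses.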