Academic Journal of Computing & Information Science, 2023, 6(13); doi: 10.25236/AJCIS.2023.061314.

BiLSTM Text Classification Incorporating Attentional Mechanisms

Author(s)

Hao Qin

Corresponding Author:
Hao Qin
Affiliation(s)

Tianjin University of Commerce, Tianjin, 300134, China

Abstract

Most current research on text classification performs feature extraction or global information extraction at the surface level, treating all features as equally important, which greatly increases the model's computational cost. The existing bi-directional long short-term memory (BiLSTM) text classification model can learn contextual information from text, but it cannot selectively extract important features or give them special attention. This paper integrates the attention mechanism into the BiLSTM model, enabling the model to extract locally important feature information while learning contextual information. This reduces the number of meaningless text features, speeds up model convergence, and ultimately yields higher classification accuracy.
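The attention pooling described above can be illustrated with a minimal sketch: given the sequence of hidden states produced by a BiLSTM, a learned score vector assigns each time step a weight, and the weighted sum forms a context vector that emphasizes locally important features. This is a generic, pure-Python illustration of additive-style attention; the variable names (`H`, `w`) and the `tanh` scoring form are illustrative assumptions, not the paper's exact formulation.

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention_pool(H, w):
    """H: list of T hidden-state vectors (each of length 2d,
    e.g. concatenated forward/backward BiLSTM outputs);
    w: learned score vector of length 2d.
    Returns (context vector, attention weights)."""
    # score each time step: w . tanh(h_t)
    scores = [sum(w_j * math.tanh(h_j) for h_j, w_j in zip(h, w)) for h in H]
    alpha = softmax(scores)  # weights sum to 1 across time steps
    # context vector: attention-weighted sum of hidden states
    context = [sum(a * h[j] for a, h in zip(alpha, H)) for j in range(len(w))]
    return context, alpha

# toy usage with T=3 steps and 2d=2 hidden units
H = [[0.1, 0.2], [0.3, -0.1], [0.0, 0.5]]
w = [1.0, -1.0]
context, alpha = attention_pool(H, w)
```

The resulting `context` vector would then be fed to a classification layer in place of (or alongside) the final BiLSTM state, so that steps with higher attention weights contribute more to the prediction.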

Keywords

BiLSTM, text classification, attention mechanism

Cite This Paper

Hao Qin. BiLSTM Text Classification Incorporating Attentional Mechanisms. Academic Journal of Computing & Information Science (2023), Vol. 6, Issue 13: 92-98. https://doi.org/10.25236/AJCIS.2023.061314.
