
Author Search Result

[Author] Jie LUO (2 hits)

  • BRsyn-Caps: Chinese Text Classification Using Capsule Network Based on Bert and Dependency Syntax

    Jie LUO, Chengwan HE, Hongwei LUO

     
    PAPER-Natural Language Processing

    Publicized: 2023/11/06
    Vol: E107-D No:2
    Page(s): 212-219

    Text classification is a fundamental task in natural language processing with extensive applications in domains such as spam detection and sentiment analysis. Syntactic information can be exploited effectively to help neural network models understand the semantics of text. Chinese text exhibits a high degree of syntactic complexity, with individual words often carrying multiple parts of speech. In this paper, we propose BRsyn-Caps, a capsule-network-based Chinese text classification model that leverages both BERT and dependency syntax. The proposed approach obtains word representations from the pre-trained BERT model to capture semantic information, extracts contextual information with a Long Short-Term Memory (LSTM) network, encodes the syntactic dependency tree with a graph attention network, and uses a capsule network to integrate these features for classification. We also propose an algorithm for constructing a character-level adjacency matrix from the syntactic dependency tree, which introduces syntactic information into character-level representations. Experiments on five datasets demonstrate that BRsyn-Caps effectively integrates semantic, sequential, and syntactic information, confirming the effectiveness of the proposed method for Chinese text classification.
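
    To make the character-level idea concrete, here is a minimal Python sketch (not taken from the paper) of one plausible way to build a character-level adjacency matrix from word-level dependency edges: characters inside the same word are linked to each other, and characters of two words joined by a dependency edge are linked across words. The function name, the span bookkeeping, and the intra-word/self-loop conventions are illustrative assumptions, not the authors' published algorithm.

        # Hypothetical sketch: map word-level dependency edges onto a
        # character-level adjacency matrix. Characters within the same word
        # are connected to each other, and characters of two words linked by
        # a dependency edge are connected across words. This mirrors the
        # general idea in the abstract, not the paper's exact algorithm.

        def char_level_adjacency(words, dep_edges):
            """words: list of word strings (their concatenation is the sentence).
            dep_edges: list of (head_idx, dependent_idx) pairs over word indices."""
            # Map each word index to the range of character positions it covers.
            spans, start = [], 0
            for w in words:
                spans.append(range(start, start + len(w)))
                start += len(w)

            n = start  # total number of characters
            adj = [[0] * n for _ in range(n)]

            # Self-loops and intra-word connections.
            for span in spans:
                for i in span:
                    for j in span:
                        adj[i][j] = 1

            # Inter-word connections induced by dependency edges (symmetric).
            for head, dep in dep_edges:
                for i in spans[head]:
                    for j in spans[dep]:
                        adj[i][j] = 1
                        adj[j][i] = 1
            return adj


        if __name__ == "__main__":
            # Toy example: word 1 ("喜欢") is the dependency head of words 0 and 2.
            words = ["我们", "喜欢", "猫"]
            edges = [(1, 0), (1, 2)]
            for row in char_level_adjacency(words, edges):
                print(row)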

  • A Matching Pursuit Generalized Approximate Message Passing Algorithm

    Yongjie LUO, Qun WAN, Guan GUI, Fumiyuki ADACHI

     
    LETTER-Numerical Analysis and Optimization

    Vol: E98-A No:12
    Page(s): 2723-2727

    This paper proposes a novel matching pursuit generalized approximate message passing (MPGAMP) algorithm, which explores the support of the sparse representation coefficients step by step and, at each step, estimates the mean and variance of the non-zero elements with a generalized-approximate-message-passing-like scheme. In contrast to classic message-passing-based and matching-pursuit-based algorithms, the proposed algorithm requires far less intermediate memory and does not compute any matrix inverse. Numerical experiments show that MPGAMP recovers sparse signals from compressed sensing measurements very well and maintains good performance even for non-zero-mean and strongly correlated projection matrices.
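
    For orientation, the following Python sketch implements plain matching pursuit for sparse recovery, i.e., the greedy step-by-step support exploration that MPGAMP builds on; it omits the GAMP-style mean/variance estimation, and the stopping rule, normalization, and parameter names are assumptions for illustration, not the letter's algorithm.

        # Minimal sketch of plain matching pursuit for compressed sensing
        # recovery. It illustrates greedy, step-by-step support exploration;
        # it is NOT the MPGAMP algorithm (no GAMP-style mean/variance
        # estimation is performed here). No matrix inverse is computed.

        import numpy as np

        def matching_pursuit(A, y, n_iter=50, tol=1e-6):
            """A: (m, n) projection matrix, y: (m,) measurements.
            Returns a sparse coefficient estimate x_hat."""
            m, n = A.shape
            # Column norms, used to compare correlations fairly.
            norms = np.linalg.norm(A, axis=0)
            norms[norms == 0] = 1.0
            x_hat = np.zeros(n)
            residual = y.astype(float).copy()

            for _ in range(n_iter):
                # Pick the column most correlated with the current residual.
                corr = A.T @ residual / norms
                k = int(np.argmax(np.abs(corr)))
                # Coefficient update along that column.
                step = (A[:, k] @ residual) / (norms[k] ** 2)
                x_hat[k] += step
                residual -= step * A[:, k]
                if np.linalg.norm(residual) < tol:
                    break
            return x_hat


        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            n, m, s = 200, 80, 5          # signal length, measurements, sparsity
            A = rng.standard_normal((m, n))
            x = np.zeros(n)
            x[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
            y = A @ x
            x_hat = matching_pursuit(A, y, n_iter=200)
            print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))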