Author Search Results

[Author] Mingming YANG (2 hits)

  • Selective Retransmission Method for HARQ

    Bin SONG  Hao QIN  Mingming YANG  Lifeng GU  

     
    LETTER-Fundamental Theories for Communications

    Vol: E94-B No:3 Page(s): 796-797

    A new selective retransmission method for HARQ (Hybrid Automatic Repeat reQuest) systems is proposed. By transforming the lost-map matrix, the method avoids blindly retransmitting symbols, and simulation results show that it effectively reduces the number of retransmissions.
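
    The abstract only sketches the mechanism, but the core idea can be illustrated with a minimal Python sketch, assuming the receiver feeds back a binary lost-map matrix in which 1 marks a symbol that failed to decode. The function name, the marking scheme, and the frame layout below are hypothetical illustrations; the paper's actual lost-map matrix transformation is not described in the abstract.

        # Minimal sketch of selective HARQ retransmission driven by a lost-map
        # matrix. The marking scheme (1 = symbol lost) is an assumption made
        # for illustration, not the authors' actual transformation.
        import numpy as np

        def select_retransmit_symbols(lost_map: np.ndarray, frame: np.ndarray) -> np.ndarray:
            """Return only the symbols flagged as lost, rather than the whole frame."""
            assert lost_map.shape == frame.shape
            # Blind HARQ would resend all of `frame`; selective retransmission
            # resends just the positions the receiver reported as lost.
            return frame[lost_map.astype(bool)]

        # Usage: a 4x4 symbol frame in which 3 of 16 symbols were lost.
        rng = np.random.default_rng(0)
        frame = rng.integers(0, 16, size=(4, 4))        # e.g. 16-QAM symbol indices
        lost_map = np.zeros((4, 4), dtype=int)
        lost_map[0, 1] = lost_map[2, 3] = lost_map[3, 0] = 1
        print(select_retransmit_symbols(lost_map, frame))  # 3 symbols instead of 16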

  • Neural Machine Translation with Target-Attention Model

    Mingming YANG  Min ZHANG  Kehai CHEN  Rui WANG  Tiejun ZHAO  

     
    PAPER-Natural Language Processing

    Publicized: 2019/11/26 Vol: E103-D No:3 Page(s): 684-694

    The attention mechanism, which selectively focuses on source-side information to learn a context vector for generating target words, has been shown to be an effective method for neural machine translation (NMT). In fact, generating target words depends not only on source-side information but also on target-side information. Although vanilla NMT acquires target-side information implicitly through recurrent neural networks (RNN), RNNs cannot adequately capture the global relationships between target-side words. To solve this problem, this paper proposes a novel target-attention approach that captures this information and thus enhances target word prediction in NMT. Specifically, we propose three variants of the target-attention model to directly obtain the global relationships among target words: 1) a forward target-attention model that uses a target attention mechanism to incorporate previously generated target words into the prediction of the current target word; 2) a reverse target-attention model that adopts a reverse RNN to obtain information about the entire reversed target sequence, which is then combined with the source context to generate the target sequence; and 3) a bidirectional target-attention model that combines the forward and reverse target-attention models, making full use of the target words to further improve NMT performance. Our methods can be integrated into both RNN-based and self-attention-based NMT, helping the model obtain global target-side information and improve translation quality. Experiments on the NIST Chinese-to-English and the WMT English-to-German translation tasks show that the proposed models achieve significant improvements over state-of-the-art baselines.
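
    As a rough illustration of the forward variant described above, the following PyTorch sketch computes an attention-weighted summary of the already-generated target words at the current decoding step. The dot-product scoring, tensor shapes, and function names are assumptions made for illustration, not the authors' model.

        # Hypothetical sketch of forward target-attention: at step t, attend
        # over the hidden states of previously generated target words to form
        # a target-side context vector, to be combined with the source context.
        import torch
        import torch.nn.functional as F

        def forward_target_attention(query: torch.Tensor, past_states: torch.Tensor) -> torch.Tensor:
            """query:       (batch, d)      decoder state at the current step
            past_states: (batch, t-1, d) states of already-generated target words
            returns:     (batch, d)      target-side context vector"""
            # Dot-product attention scores over the target history.
            scores = torch.bmm(past_states, query.unsqueeze(2)).squeeze(2)   # (batch, t-1)
            weights = F.softmax(scores, dim=1)                               # (batch, t-1)
            # Weighted sum of past target states gives the target-side context.
            return torch.bmm(weights.unsqueeze(1), past_states).squeeze(1)   # (batch, d)

        # Usage: batch of 2, model dimension 8, three words already generated.
        q = torch.randn(2, 8)
        history = torch.randn(2, 3, 8)
        target_ctx = forward_target_attention(q, history)
        print(target_ctx.shape)  # torch.Size([2, 8]); fused with source context downstream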