Author Search Result

[Author] Tian WANG (2 hits)

  • Low Power High Performance FinFET Standard Cells Based on Mixed Back Biasing Technology

    Tian WANG, Xiaoxin CUI, Kai LIAO, Nan LIAO, Xiaole CUI, Dunshan YU

     
    PAPER-Electronic Circuits

    Vol: E99-C No:8  Page(s): 974-983

    With the decrease in transistor feature size, power consumption, especially leakage power, has become one of the most important design concerns. Because of their superior electrical properties and design flexibility, fin-type field-effect transistors (FinFETs) are among the most promising options for low-power applications. To support a VLSI digital system design flow based on logic synthesis, this paper proposes a design method for low-power, high-performance standard cells based on IG-mode FinFETs. The method carefully designs and optimizes the stacked structures in each standard cell and applies mixed forward and reverse back-gate biasing in a well-chosen manner. It remains applicable when the supply voltage is reduced to 0.5 V, which further reduces leakage power. Applying this method yields optimized IG-mode FinFET standard cells that together form a low-power, high-performance standard cell library. Simulation results for the library cells indicate that the performance of cells designed with the proposed method is maintained while leakage power is reduced by a factor of up to 58.9. A 16-bit ripple-carry adder implemented with this library achieves up to 17.5% leakage power reduction.
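
    As a rough intuition for the back-biasing idea (a minimal sketch, not the paper's model): reverse back-gate bias raises the device threshold voltage and suppresses subthreshold leakage exponentially, while forward bias lowers it to speed up critical paths at a leakage cost. The coefficient values below (nominal Vth, body-effect factor, subthreshold swing) are illustrative assumptions, not parameters from the paper.

```python
# Illustrative first-order subthreshold model (assumed, not from the paper):
# mixed biasing reverse-biases off-critical stacked devices and
# forward-biases critical-path devices.

VTH0 = 0.25   # nominal threshold voltage (V) -- assumed value
GAMMA = 0.15  # back-gate body-effect coefficient (V/V) -- assumed value
S = 0.070     # subthreshold swing (V/decade) -- assumed value

def vth(v_back):
    """Reverse bias (v_back < 0) raises Vth; forward bias lowers it."""
    return VTH0 - GAMMA * v_back

def leakage_ratio(v_back):
    """Subthreshold leakage relative to zero back bias: I ~ 10^(-Vth/S)."""
    return 10.0 ** (-(vth(v_back) - VTH0) / S)

# Compare reverse, zero, and forward back-gate bias:
for vb in (-0.5, 0.0, +0.3):
    print(f"Vback={vb:+.1f} V  Vth={vth(vb):.3f} V  "
          f"Ileak/Ileak0={leakage_ratio(vb):.3g}")
```

    With these assumed coefficients, a 0.5 V reverse bias cuts leakage by roughly an order of magnitude, which is the qualitative trade-off the mixed-biasing technique exploits per cell.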

  • Multi-Task Learning of Japanese How-to Tip Machine Reading Comprehension by a Generative Model

    Xiaotian WANG, Tingxuan LI, Takuya TAMURA, Shunsuke NISHIDA, Takehito UTSURO

     
    PAPER-Natural Language Processing

    Publicized: 2023/10/23  Vol: E107-D No:1  Page(s): 125-134

    In research on machine reading comprehension for Japanese how-to tip QA tasks, conventional extractive methods have difficulty with cases in which the answer string spans multiple locations in the context; fine-tuning a BERT model for extractive machine reading comprehension is not suitable for such cases. In this paper, we trained a generative machine reading comprehension model for Japanese how-to tips on a generative dataset constructed from the website “wikihow” as a source of information. We then proposed two multi-task learning methods for fine-tuning the generative model. The first trains a single model simultaneously on a hybrid of generative and extractive datasets. The second combines answer generation with inter-sentence semantic similarity: alongside the answer generation task, the model additionally learns the distance between the question/context sentences and the answer in the training examples. The evaluation results showed that both multi-task learning methods significantly outperformed single-task learning on generative question-and-answer examples. Of the two, the method combining inter-sentence semantic similarity and answer generation performed best in the manual evaluation. The data and the code are available at https://github.com/EternalEdenn/multitask_ext-gen_sts-gen.
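
    As a rough illustration of the second method's idea (a hedged sketch, not the authors' released code; see their GitHub repository for that), the snippet below combines a standard sequence-to-sequence answer-generation loss with a cosine-distance term between mean-pooled encoder embeddings of the question/context and the gold answer. The checkpoint name "t5-small" and the weighting factor alpha are placeholder assumptions; the actual Japanese model and loss formulation are those in the paper.

```python
# Sketch: multi-task loss = answer generation + inter-sentence similarity.
# "t5-small" and alpha are placeholder assumptions for illustration only.
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tok = AutoTokenizer.from_pretrained("t5-small")            # placeholder checkpoint
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")  # placeholder checkpoint

def multitask_loss(question_context: str, gold_answer: str, alpha: float = 0.5):
    enc = tok(question_context, return_tensors="pt", truncation=True)
    labels = tok(gold_answer, return_tensors="pt", truncation=True).input_ids
    out = model(**enc, labels=labels)
    gen_loss = out.loss  # answer-generation task (cross-entropy)

    # Similarity task: pull the mean-pooled encoder embedding of the
    # question/context toward that of the gold answer (cosine distance).
    ctx_emb = out.encoder_last_hidden_state.mean(dim=1)
    ans_emb = model.get_encoder()(
        **tok(gold_answer, return_tensors="pt", truncation=True)
    ).last_hidden_state.mean(dim=1)
    sim_loss = 1.0 - F.cosine_similarity(ctx_emb, ans_emb).mean()

    return gen_loss + alpha * sim_loss  # joint multi-task objective

# One training step on a toy example:
loss = multitask_loss(
    "question: How do I fold a shirt? context: Lay the shirt flat first.",
    "Lay the shirt flat, then fold the sleeves toward the center.",
)
loss.backward()
```

    The design point this sketch captures is that the similarity term acts as an auxiliary signal during fine-tuning rather than a separate model, which is what lets a single generative model benefit from both tasks.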