
Local-to-Global Structure-Aware Transformer for Question Answering over Structured Knowledge

Yingyao WANG, Han WANG, Chaoqun DUAN, Tiejun ZHAO

Summary

Question-answering tasks over structured knowledge (i.e., tables and graphs) require the ability to encode structural information. Traditional pre-trained language models, trained on linear-chain natural language, cannot be directly applied to encode tables and graphs. Existing methods adapt pre-trained models to such tasks by flattening structured knowledge into sequences. However, this serialization leads to the loss of the structural information of the knowledge. To better employ pre-trained transformers for structured knowledge representation, we propose a novel structure-aware transformer (SATrans) that injects local-to-global structural information of the knowledge into the masks of different self-attention layers. Specifically, in the lower self-attention layers, SATrans focuses on the local structural information of each knowledge token to learn a more robust representation of it. In the upper self-attention layers, SATrans further injects the global information of the structured knowledge to integrate information among knowledge tokens. In this way, SATrans can effectively learn the semantic representation and the structural information from the knowledge sequence and the attention mask, respectively. We evaluate SATrans on the table fact verification task and the knowledge base question-answering task. Furthermore, we explore two methods of combining symbolic and linguistic reasoning for these tasks, addressing the lack of symbolic reasoning ability in pre-trained models. The experimental results show that our methods consistently outperform strong baselines on the two benchmarks.
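
To illustrate the local-to-global masking idea described in the summary, the following is a minimal sketch, not the authors' implementation: the structural information is expressed as a 0/1 attention mask, with a local (neighbourhood) mask in the lower layers and a global (fully connected) mask in the upper layers. The toy adjacency, the two-layer split, and names such as local_mask and global_mask are assumptions for illustration only.

```python
# Minimal sketch of masked self-attention with structure-aware masks (assumed, illustrative).
import torch

def masked_self_attention(x, mask):
    """Single-head scaled dot-product self-attention restricted by a 0/1 mask."""
    d = x.size(-1)
    scores = x @ x.transpose(-2, -1) / d ** 0.5            # pairwise similarities (n, n)
    scores = scores.masked_fill(mask == 0, float("-inf"))  # block structurally unrelated pairs
    return torch.softmax(scores, dim=-1) @ x

# Toy structured input: 4 knowledge tokens (e.g., cells of a small table).
n, d = 4, 8
x = torch.randn(n, d)

# Local mask (assumed adjacency): each token attends to itself and its structural neighbours.
local_mask = torch.tensor([
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [0, 1, 1, 1],
])

# Global mask: every knowledge token may attend to every other token.
global_mask = torch.ones(n, n, dtype=torch.long)

# Lower layers use the local mask, upper layers the global one,
# mirroring the local-to-global scheme sketched in the summary.
h = x
for layer in range(4):
    mask = local_mask if layer < 2 else global_mask
    h = masked_self_attention(h, mask)
```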

Publication
IEICE TRANSACTIONS on Information Vol.E106-D No.10 pp.1705-1714
Publication Date
2023/10/01
Publicized
2023/06/27
Online ISSN
1745-1361
DOI
10.1587/transinf.2023EDP7034
Type of Manuscript
PAPER
Category
Artificial Intelligence, Data Mining

Authors

Yingyao WANG
  Harbin Institute of Technology
Han WANG
  Institute of Acoustics, Chinese Academy of Sciences; University of Chinese Academy of Sciences
Chaoqun DUAN
  Harbin Institute of Technology
Tiejun ZHAO
  Harbin Institute of Technology
