Enhancing Transformer-based language models with Commonsense Representations for Knowledge-driven Machine Comprehension
Li, R., Jiang, Z., Wang, L., Lu, X., Zhao, M. and Chen, D. (2021). Enhancing Transformer-based language models with Commonsense Representations for Knowledge-driven Machine Comprehension. Knowledge-Based Systems. 220, p. 106936. https://doi.org/10.1016/j.knosys.2021.106936
Authors: Li, R., Jiang, Z., Wang, L., Lu, X., Zhao, M. and Chen, D.
Unlike traditional machine reading comprehension (MRC), which is limited to the information in a passage, knowledge-driven MRC tasks require models to answer questions using both the text and related commonsense knowledge. Although pre-trained Transformer-based language models (TrLMs) such as BERT and RoBERTa have shown powerful performance in MRC, external knowledge such as unspoken commonsense and world knowledge still cannot be used or explained explicitly. In this work, we present three simple yet effective injection methods, integrated into the structure of TrLMs, to fine-tune downstream knowledge-driven MRC tasks with off-the-shelf commonsense representations. Moreover, we introduce a mask mechanism for token-level multi-hop relationship search to filter external knowledge. Experimental results indicate that the incremental TrLMs significantly outperform the baseline systems by 1%-4.1% on DREAM and CosmosQA, two prevalent knowledge-driven datasets. Further analysis shows the effectiveness of the proposed methods and the robustness of the incremental model when the training set is incomplete.
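The abstract does not spell out the injection architectures, so the PyTorch snippet below is only a non-authoritative sketch of one plausible reading: per-token commonsense vectors (e.g., retrieved knowledge-graph embeddings, which are an assumption here) are gated into a TrLM's hidden states, and a mask filters out tokens for which the multi-hop search found no relevant knowledge. The module name, dimensions, and gating formulation are all illustrative, not the paper's method.

```python
# Illustrative sketch only -- NOT the paper's actual injection method.
# Assumes per-token commonsense vectors have already been retrieved and
# aligned with the input tokens (e.g., projected ConceptNet embeddings).
import torch
import torch.nn as nn

class KnowledgeFusion(nn.Module):
    """Gated residual fusion of TrLM hidden states with knowledge vectors."""

    def __init__(self, hidden_dim: int, know_dim: int):
        super().__init__()
        self.proj = nn.Linear(know_dim, hidden_dim)        # knowledge -> model space
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)  # per-dimension mixing gate

    def forward(self, hidden, knowledge, know_mask):
        # hidden:    (batch, seq, hidden_dim)  encoder output of the TrLM
        # knowledge: (batch, seq, know_dim)    retrieved commonsense vectors
        # know_mask: (batch, seq)              1.0 where knowledge exists, else 0.0
        k = self.proj(knowledge)
        g = torch.sigmoid(self.gate(torch.cat([hidden, k], dim=-1)))
        g = g * know_mask.unsqueeze(-1)   # mask out tokens with no matched knowledge
        return hidden + g * k             # residual injection of gated knowledge

# Toy usage with random tensors; all dimensions are illustrative.
fusion = KnowledgeFusion(hidden_dim=768, know_dim=300)
h = torch.randn(2, 16, 768)
kb = torch.randn(2, 16, 300)
mask = torch.randint(0, 2, (2, 16)).float()
print(fusion(h, kb, mask).shape)  # torch.Size([2, 16, 768])
```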
Keywords: Machine Reading Comprehension; Transformer; Commonsense
Journal: Knowledge-Based Systems, 220, p. 106936
Digital Object Identifier (DOI): https://doi.org/10.1016/j.knosys.2021.106936
Publication process dates: 04 Mar 2021; 06 Mar 2021; 18 Mar 2021
File version: Accepted author manuscript