TY - JOUR
T1 - ALBERT with Knowledge Graph Encoder Utilizing Semantic Similarity for Commonsense Question Answering
AU - Choi, Byeongmin
AU - Lee, Yonghyun
AU - Kyung, Yeunwoong
AU - Kim, Eunchan
N1 - Publisher Copyright:
© 2023, Tech Science Press. All rights reserved.
PY - 2023
Y1 - 2023
N2 - Recently, pre-trained language representation models such as bidirectional encoder representations from transformers (BERT) have been performing well in commonsense question answering (CSQA). However, these models do not directly use explicit information from external knowledge sources. To address this, additional methods such as the knowledge-aware graph network (KagNet) and the multi-hop graph relation network (MHGRN) have been proposed. In this study, we propose using a recent pre-trained language model, a lite bidirectional encoder representations from transformers (ALBERT), with a knowledge graph information extraction technique. We also propose applying a novel method, schema graph expansion, to recent language models. We then analyze the effect of applying knowledge graph-based knowledge extraction techniques to recent pre-trained language models and confirm that schema graph expansion is effective to some extent. Furthermore, we show that our proposed model can achieve better performance than the existing KagNet and MHGRN models on the CommonsenseQA dataset.
AB - Recently, pre-trained language representation models such as bidirectional encoder representations from transformers (BERT) have been performing well in commonsense question answering (CSQA). However, these models do not directly use explicit information from external knowledge sources. To address this, additional methods such as the knowledge-aware graph network (KagNet) and the multi-hop graph relation network (MHGRN) have been proposed. In this study, we propose using a recent pre-trained language model, a lite bidirectional encoder representations from transformers (ALBERT), with a knowledge graph information extraction technique. We also propose applying a novel method, schema graph expansion, to recent language models. We then analyze the effect of applying knowledge graph-based knowledge extraction techniques to recent pre-trained language models and confirm that schema graph expansion is effective to some extent. Furthermore, we show that our proposed model can achieve better performance than the existing KagNet and MHGRN models on the CommonsenseQA dataset.
KW - Commonsense reasoning
KW - knowledge graph
KW - language representation model
KW - question answering
UR - https://www.scopus.com/pages/publications/85139238213
U2 - 10.32604/iasc.2023.032783
DO - 10.32604/iasc.2023.032783
M3 - Article
AN - SCOPUS:85139238213
SN - 1079-8587
VL - 36
SP - 71
EP - 82
JO - Intelligent Automation and Soft Computing
JF - Intelligent Automation and Soft Computing
IS - 1
ER -