Title

Hybrid attention-based transformer block model for distant supervision relation extraction

Authors
Xiao, Yan; Jin, Yaochu (Corresponding Author); Cheng, Ran; Hao, Kuangrong
Publication Date
2022-01-22
DOI
Journal
NEUROCOMPUTING
ISSN
0925-2312
EISSN
1872-8286
Volume
470
Pages
29-39
Abstract
With the explosive growth of digital text information, it is challenging to efficiently obtain specific knowledge from massive amounts of unstructured text. As a basic task in natural language processing (NLP), relation extraction (RE) aims to extract semantic relations between entity pairs in a given text. To avoid manually labeling datasets, distant supervision relation extraction (DSRE) has been widely used; it utilizes a knowledge base to automatically annotate datasets. Unfortunately, this method suffers heavily from wrong labeling due to its underlying strong assumptions. To address this issue, we propose a new framework using a hybrid attention-based Transformer block with multi-instance learning for DSRE. More specifically, the Transformer block is, for the first time, used as a sentence encoder, which mainly utilizes multi-head self-attention to capture syntactic information at the word level. A novel sentence-level attention mechanism is then proposed to compute the bag representation, aiming to exploit all useful information in each sentence. Experimental results on the public New York Times (NYT) dataset demonstrate that the proposed approach outperforms state-of-the-art algorithms, verifying the effectiveness of our model on the DSRE task.
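As a rough illustration of the architecture the abstract describes, below is a minimal, hypothetical PyTorch sketch (not the authors' released code): a Transformer block applies multi-head self-attention over the words of each sentence, and a sentence-level attention then soft-weights every sentence in a bag to form the bag representation used for relation classification. The module names, dimensions, mean-pooling step, and learnable attention query are all assumptions made for illustration.

```python
# Hypothetical sketch of the two attention levels described in the abstract:
# word-level multi-head self-attention (Transformer block) + sentence-level
# attention over a bag of sentences. Not the authors' implementation.
import torch
import torch.nn as nn


class BagLevelDSRE(nn.Module):
    def __init__(self, vocab_size=20000, d_model=128, n_heads=4, n_relations=53):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        # Word-level encoder: one Transformer block with multi-head self-attention.
        self.encoder = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=256, batch_first=True
        )
        # Learnable query for the sentence-level attention (an assumption).
        self.sent_query = nn.Parameter(torch.randn(d_model))
        self.classifier = nn.Linear(d_model, n_relations)

    def forward(self, bag_token_ids):
        # bag_token_ids: (num_sentences, seq_len) token ids for one bag.
        pad_mask = bag_token_ids.eq(0)                       # True at padding
        h = self.encoder(self.embed(bag_token_ids),
                         src_key_padding_mask=pad_mask)      # (S, L, d)
        # Mean-pool non-pad positions to get one vector per sentence.
        keep = (~pad_mask).unsqueeze(-1).float()
        sent_vecs = (h * keep).sum(1) / keep.sum(1).clamp(min=1)  # (S, d)
        # Sentence-level attention: every sentence gets a soft weight, so all
        # sentences contribute to the bag representation instead of only one.
        alpha = torch.softmax(sent_vecs @ self.sent_query, dim=0)  # (S,)
        bag_vec = (alpha.unsqueeze(-1) * sent_vecs).sum(0)         # (d,)
        return self.classifier(bag_vec)                            # relation logits


if __name__ == "__main__":
    model = BagLevelDSRE()
    bag = torch.randint(1, 20000, (3, 40))  # a bag of 3 sentences, 40 tokens each
    print(model(bag).shape)                 # torch.Size([53])
```

Because every sentence receives a soft attention weight rather than being selected or discarded outright, sentences wrongly labeled by the distant-supervision assumption can be down-weighted while the remaining sentences still contribute their useful information, which matches the bag-representation goal stated in the abstract.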
Keywords
Related Links: [Scopus record]
Indexed By
SCI ; EI
Language
English
University Affiliation
Other
Funding Projects
Fundamental Research Funds for the Central Universities [2232021A-10, 2232021D-37] ; National Natural Science Foundation of China [61806051] ; Natural Science Foundation of Shanghai [20ZR1400400, 21ZR1401700] ; Graduate Student Innovation Fund of Donghua University [CUSFDH-D-2021051]
WOS Research Area
Computer Science
WOS Category
Computer Science, Artificial Intelligence
WOS Accession Number
WOS:000722305600003
Publisher
EI Accession Number
20214611156163
EI Keywords
Extraction ; Knowledge based systems ; Semantics
EI Classification Codes
Data Processing and Image Processing:723.2 ; Expert Systems:723.4.1 ; Chemical Operations:802.3
ESI Discipline
COMPUTER SCIENCE
Scopus Record ID
2-s2.0-85118890283
Source Database
Scopus
Citation Statistics
Cited Times [WOS]: 19
Document Type
Journal Article
Identifier
http://sustech.caswiz.com/handle/2SGJ60CL/256297
Collection
College of Engineering_Department of Computer Science and Engineering
Affiliations
1. Engineering Research Center of Digitized Textile & Apparel Technology, Ministry of Education, Donghua University, Shanghai, 201620, China
2. Chair of Nature Inspired Computing and Engineering, Faculty of Technology, Bielefeld University, Bielefeld, D-33615, Germany
3. Department of Computer Science, University of Surrey, Guildford, GU2 7XH, United Kingdom
4. The Shenzhen Key Laboratory of Computational Intelligence, University Key Laboratory of Evolving Intelligent Systems of Guangdong Province, Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen, 518055, China
Recommended Citation
GB/T 7714
Xiao, Yan, Jin, Yaochu, Cheng, Ran, et al. Hybrid attention-based transformer block model for distant supervision relation extraction[J]. NEUROCOMPUTING, 2022, 470: 29-39.
APA
Xiao, Yan, Jin, Yaochu, Cheng, Ran, & Hao, Kuangrong. (2022). Hybrid attention-based transformer block model for distant supervision relation extraction. NEUROCOMPUTING, 470, 29-39.
MLA
Xiao, Yan, et al. "Hybrid attention-based transformer block model for distant supervision relation extraction". NEUROCOMPUTING 470 (2022): 29-39.
Files in This Item
No related files.