Label-Aware Chinese Event Detection with Heterogeneous Graph Attention Network
Published in: Journal of Computer Science and Technology, 2024-02, Vol. 39 (1), pp. 227-242
Main Authors:
Format: Article
Language: English
Subjects:
Summary: Event detection (ED) seeks to recognize event triggers and classify them into the predefined event types. Chinese ED is formulated as a character-level task owing to the uncertain word boundaries. Prior methods try to incorporate word-level information into characters to enhance their semantics. However, they experience two problems. First, they fail to incorporate word-level information into each character the word encompasses, causing the insufficient word-character interaction problem. Second, they struggle to distinguish events of similar types with limited annotated instances, which is called the event confusing problem. This paper proposes a novel model named Label-Aware Heterogeneous Graph Attention Network (L-HGAT) to address these two problems. Specifically, we first build a heterogeneous graph of two node types and three edge types to maximally preserve word-character interactions, and then deploy a heterogeneous graph attention network to enhance the semantic propagation between characters and words. Furthermore, we design a pushing-away game to enlarge the predicting gap between the ground-truth event type and its confusing counterpart for each character. Experimental results show that our L-HGAT model consistently achieves superior performance over prior competitive methods.
ISSN: 1000-9000; 1860-4749
DOI: 10.1007/s11390-023-1541-6
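
The "pushing-away game" described in the summary reads as a margin-style objective that separates the score of the ground-truth event type from its most confusing competitor for each character. The sketch below is a hypothetical PyTorch rendering of that idea; the function name `pushing_away_loss`, the choice of the highest-scoring non-gold type as the confusing counterpart, and the margin value are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of a margin-based "pushing-away" term (assumed form;
# the paper's exact objective may differ).
import torch
import torch.nn.functional as F

def pushing_away_loss(logits, gold, margin=1.0):
    # logits: (num_chars, num_event_types) per-character type scores.
    # gold:   (num_chars,) indices of the ground-truth event types.
    gold_score = logits.gather(1, gold.unsqueeze(1)).squeeze(1)
    # Mask out the gold column, then take each character's strongest rival.
    gold_mask = F.one_hot(gold, num_classes=logits.size(1)).bool()
    rival_score = logits.masked_fill(gold_mask, float("-inf")).max(dim=1).values
    # Hinge term: zero once the gap exceeds the margin, otherwise push apart.
    return F.relu(margin - (gold_score - rival_score)).mean()

# Example use, combined with a standard classification loss (the weighting
# between the two terms is an assumption here):
logits = torch.randn(6, 34)                 # 6 characters, 34 event types
gold = torch.tensor([0, 3, 3, 12, 7, 0])
loss = F.cross_entropy(logits, gold) + pushing_away_loss(logits, gold)
```

Such a term would complement, rather than replace, the classification loss computed over the character representations that the heterogeneous graph attention encoder produces.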