An Attention-Based BiLSTM-CRF Model for Chinese Clinic Named Entity Recognition
Published in: IEEE Access, 2019, Vol. 7, pp. 113942-113949
Format: Article
Language: English
Summary: Clinic Named Entity Recognition (CNER) aims to recognize named entities such as body parts, diseases, and symptoms in Electronic Health Records (EHRs), which can benefit many intelligent biomedical systems. In recent years, increasing attention has been paid to end-to-end CNER with recurrent neural networks (RNNs), especially long short-term memory networks (LSTMs). However, capturing long-range dependencies remains a great challenge for RNNs. Moreover, Chinese presents additional challenges: it uses logograms rather than an alphabet, contains many ambiguous words, and lacks explicit word boundaries. In this work, we present a BiLSTM-CRF model with a self-attention mechanism (Att-BiLSTM-CRF) for the Chinese CNER task, which aims to address these problems. The self-attention mechanism can learn long-range dependencies by establishing a direct connection between every pair of characters. To learn richer semantic information about Chinese characters, we propose a novel fine-grained character-level representation method. We also incorporate part-of-speech (POS) labeling information into our model to capture semantic information in the input sentence. We evaluate performance on the CCKS-2017 Shared Task 2 dataset, and the experimental results indicate that our model outperforms other state-of-the-art methods.
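The summary credits self-attention with linking every pair of characters directly rather than relaying information step by step through a recurrent chain. As an illustrative sketch only, not the authors' implementation and omitting the BiLSTM and CRF layers entirely, a bare scaled dot-product self-attention over toy character embedding vectors could look like:

```python
import math

def self_attention(X):
    # X: list of character embedding vectors (seq_len x d).
    # Every query position scores AGAINST ALL positions, so any two
    # characters are connected in one step, regardless of distance.
    d = len(X[0])
    out = []
    for q in X:
        # scaled dot-product scores between this character and all others
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in X]
        # numerically stable softmax over all positions
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # output = attention-weighted mix of all character vectors
        out.append([sum(w * k[j] for w, k in zip(weights, X))
                    for j in range(d)])
    return out

chars = [[0.1, 0.2], [0.4, -0.3], [-0.2, 0.5]]  # toy 3-character sentence
attended = self_attention(chars)
```

In the paper's setting this would operate on learned character-level representations (here replaced by hard-coded toy vectors), and the attended outputs would feed the downstream tagging layers.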
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2935223