Adaptive Chinese Pinyin IME for Most Similar Representation

Bibliographic Details
Published in: IEEE Access, 2022, Vol. 10, p. 1-1
Main Authors: Jiang, Dongsheng; Cheng, Xinyu; Han, Tianyi
Format: Article
Language:English
Description
Summary: Many neural-network approaches are used for Pinyin-to-character (P2C) conversion in Chinese input method engines (IMEs). However, in previous research, the conversion efficiency of neural-network P2C models depends on the training data, and neural networks cannot maintain high conversion performance across users and domains. In this study, we propose a method for improving conversion efficiency and tracking user behavior, based on dynamic storage and representations that can be updated with historical information from the user's input. Our experimental results show that our technique tracks user behavior and has strong domain adaptability without requiring additional training. On the cross-domain datasets Touchpal, cMedQA1.0, and CAIL2019, compared with the direct use of the neural network, the Top-1 MIU-Acc, CA, and KySS indicators improve by at least 20.0%, 8.1%, and 18.3%, respectively, and the results are close to those obtained with in-domain training of the model. Furthermore, compared with the traditional methods On-OMWA and Google IME, this method improves Top-1 MIU-Acc, CA, and KySS by at least 7.8%, 2.0%, and 11.9% and by 3.2%, 0.7%, and 13.9%, respectively. This demonstrates that the proposed method is superior to existing models in terms of conversion accuracy and generality, and can point to a new path for P2C platforms.
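
As a rough, hypothetical illustration of the general idea described in the summary (not the authors' published algorithm, whose details are in the full text), the Python sketch below blends a pretrained P2C model's candidate scores with selection counts accumulated from the user's own input history, so that candidate ranking adapts to a new user or domain without retraining. The class name AdaptiveP2CReranker, the blending weight alpha, and the scoring scheme are assumptions made for this sketch.

    # Hypothetical sketch: adaptive reranking of P2C candidates using a
    # dynamic store of the user's past selections (illustrative only).
    from collections import defaultdict

    class AdaptiveP2CReranker:
        def __init__(self, alpha=0.3):
            # alpha: assumed weight given to user-history evidence
            self.alpha = alpha
            # dynamic storage: pinyin -> {candidate string -> selection count}
            self.history = defaultdict(lambda: defaultdict(int))

        def rerank(self, pinyin, candidates):
            # candidates: list of (characters, model_score) from a pretrained P2C model
            counts = self.history[pinyin]
            total = sum(counts.values()) or 1
            scored = []
            for chars, model_score in candidates:
                user_score = counts[chars] / total  # empirical user preference
                blended = (1 - self.alpha) * model_score + self.alpha * user_score
                scored.append((chars, blended))
            return sorted(scored, key=lambda item: item[1], reverse=True)

        def record_selection(self, pinyin, chars):
            # update the dynamic store with the string the user actually chose
            self.history[pinyin][chars] += 1

    # usage: rerank model outputs, then log the user's choice so future rankings adapt
    reranker = AdaptiveP2CReranker()
    print(reranker.rerank("ma", [("马", 0.6), ("码", 0.4)]))
    reranker.record_selection("ma", "码")
    print(reranker.rerank("ma", [("马", 0.6), ("码", 0.4)]))
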
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3218337