Article Information
Abstract
English
This study analyzed the factors behind Korean-Chinese translation errors in machine translation services such as Papago and Google Translate through self-attention path visualization. Self-attention is the key mechanism of the Transformer and BERT NLP models and is now widely used in machine translation. The study analyzes the difference in attention paths between a source text ST and a paraphrase ST′ whose meaning is unchanged but whose translation comes out more accurately. By examining these differences in attention-path patterns together with the corresponding translation outputs TT and TT′ (target texts), the study identifies three types of errors: missing-symbol errors, grammar errors, and polysemy (multiple-meaning word) errors. For self-attention visualization, the study used the XLM-RoBERTa multilingual NLP model provided through exBERT, and it offers suggestions for resolving the errors. The study found that the causes of translation errors can be identified effectively with the self-attention visualization method; with such explanations, machine translation developers can improve service quality, and researchers can better understand machine translation algorithms.
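The paper inspects attention paths with exBERT on top of XLM-RoBERTa; underlying any such visualization is the attention-weight matrix produced by scaled dot-product self-attention, softmax(QK^T/√d)V. The sketch below is only an illustrative NumPy toy (the token count, dimensions, and weight matrices are made up, not taken from the paper) showing the token-to-token weight matrix that tools like exBERT render as paths:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention: softmax(QK^T / sqrt(d)) V.
    Returns the output and the attention-weight matrix whose rows
    (one distribution per query token) are what attention-path
    visualizations draw as edges between tokens."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    # row-wise softmax: each query token attends over all key tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
n_tokens, d_model = 4, 8                 # toy "sentence" of 4 tokens
X = rng.normal(size=(n_tokens, d_model))
Wq = rng.normal(size=(d_model, d_model))
Wk = rng.normal(size=(d_model, d_model))
Wv = rng.normal(size=(d_model, d_model))

out, attn = self_attention(X, Wq, Wk, Wv)
print(attn.shape)        # (4, 4): one attention distribution per token
print(attn.sum(axis=-1)) # each row sums to 1
```

Comparing such a matrix for ST against the one for a paraphrase ST′ is, in miniature, the kind of path-pattern comparison the study performs with exBERT.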
Table of Contents
Introduction
exBERT
Methods
Conclusion
Implications
References