
A Study on the Evolution of Neural Machine Translation Technology and Translation Quality

Source Information

Original title (Korean): 신경망기계번역 기술 진화와 번역품질 분석

Jhee, In-young · Kim, Hee-dong

Cited by: 0 (source: Naver Academic)

Abstract (English)

There has been major technical progress in the field of machine translation research: the main approach has switched from statistical machine translation (SMT) to neural machine translation (NMT), leading to dramatic improvements in translation quality. More recently, a further shift has been taking place from recurrent neural network (RNN)-based NMT to transformer-based NMT (T-NMT). As the performance of NMT has improved, many research papers on machine translation have been published in the field of interpretation and translation studies. Their main focus is on whether machine translation can replace human translation and on analyzing the quality of the translation results. In this paper, we briefly review the history of machine translation research and explain the mechanism of NMT. NMT is basically composed of three parts: an encoder, an attention mechanism, and a decoder. We then discuss the newer transformer architecture, which builds on the encoder-decoder model. We also discuss the remaining challenges in NMT and outline research directions and possible solutions. Particular attention is given to mistranslations by NMT, the quality of the translations, and robustness against noise in the training dataset as well as in the test sentences. To evaluate the performance of transformer-based NMT, we used the Google NMT (GNMT) service for four languages: Korean, English, German, and Japanese. We confirmed its robustness against noisy sentences. However, we also found unexpected volatility in the NMT models, where an input sentence that is semantically and syntactically correct can suffer critical degradation in translation quality.
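
For reference, the attention mechanism mentioned above is realized in the transformer as scaled dot-product attention. The following is a minimal NumPy sketch of that standard formulation (Vaswani et al., 2017); it is not code from this paper, and the shapes and variable names are purely illustrative.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Q: (num_queries, d_k), K: (num_keys, d_k), V: (num_keys, d_v)
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                # query-key similarities
        scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True) # softmax over the keys
        return weights @ V                             # weighted sum of values

    # Toy example: 2 target-side queries attending over 3 source-side tokens.
    rng = np.random.default_rng(0)
    Q, K, V = rng.normal(size=(2, 4)), rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
    print(scaled_dot_product_attention(Q, K, V).shape)  # -> (2, 4)

In the transformer this operation is applied in parallel across multiple heads, which is what lets the decoder attend to all encoder positions at once.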

Table of Contents


1. Introduction
2. Understanding Neural Machine Translation Technology
2.1 Training Methods for Neural Machine Translation
2.2 Word Embedding as a Word Representation Method in Natural Language Processing
2.3 The Encoder-Decoder Model of NMT
2.4 Attention Mechanism
2.5 Transformer NMT
2.6 Pre-trained Language Models Based on the Transformer
3. Datasets and Neural Machine Learning
3.1 Training Data Quantity, Quality, and Performance
3.2 Measures for Low-Resource Data
3.3 NMT for Minority Languages through Unsupervised Learning
4. Limitations of Neural Machine Translation
4.1 Sentence-Level Translation in NMT
4.2 Multilingual Support
4.3 Translation Robustness and Countermeasures
4.4 Abnormal Volatility
5. Translation Data Analysis
5.1 Verification of T-NMT Improvements
5.2 Robustness Verification
5.3 Homograph Translation and Abnormal Volatility Verification
6. Conclusion
References

Author Information

  • Jhee, In-young (지인영), Korea National Sport University
  • Kim, Hee-dong (김희동), Hankuk University of Foreign Studies
