Session 1-2. Technology and T&I Education

Research on Evaluation for Interpretation and Translation Education: Focused on Evaluation Platforms for Translation Assignments

Abstract (English)

The purpose of this research is to create an effective evaluation system that can be shown to respond aptly to both online and offline needs, thereby overcoming the limits of the Learning Service Providers (LSP) of higher education institutions and of remote conference apps, such as Zoom, used for online interpretation and translation education. With the foregoing in mind, the study begins by discussing the creation of a translation platform, continues by refining the platform through professor-learner trials that remedy its shortcomings, and finally discusses the possibility of expanding the evaluation platform to cover interpretation education.

The translation platform presents a general-purpose machine translation program integrated into a CAT-tool format, displaying the original text on the left and the translated text on the right, a layout familiar to both professors and learners. Learners receive an original text as a translation assignment. After completing and entering their translations, they press the "SUBMIT" button to hand their assignments over to the professor. The professor then evaluates the submitted translations against the original text. Evaluations appear as blocks on both the original and translated texts once the relevant spans are designated. The platform also displays the professor's evaluations for set categories as a graph that updates in real time as evaluations for the designated category are entered. Evaluation categories can be selected from a drop-down menu or by entering the category name directly. Once the professor presses the "COMPLETE" button, learners can see the overall review of their translations, and the arrow buttons can be used to step through and compare the evaluations of a learner's translation in order.

A total of 14 professors and 8 learners were recruited to test the platform. Learners completed and submitted AB and BA translations in the fields of technology and literature within 60 minutes, which professors then evaluated using the platform. Trial results showed that both professors and learners found the platform objective and easy to use, praising the convenience of having both the "submitting" and "evaluating" of assignments in one platform and the ability to see individual translation tendencies through the visual, graphical display of evaluation results. Professors found it useful for delivering the quantitative feedback that typically has to be explained during class after translation evaluations, adding that it would save considerable time once one gets used to it. Nevertheless, they pointed out that the "scroll" and "correct" functions, flaws inevitably present in this pilot version of the platform, needed improvement, and that it was difficult to select the appropriate evaluation category from the drop-down menu because the presented category names were somewhat confusing. Learners stated that feedback sorted by category was of great help and showed a high level of satisfaction. However, professors were concerned that use of the platform could lead learners into a habit of over-relying on one-to-one translation correspondence.
Future tasks for developing the platform involve reviewing evaluation category selection, allowing free input of category names, improving platform functions, incorporating a "peer review" option, and conducting follow-up trials to enhance user-friendliness. The platform could be further developed into an interpretation platform once certain functions, such as "voice data upload" and "transcription of interpretation performance", become available.
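The abstract does not specify an implementation, but the evaluation workflow it describes (text spans marked as blocks, each tagged with a category, with per-category counts drawn as a real-time graph) can be illustrated with a minimal sketch. The TypeScript below is purely hypothetical: the interfaces, field names, and the categoryCounts helper are assumptions made for illustration, not the authors' design.

    // Illustrative sketch only: the paper does not specify an implementation.
    // All names and fields below are hypothetical assumptions.

    // A span of text marked as a "block" in the source or target pane.
    interface EvaluationBlock {
      sourceSpan: [start: number, end: number] | null; // character offsets in the original text
      targetSpan: [start: number, end: number] | null; // character offsets in the translation
      category: string;  // selected from a drop-down or entered directly
      comment?: string;  // optional note from the professor
    }

    // One learner's submitted assignment and its evaluation state.
    interface Submission {
      learnerId: string;
      sourceText: string;       // the original text given as the assignment
      targetText: string;       // the learner's translation, fixed on "SUBMIT"
      blocks: EvaluationBlock[];
      completed: boolean;       // set when the professor presses "COMPLETE"
    }

    // Per-category counts of the kind that could drive the real-time graph.
    function categoryCounts(submission: Submission): Map<string, number> {
      const counts = new Map<string, number>();
      for (const block of submission.blocks) {
        counts.set(block.category, (counts.get(block.category) ?? 0) + 1);
      }
      return counts;
    }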

Table of Contents

Abstract
Author Introduction

Author Information

  • Juriae Lee, Ewha Graduate School of Interpretation & Translation, Korea
  • Hyunseok Park, Ewha Womans University
  • Uran Oh, Ewha Womans University
  • Haekyung Park, Ewha Graduate School of Interpretation & Translation
  • Jibong Son, Ewha Graduate School of Interpretation & Translation
  • Jindong Kim, Database Center for Life Science (DBCLS)
