Universal Neural Machine Translation for Extremely Low Resource Languages

Prof. Victor Li
October 5, 2022
Research

In this paper, we propose a new universal machine translation approach for languages with very limited parallel data. Our approach uses transfer learning to share lexical and sentence-level representations across multiple source languages translating into one target language. Lexical sharing is achieved through a Universal Lexical Representation that supports multilingual word-level sharing; sentence-level sharing is achieved through a mixture of experts from all source languages, whose source encoders are shared with all other languages. This enables a low-resource language to draw on the lexical and sentence representations of higher-resource languages. Our approach achieves 23 BLEU on Romanian-English WMT2016 using a tiny parallel corpus of 6k sentences, compared to 18 BLEU for a strong baseline system that uses multilingual training and back-translation. Furthermore, we show that the proposed approach can achieve almost 20 BLEU on the same dataset by fine-tuning a pre-trained multilingual system in a zero-shot setting.
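The word-level sharing described above can be pictured as attention over a shared set of universal token embeddings: a source word's monolingual embedding queries the universal keys, and the resulting weights interpolate the shared value embeddings. The following is a minimal sketch of that idea; the function names, the softmax temperature, and the toy dimensions are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def universal_lexical_representation(q, keys, values, tau=0.05):
    """Sketch of a Universal Lexical Representation lookup (illustrative).

    q:      (d,)   monolingual embedding of one source word (the query)
    keys:   (M, d) key embeddings of M universal tokens
    values: (M, k) universal value embeddings shared across languages
    tau:    temperature; smaller -> sharper attention (an assumption here)
    """
    weights = softmax(keys @ q / tau)  # attention over universal tokens
    return weights @ values            # (k,) interpolated shared representation

# Toy usage: a query aligned with universal token 0 should recover values[0].
keys = np.eye(4)
values = np.arange(8.0).reshape(4, 2)
q = np.array([10.0, 0.0, 0.0, 0.0])
ulr = universal_lexical_representation(q, keys, values)
```

Because every language's words are expressed as mixtures over the same universal embeddings, a low-resource language can reuse representations learned from high-resource languages without sharing a vocabulary.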


S.B. (1977), S.M. (1979), E.E. (1980), Sc.D. (1981), EECS, MIT. Chair Professor in Information Engineering, HKU. Fellow of IEEE, HKIE, IAE, and HK Academy of Engineering Sciences. Awardee of the Bronze Bauhinia Star, Government of HKSAR.