Learn to cross-lingual transfer with meta graph learning across heterogeneous languages
2020
The recent emergence of multilingual pretrained language models (mPLMs) has enabled breakthroughs on various downstream cross-lingual transfer (CLT) tasks. However, mPLM-based methods usually suffer from two problems: (1) simple fine-tuning may not adapt general-purpose multilingual representations to be task-aware on low-resource languages; (2) they ignore how cross-lingual adaptation happens for downstream tasks. To address these issues, we propose a meta graph learning (MGL) method. Unlike prior works that transfer from scratch, MGL learns to perform cross-lingual transfer by extracting meta-knowledge from historical CLT experiences (tasks), making the mPLM less sensitive to the scarcity of low-resource languages. In addition, for each CLT task, MGL formulates the transfer process as information propagation over a dynamic graph, whose geometric structure automatically captures intrinsic language relationships and explicitly guides cross-lingual transfer. Empirically, extensive experiments on both public and real-world datasets demonstrate the effectiveness of the MGL method.
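To make the "information propagation over a dynamic graph" idea concrete, the following is a minimal PyTorch sketch, not the authors' released implementation: it assumes per-language node features (e.g., pooled mPLM representations), builds a data-dependent adjacency from their pairwise similarities, and applies one propagation step. All module names, shapes, and the similarity-based adjacency are illustrative assumptions.

```python
# Illustrative sketch of dynamic-graph propagation over language nodes.
# Assumptions (not from the paper): dot-product similarity adjacency,
# a single linear transform, and a residual ReLU update.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicGraphPropagation(nn.Module):
    """Builds an adjacency matrix from node (language) similarities and
    performs one round of graph propagation over it."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.transform = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, node_feats: torch.Tensor) -> torch.Tensor:
        # node_feats: [num_languages, hidden_dim], e.g. mean-pooled mPLM
        # representations of each language's support examples.
        sim = node_feats @ node_feats.t()               # pairwise similarities
        adj = F.softmax(sim, dim=-1)                    # dynamic, data-dependent adjacency
        propagated = adj @ self.transform(node_feats)   # one propagation step
        return F.relu(propagated + node_feats)          # residual update


if __name__ == "__main__":
    # Toy usage: 5 hypothetical language nodes with 768-dim mPLM features.
    feats = torch.randn(5, 768)
    layer = DynamicGraphPropagation(hidden_dim=768)
    out = layer(feats)
    print(out.shape)  # torch.Size([5, 768])
```

In the full method, a module of this kind would sit on top of the mPLM encoder and be meta-trained over many historical CLT tasks, so that the learned propagation transfers to new low-resource target languages.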