CLICKER: Attention-based cross-lingual commonsense knowledge transfer
2023
Recent advances in cross-lingual commonsense reasoning (CSR) have been facilitated by the development of multilingual pre-trained models (mPTMs). While mPTMs show the potential to encode commonsense knowledge for different languages, transferring commonsense knowledge learned from large-scale English corpora to other languages remains challenging. To address this problem, we propose an attention-based Cross-LIngual Commonsense Knowledge transfER (CLICKER) framework for minimizing the performance gap between English and non-English languages on commonsense question-answering tasks. CLICKER effectively improves commonsense reasoning for non-English languages by differentiating language-specific knowledge from commonsense knowledge. Experimental results on public benchmarks demonstrate that CLICKER is effective at decoupling commonsense knowledge from non-commonsense knowledge and achieves remarkable improvements in cross-lingual CSR for languages other than English.
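To make the decoupling idea concrete, the following is a minimal sketch, not the authors' implementation, of how an attention-based module on top of a multilingual encoder might separate a shared commonsense representation from a language-specific one. All module and variable names (CrossLingualDecoupler, commonsense_attn, language_attn) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CrossLingualDecoupler(nn.Module):
    """Hypothetical sketch: pool an mPTM's token states into a shared
    'commonsense' vector and a 'language-specific' vector using two
    independent attention poolers."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # One scoring head per component; names are illustrative only.
        self.commonsense_attn = nn.Linear(hidden_dim, 1)
        self.language_attn = nn.Linear(hidden_dim, 1)

    def forward(self, token_states: torch.Tensor, mask: torch.Tensor):
        # token_states: (batch, seq_len, hidden_dim) from a multilingual encoder
        # mask: (batch, seq_len), 1 for real tokens, 0 for padding
        def pool(scores: torch.Tensor) -> torch.Tensor:
            # Mask out padding, normalize over the sequence, and take a
            # weighted sum of the token states.
            scores = scores.masked_fill(mask.unsqueeze(-1) == 0, float("-inf"))
            weights = torch.softmax(scores, dim=1)       # (batch, seq_len, 1)
            return (weights * token_states).sum(dim=1)   # (batch, hidden_dim)

        commonsense_vec = pool(self.commonsense_attn(token_states))
        language_vec = pool(self.language_attn(token_states))
        return commonsense_vec, language_vec
```

Under this sketch, the commonsense vectors of an English example and its non-English counterpart could be pulled together with a similarity loss so that commonsense knowledge transfers across languages, while the language-specific vectors remain unconstrained.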