FILTER: An Enhanced Fusion Method for Cross-lingual Language Understanding
Abstract
Large-scale cross-lingual language models (LM), such as mBERT, Unicoder and XLM, have achieved great success in cross-lingual representation learning. However, when applied to zero-shot cross-lingual transfer tasks, most existing methods use only single-language input for LM finetuning, without leveraging the intrinsic cross-lingual alignment between different languages that proves essential for multilingual tasks. In this paper, we propose FILTER, an enhanced fusion method that takes cross-lingual data as input for XLM finetuning. Specifically, FILTER first encodes text input in the source language and its translation in the target language independently in the shallow layers, then performs cross-language fusion to extract multilingual knowledge in the intermediate layers, and finally performs further language-specific encoding. During inference, the model makes predictions based on the text input in the target language and its translation in the source language. For simple tasks such as classification, translated text in the target language shares the same label as the source language. However, this shared label becomes less accurate or even unavailable for more complex tasks such as question answering, NER and POS tagging.
To tackle this issue, we further propose an additional KL-divergence self-teaching loss for model training, based on auto-generated soft pseudo-labels for translated text in the target language. Extensive experiments demonstrate that FILTER achieves new state of the art on two challenging multilingual multi-task benchmarks, XTREME and XGLUE.
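To give intuition for the self-teaching objective described above, the following is a minimal sketch of a KL-divergence loss between auto-generated soft pseudo-labels and the model's predictions on translated target-language text. Function names and the temperature value are illustrative assumptions, not taken from the paper:

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits into a probability distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def self_teaching_loss(pseudo_label_logits, prediction_logits, temperature=2.0):
    """Soft pseudo-labels (treated as fixed targets) supervise the model's
    predictions on the translated text; lower loss = closer distributions."""
    soft_labels = softmax(pseudo_label_logits, temperature)
    predictions = softmax(prediction_logits, temperature)
    return kl_divergence(soft_labels, predictions)
```

When the prediction logits match the pseudo-label logits exactly, the loss is (numerically) zero; any mismatch yields a positive penalty, pushing the two distributions together during training.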