Analogy Training Multilingual Encoders
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
Standard
Analogy Training Multilingual Encoders. / Garneau, Nicolas; Hartmann, Mareike; Sandholm, Anders; Ruder, Sebastian; Vulić, Ivan; Søgaard, Anders.
Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21). AAAI Press, 2021. p. 12884-12892. (Proceedings of the AAAI Conference on Artificial Intelligence; No. 14, Vol. 35).
Bibtex
@inproceedings{garneau2021analogy,
  title     = {Analogy Training Multilingual Encoders},
  author    = {Garneau, Nicolas and Hartmann, Mareike and Sandholm, Anders and Ruder, Sebastian and Vuli{\'c}, Ivan and S{\o}gaard, Anders},
  booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence},
  volume    = {35},
  number    = {14},
  pages     = {12884--12892},
  publisher = {AAAI Press},
  year      = {2021}
}
RIS
TY - GEN
T1 - Analogy Training Multilingual Encoders
AU - Garneau, Nicolas
AU - Hartmann, Mareike
AU - Sandholm, Anders
AU - Ruder, Sebastian
AU - Vulić, Ivan
AU - Søgaard, Anders
PY - 2021
Y1 - 2021
N2 - Language encoders encode words and phrases in ways that capture their local semantic relatedness, but are known to be globally inconsistent. Global inconsistency can seemingly be corrected for, in part, by leveraging signals from knowledge bases, but previous results are partial and limited to monolingual English encoders. We extract a large-scale multilingual, multi-word analogy dataset from Wikidata for diagnosing and correcting for global inconsistencies, and then implement a four-way Siamese BERT architecture for grounding multilingual BERT (mBERT) in Wikidata through analogy training. We show that analogy training not only improves the global consistency of mBERT and the isomorphism of its language-specific subspaces, but also leads to consistent gains on downstream tasks such as bilingual dictionary induction and sentence retrieval.
AB - Language encoders encode words and phrases in ways that capture their local semantic relatedness, but are known to be globally inconsistent. Global inconsistency can seemingly be corrected for, in part, by leveraging signals from knowledge bases, but previous results are partial and limited to monolingual English encoders. We extract a large-scale multilingual, multi-word analogy dataset from Wikidata for diagnosing and correcting for global inconsistencies, and then implement a four-way Siamese BERT architecture for grounding multilingual BERT (mBERT) in Wikidata through analogy training. We show that analogy training not only improves the global consistency of mBERT and the isomorphism of its language-specific subspaces, but also leads to consistent gains on downstream tasks such as bilingual dictionary induction and sentence retrieval.
M3 - Article in proceedings
T3 - Proceedings of the AAAI Conference on Artificial Intelligence
SP - 12884
EP - 12892
VL - 35
IS - 14
BT - Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21)
PB - AAAI Press
Y2 - 2 February 2021 through 9 February 2021
ER -