Analogy Training Multilingual Encoders
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
Documents
- Analogy Training Multilingual Encoders
Final published version, 279 KB, PDF document
Language encoders encode words and phrases in ways that capture their local semantic relatedness, but are known to be globally inconsistent. Global inconsistency can seemingly be corrected for, in part, by leveraging signals from knowledge bases, but previous results are partial and limited to monolingual English encoders. We extract a large-scale multilingual, multi-word analogy dataset from Wikidata for diagnosing and correcting for global inconsistencies, and then implement a four-way Siamese BERT architecture for grounding multilingual BERT (mBERT) in Wikidata through analogy training. We show that analogy training not only improves the global consistency of mBERT, as well as the isomorphism of language-specific subspaces, but also leads to consistent gains on downstream tasks such as bilingual dictionary induction and sentence retrieval.
Original language | English |
---|---|
Title of host publication | Proceedings of the 35th AAAI Conference on Artificial Intelligence |
Number of pages | 10 |
Publisher | AAAI Press |
Publication date | 2021 |
Pages | 12884-12892 |
ISBN (Electronic) | 978-1-57735-866-4 |
Publication status | Published - 2021 |
Event | 35th AAAI Conference on Artificial Intelligence, Virtual. Duration: 2 Feb 2021 → 9 Feb 2021 |
Conference
Conference | 35th AAAI Conference on Artificial Intelligence |
---|---|
City | Virtual |
Period | 02/02/2021 → 09/02/2021 |
Series | Proceedings of the AAAI Conference on Artificial Intelligence |
---|---|
Number | 14 |
Volume | 35 |
ISSN | 2374-3468 |
ID: 300671526