Zheng, Bo and Dong, Li and Huang, Shaohan and Singhal, Saksham and Che, Wanxiang and Liu, Ting and Song, Xia and Wei, Furu. Allocating large vocabulary capacity for cross-lingual language model pre-training. ArXiv preprint, abs/2109.07306, 2021.

Zheng, Bo and Dong, Li and Huang, Shaohan and Wang, Wenhui and Chi, Zewen and Singhal, Saksham and Che, Wanxiang and Liu, Ting and Song, Xia and Wei, Furu. Consistency regularization for cross-lingual fine-tuning. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 3403--3417, 2021.
