Authors
Gajbhiye, Amit
Fomicheva, Marina
Alva-Manchego, Fernando
Blain, Frederic
Obamuyide, Abiola
Aletras, Nikolaos
Specia, Lucia
Issue Date
2021-08-01
Abstract
Quality Estimation (QE) is the task of automatically predicting Machine Translation quality in the absence of reference translations, making it applicable in real-time settings, such as translating online social media conversations. Recent success in QE stems from the use of multilingual pre-trained representations, where very large models lead to impressive results. However, the inference time, disk and memory requirements of such models do not allow for wide usage in the real world. Models trained on distilled pre-trained representations remain prohibitively large for many usage scenarios. We instead propose to directly transfer knowledge from a strong QE teacher model to a much smaller model with a different, shallower architecture. We show that this approach, in combination with data augmentation, leads to light-weight QE models that perform competitively with distilled pre-trained representations with 8x fewer parameters.
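The distillation setup summarised in the abstract, training a small, shallow student to reproduce a strong teacher's sentence-level quality predictions on augmented data, can be made concrete with a short sketch. The snippet below is a hypothetical PyTorch illustration, not the authors' code: the StudentQE architecture, its dimensions, and the distillation_step helper are all assumptions made for clarity. The key point it shows is that the student is fit to the teacher's predicted scores (a regression target) rather than to gold quality labels, which is what lets unlabeled, augmented source-translation pairs be used as training data.

    import torch
    import torch.nn as nn

    class StudentQE(nn.Module):
        """Hypothetical lightweight student: embeddings + BiLSTM + regression head."""
        def __init__(self, vocab_size=30000, emb_dim=128, hidden=128):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            self.rnn = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
            self.head = nn.Linear(2 * hidden, 1)

        def forward(self, token_ids):
            h, _ = self.rnn(self.emb(token_ids))          # (batch, seq, 2*hidden)
            return self.head(h.mean(dim=1)).squeeze(-1)   # sentence-level score

    def distillation_step(student, optimizer, token_ids, teacher_scores):
        """One distillation step: MSE between the student's predictions and the
        teacher's predicted quality scores (no gold labels required)."""
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(student(token_ids), teacher_scores)
        loss.backward()
        optimizer.step()
        return loss.item()

    # Toy usage; random tensors stand in for tokenized (augmented) data and
    # for the scores a large pre-trained teacher QE model would predict.
    student = StudentQE()
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    batch = torch.randint(1, 30000, (16, 40))   # token ids
    teacher_scores = torch.rand(16)             # teacher QE predictions
    print(distillation_step(student, opt, batch, teacher_scores))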
Citation
Gajbhiye, A., Fomicheva, M., Alva-Manchego, F., Blain, F., Obamuyide, A., Aletras, N. & Specia, L. (2021) Knowledge distillation for quality estimation. In: Zong, C., Xia, F., Li, W. and Navigli, R., (eds.) Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 01-06 Aug 2021, Bangkok, Thailand (virtual conference). Association for Computational Linguistics (ACL), pp. 5091-5099.
Additional Links
https://2021.aclweb.org/
Type
Conference contribution
Language
en
Description
© 2021 The Authors. Published by ACL. This is an open access article available under a Creative Commons licence. The published version can be accessed at the following link on the publisher's website: https://aclanthology.org/2021.findings-acl.452
DOI
10.18653/v1/2021.findings-acl.452
Except where otherwise noted, this item is licensed under a Creative Commons Attribution 4.0 licence: https://creativecommons.org/licenses/by/4.0/