TransQuest: Translation quality estimation with cross-lingual transformers
Abstract
Recent years have seen big advances in the field of sentence-level quality estimation (QE), largely as a result of using neural-based architectures. However, the majority of these methods work only on the language pair they are trained on and need retraining for new language pairs. This process can prove difficult from a technical point of view and is usually computationally expensive. In this paper we propose a simple QE framework based on cross-lingual transformers, and we use it to implement and evaluate two different neural architectures. Our evaluation shows that the proposed methods achieve state-of-the-art results, outperforming current open-source quality estimation frameworks when trained on datasets from WMT. In addition, the framework proves very useful in transfer learning settings, especially when dealing with low-resourced languages, allowing us to obtain very competitive results.
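The abstract describes sentence-level QE built on top of a cross-lingual transformer. As an illustrative sketch only, and not the released TransQuest implementation, the snippet below shows one plausible way to realise such a setup: a pretrained cross-lingual encoder receives the source sentence and its machine translation as a single pair, and a regression head predicts a sentence-level quality score. The model name "xlm-roberta-base", the class name SentenceQE, and the single linear head are assumptions made for illustration; the paper itself evaluates two architectures whose details are given in the full text.

# Minimal sketch of sentence-level QE with a cross-lingual transformer.
# Assumes the HuggingFace `transformers` and `torch` packages are installed.
# This is an illustrative example, not the paper's exact architecture.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class SentenceQE(nn.Module):
    def __init__(self, model_name="xlm-roberta-base"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        # Single linear head regressing one quality score (e.g. HTER or DA).
        self.head = nn.Linear(self.encoder.config.hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]        # first-token representation of the pair
        return self.head(cls).squeeze(-1)        # one score per source-translation pair

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = SentenceQE()

# Encode the source sentence and the MT output as a single cross-lingual pair.
batch = tokenizer(
    ["The cat sat on the mat."],                 # source sentence
    ["Le chat s'est assis sur le tapis."],       # machine translation
    padding=True, truncation=True, return_tensors="pt",
)
score = model(batch["input_ids"], batch["attention_mask"])
print(score)  # untrained output; in practice train with an MSE loss against gold quality labels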
Citation
Ranasinghe, T., Orasan, C. and Mitkov, R. (2020) TransQuest: Translation quality estimation with cross-lingual transformers. In Proceedings of the 28th International Conference on Computational Linguistics, pages 5070–5081, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Journal
Proceedings of the 28th International Conference on Computational Linguistics
Additional Links
https://aclanthology.org/2020.coling-main.445/
Type
Conference contribution
Language
en
Description
© 2020 The Authors. Published by International Committee on Computational Linguistics. This is an open access article available under a Creative Commons licence. The published version can be accessed at the following link on the publisher’s website: https://aclanthology.org/2020.coling-main.445/
ISBN
9781952148279
DOI
10.18653/v1/2020.coling-main.445
Except where otherwise noted, this item's license is described as https://creativecommons.org/licenses/by/4.0/