deepQuest: a framework for neural-based quality estimation
Authors
Ive, Julia; Blain, Frederic; Specia, Lucia
Issue Date
2018-08
Abstract
Predicting Machine Translation (MT) quality can help in many practical tasks such as MT post-editing. The performance of Quality Estimation (QE) methods has drastically improved recently with the introduction of neural approaches to the problem. However, thus far neural approaches have only been designed for word- and sentence-level prediction. We present a neural framework that is able to accommodate neural QE approaches at these fine-grained levels and generalize them to the level of documents. We test the framework with two sentence-level neural QE approaches: a state-of-the-art approach that requires extensive pre-training, and a new lightweight approach that we propose, which employs basic encoders. Our approach is significantly faster and yields performance improvements for a range of document-level quality estimation tasks. To our knowledge, this is the first neural architecture for document-level QE. In addition, for the first time we apply QE models to the output of both statistical and neural MT systems for a series of European languages and highlight the new challenges resulting from the use of neural MT.
Citation
Ive, J., Blain, F. and Specia, L. (2018) deepQuest: a framework for neural-based quality estimation. In Proceedings of the 27th International Conference on Computational Linguistics, Bender, E. M., Derczynski, L. and Isabelle, P. (eds.). Stroudsburg, PA: Association for Computational Linguistics.
Type
Conference contribution
Language
en
Description
© 2018 The Authors. Published by Association for Computational Linguistics. This is an open access article available under a Creative Commons licence.
The published version can be accessed at the following link on the publisher’s website: https://www.aclweb.org/anthology/C18-1266/
Code available at: https://github.com/sheffieldnlp/deepQuest
ISBN
9781948087506
Sponsors
The development of deepQuest received funding from the European Association for Machine Translation and the Amazon Academic Research Awards program. The first author worked on this paper during a research stay at the University of Sheffield.