Cross-lingual transfer learning and multitask learning for capturing multiword expressions
Abstract
Recent developments in deep learning have prompted a surge of interest in the application of multitask and transfer learning to NLP problems. In this study, we explore, for the first time, the application of transfer learning (TRL) and multitask learning (MTL) to the identification of Multiword Expressions (MWEs). For MTL, we exploit the shared syntactic information between MWE and dependency parsing models to jointly train a single model on both tasks. We specifically predict two types of labels: MWE and dependency parse. Our neural MTL architecture utilises the supervision of dependency parsing in lower layers and predicts MWE tags in upper layers. In the TRL scenario, we overcome the scarcity of data by learning a model on a larger MWE dataset and transferring the knowledge to a resource-poor setting in another language. In both scenarios, the resulting models achieved higher performance compared to standard neural approaches.
Citation
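The architecture described above can be illustrated with a minimal sketch: a shared lower layer whose representation feeds a dependency-parsing head, and an upper layer stacked on top that feeds an MWE-tagging head. All names, dimensions, and label-set sizes below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

class MultitaskTagger:
    """Hypothetical sketch: dependency supervision on the lower layer,
    MWE tagging on the upper layer. Untrained; forward pass only."""

    def __init__(self, emb_dim=50, hidden_dim=32, n_dep_labels=40, n_mwe_tags=5):
        # Shared lower layer (where dependency-parsing supervision is applied)
        self.W_lower = rng.normal(scale=0.1, size=(emb_dim, hidden_dim))
        # Upper layer stacked on the shared representation (MWE supervision)
        self.W_upper = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
        # Task-specific output projections
        self.W_dep = rng.normal(scale=0.1, size=(hidden_dim, n_dep_labels))
        self.W_mwe = rng.normal(scale=0.1, size=(hidden_dim, n_mwe_tags))

    def forward(self, embeddings):
        lower = relu(embeddings @ self.W_lower)   # shared lower representation
        upper = relu(lower @ self.W_upper)        # refined upper representation
        dep_scores = lower @ self.W_dep           # dependency head reads the lower layer
        mwe_scores = upper @ self.W_mwe           # MWE head reads the upper layer
        return dep_scores, mwe_scores

model = MultitaskTagger()
sentence = rng.normal(size=(7, 50))               # 7 tokens, 50-dim embeddings
dep_scores, mwe_scores = model.forward(sentence)
print(dep_scores.shape, mwe_scores.shape)         # (7, 40) (7, 5)
```

In joint training, the losses of the two heads would be summed, so gradients from the dependency task shape the lower layer that the MWE head builds on.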
Taslimipoor, S., Rohanian, O. and Ha, L.A. (2019) Cross-lingual transfer learning and multitask learning for capturing multiword expressions, Proceedings of the Joint Workshop on Multiword Expressions and WordNet (MWE-WN 2019), 2nd August 2019, Florence, Italy, pp. 155–161.
Additional Links
https://www.aclweb.org/anthology/W19-5119/
Type
Conference contribution
Language
en
Description
This is an accepted manuscript of an article published by the Association for Computational Linguistics in Proceedings of the Joint Workshop on Multiword Expressions and WordNet (MWE-WN 2019), available online: https://www.aclweb.org/anthology/W19-5119. The accepted version of the publication may differ from the final published version.
ISBN
9781950737260
DOI
10.18653/v1/W19-5119
Except where otherwise noted, this item is licensed under https://creativecommons.org/licenses/by/4.0/