Cross-lingual transfer learning and multitask learning for capturing multiword expressions
Taslimipoor, Shiva ; Rohanian, Omid ; Ha, Le An
Issue Date
2019-08-31
Abstract
Recent developments in deep learning have prompted a surge of interest in the application of multitask and transfer learning to NLP problems. In this study, we explore, for the first time, the application of transfer learning (TRL) and multitask learning (MTL) to the identification of Multiword Expressions (MWEs). For MTL, we exploit the shared syntactic information between MWE and dependency parsing models to jointly train a single model on both tasks. We specifically predict two types of labels: MWE and dependency parse. Our neural MTL architecture utilises the supervision of dependency parsing in lower layers and predicts MWE tags in upper layers. In the TRL scenario, we overcome the scarcity of data by learning a model on a larger MWE dataset and transferring the knowledge to a resource-poor setting in another language. In both scenarios, the resulting models achieved higher performance compared to standard neural approaches.
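The layering the abstract describes, with dependency-parsing supervision attached to lower layers and MWE tagging on top of them, is an instance of hard parameter sharing. The following is a minimal NumPy sketch of that layout only; all dimensions, weights, and names are illustrative assumptions and do not reproduce the authors' actual model.

```python
import numpy as np

# Hypothetical toy dimensions: token embedding size, shared hidden size,
# number of dependency labels, number of MWE tags.
rng = np.random.default_rng(0)
emb_dim, hidden, n_dep_labels, n_mwe_tags = 8, 16, 5, 3

W_lower = rng.normal(size=(emb_dim, hidden))       # shared lower layer
W_dep = rng.normal(size=(hidden, n_dep_labels))    # dependency head, attached low
W_upper = rng.normal(size=(hidden, hidden))        # task-specific upper layer
W_mwe = rng.normal(size=(hidden, n_mwe_tags))      # MWE tagging head, attached high

def forward(x):
    """Return (dep_logits, mwe_logits) for one token embedding x."""
    h_lower = np.tanh(x @ W_lower)        # representation shared by both tasks
    dep_logits = h_lower @ W_dep          # dependency supervision at the lower layer
    h_upper = np.tanh(h_lower @ W_upper)  # MWE-specific layer built on the shared one
    mwe_logits = h_upper @ W_mwe          # MWE tags predicted at the upper layer
    return dep_logits, mwe_logits

x = rng.normal(size=emb_dim)
dep_logits, mwe_logits = forward(x)
print(dep_logits.shape, mwe_logits.shape)  # (5,) (3,)
```

Because both loss terms backpropagate through `W_lower`, the dependency task shapes the shared syntactic representation that the MWE head consumes, which is the intuition behind placing parsing supervision below MWE tagging.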
Citation
Taslimipoor, S., Rohanian, O. and Ha, L.A. (2019) Cross-lingual transfer learning and multitask learning for capturing multiword expressions, Proceedings of the Joint Workshop on Multiword Expressions and WordNet (MWE-WN 2019), 2nd August 2019, Florence, Italy, pp. 155–161.
Type
Conference contribution
Language
en
Description
This is an accepted manuscript of an article published by Association for Computational Linguistics in Proceedings of the Joint Workshop on Multiword Expressions and WordNet (MWE-WN 2019), available online: https://www.aclweb.org/anthology/W19-5119
The accepted version of the publication may differ from the final published version.
ISBN
9781950737260