Publications

Unsupervised Dependency Parsing with Transferring Distribution via Parallel Guidance and Entropy Regularization

Abstract

We present a novel approach for inducing unsupervised dependency parsers for languages that have no labeled training data but do have translated text in a resource-rich language. We train probabilistic parsing models for resource-poor languages by transferring cross-lingual knowledge from a resource-rich language with entropy regularization. Our method can be used as a purely monolingual dependency parser, requiring no human translations for the test data, thus making it applicable to a wide range of resource-poor languages. We perform experiments on three data sets—versions 1.0 and 2.0 of the Google Universal Dependency Treebanks, and treebanks from the CoNLL shared tasks—across ten languages. We obtain state-of-the-art performance on all three data sets when compared with previously studied unsupervised and projected parsing systems.
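As an illustrative sketch (not the paper's exact notation): entropy-regularized cross-lingual transfer of this kind is commonly formulated as an interpolated objective that fits the target-language model to distributions projected from the source parser on parallel text, while minimizing the model's conditional entropy on unlabeled target-language sentences:

```latex
\[
J(\theta) \;=\; \lambda \sum_{(x,\,x') \in P}
  \mathrm{KL}\!\left( p_s(y \mid x') \,\middle\|\, p_\theta(y \mid x) \right)
\;+\; (1 - \lambda) \sum_{x \in U} H\!\left( p_\theta(y \mid x) \right)
\]
```

Here \(p_s\) denotes the source (resource-rich) parser's distribution over trees transferred through the parallel sentence pairs \(P\), \(p_\theta\) the target-language model being trained, \(H\) the conditional entropy over parse trees on unlabeled target data \(U\), and \(\lambda\) a trade-off weight. All symbols are hypothetical placeholders for the general technique the abstract names, not definitions taken from the paper.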

Date
2014
Authors
Xuezhe Ma, Fei Xia
Conference
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (ACL 2014)
Volume
1
Pages
1337--1348
Publisher
Association for Computational Linguistics