Publications

Wan2vec: Embeddings learned on word association norms.

Abstract

Word embeddings are powerful for many tasks in natural language processing. In this work, we learn word embeddings using weighted graphs from word association norms (WAN) with the node2vec algorithm. Although building WANs is a difficult and time-consuming task, training the vectors from these resources is a fast and efficient process, which allows us to obtain good-quality word embeddings from small corpora. We evaluate our word vectors in two ways: intrinsically and extrinsically. The intrinsic evaluation was performed with several word similarity benchmarks (WordSim-353, MC30, MTurk-287, MEN-TR-3k, SimLex-999, MTurk-771 and RG-65) and different similarity measures, achieving better results than those obtained with word2vec, GloVe, and fastText trained on a huge corpus. The extrinsic evaluation was done by measuring the quality of sentence embeddings on transfer tasks: sentiment analysis …
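To make the approach concrete, the following is a minimal sketch of the pipeline the abstract describes: build a weighted graph from word association data and run node2vec over it to obtain word vectors. It assumes the community `networkx` and `node2vec` Python packages, and the association triples below are a toy hypothetical rather than the WAN resources used in the paper.

```python
# Sketch: word embeddings from a word association norm (WAN) graph via node2vec.
# Requires: pip install networkx node2vec
import networkx as nx
from node2vec import Node2Vec

# Hypothetical association data: (cue word, response word, association strength).
# The strength is used as the edge weight of the WAN graph.
associations = [
    ("dog", "cat", 0.35),
    ("dog", "bone", 0.20),
    ("cat", "mouse", 0.30),
    ("bread", "butter", 0.45),
]

graph = nx.Graph()
for cue, response, strength in associations:
    graph.add_edge(cue, response, weight=strength)

# Biased random walks over the weighted graph, followed by skip-gram training
# (the node2vec package wraps gensim's Word2Vec internally).
node2vec = Node2Vec(graph, dimensions=64, walk_length=10, num_walks=50,
                    weight_key="weight", workers=1)
model = node2vec.fit(window=5, min_count=1)

# The resulting vectors can be queried like any gensim KeyedVectors object,
# e.g. for the intrinsic word-similarity evaluations mentioned above.
print(model.wv.most_similar("dog"))
```

Because the walks are generated directly from the association graph rather than from a large text corpus, training is fast even when the underlying resource is small, which is the point the abstract makes about small corpora.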

Date
November 1, 2019
Authors
Gemma Bel-Enguix, Helena Gómez-Adorno, Jorge Reyes-Magaña, Gerardo Sierra
Journal
Semantic Web (ISSN 1570-0844)
Volume
10
Issue
6