Publications
Graph embedding with personalized context distribution
Abstract
Graph representation learning embeds graph nodes in a low-dimensional latent space, which allows for mathematical operations on nodes using low-dimensional vectors for downstream tasks, such as link prediction, node classification, and recommendation. Traditional graph embedding methods rely on hyper-parameters to capture the rich variation hidden in the structure of real-world graphs. In many applications, it may not be computationally feasible to search for optimal hyper-parameters. In this work, building on WatchYourStep, a graph embedding method that leverages graph attention, we propose a method that utilizes node-personalized context attention to capture the local variation in a graph structure. Specifically, we replace the context distribution shared among nodes with a learnable personalized context distribution for each node. We evaluate our model on seven real-world graphs and show that our …
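The sketch below illustrates the core idea described in the abstract: replacing a single context distribution shared by all nodes with a per-node, learnable one. It is a minimal, hypothetical illustration in PyTorch, not the authors' implementation; the tensor names, shapes, and the placeholder transition matrices are assumptions made for the example.

```python
import torch
import torch.nn.functional as F

num_nodes, window = 1000, 5  # N nodes, context window of size C (assumed values)

# Shared context distribution: one learnable attention vector for all nodes.
shared_logits = torch.nn.Parameter(torch.zeros(window))
shared_weights = F.softmax(shared_logits, dim=0)            # shape (C,)

# Personalized context distribution: one learnable attention vector per node.
personal_logits = torch.nn.Parameter(torch.zeros(num_nodes, window))
personal_weights = F.softmax(personal_logits, dim=1)         # shape (N, C)

# Powers of the transition matrix T^1 ... T^C stacked as (C, N, N);
# random placeholder here, in practice derived from the graph.
T_powers = torch.rand(window, num_nodes, num_nodes)

# Shared: every node mixes the context distances with the same weights.
expected_shared = torch.einsum('c,cij->ij', shared_weights, T_powers)

# Personalized: node i mixes the context distances with its own weights,
# letting the expected co-occurrence counts adapt to local graph structure.
expected_personal = torch.einsum('ic,cij->ij', personal_weights, T_powers)
```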
- Date: April 20, 2020
- Authors: Di Huang, Zihao He, Yuzhong Huang, Kexuan Sun, Sami Abu-El-Haija, Bryan Perozzi, Kristina Lerman, Fred Morstatter, Aram Galstyan
- Book: Companion Proceedings of the Web Conference 2020
- Pages: 655-661