
Efficient Representations for Natural Language Processing via Hash Functions

When:
Friday, April 06, 2018, 11:00am - 12:00pm PST
Where:
6th Floor conference room
This event is open to the public.
Type:
AI Seminar
Speaker:
Sahil Garg, USC-ISI
Video:
https://bluejeans.com/s/hh1ek
Description:

Recent successes of large-scale machine-learning models, such as deep nets, demonstrate both the importance of including a large number of parameters in a model, to achieve high flexibility in expressing a given phenomenon, and the importance of developing efficient, scalable algorithms for learning such models on very large training datasets. In this work, we focus primarily on kernel methods, with some contributions that apply generically to neural networks as well. In the literature on kernel similarity functions, it is common to use stationary kernels or non-generic nonstationary kernels with limited expressive power, partly due to an insufficient number of parameters for capturing complex phenomena. In our work, we propose to incorporate additional parameters into a kernel-based model via a generic nonstationary extension. For scalable yet robust learning of kernel functions, we propose learning algorithms based on stochastic sub-sampling, hash functions, and our novel tight bounds on Shannon information measures. The practical applicability of this work is demonstrated on two real-world tasks: (1) information extraction from the Abstract Meaning Representation (AMR) of text, with applications to drug discovery for cancer, and (2) dialog modeling in psychotherapy.
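
To make the role of hash functions and sub-sampling concrete, below is a minimal, illustrative Python sketch of kernelized hashing: each example is represented by binary codes derived from its kernel similarities to a small randomly sub-sampled reference set, so that Hamming distance between codes roughly tracks kernel dissimilarity. This is a generic sketch of the idea, not the speaker's exact method; the RBF kernel, the median-based thresholds, and all function names are assumptions made for illustration.

# Illustrative sketch only; not the method presented in the talk.
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Standard (stationary) RBF kernel matrix between rows of X and Y.
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fit_hash_thresholds(K_ref):
    # One threshold per reference column; the median gives balanced bits.
    return np.median(K_ref, axis=0)

def hash_codes(K, thresholds):
    # Binarize kernel similarities into compact binary hash codes.
    return (K > thresholds).astype(np.uint8)

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 10))
X_test = rng.normal(size=(20, 10))

# Stochastic sub-sampling: use a small random reference set instead of
# computing the full n-by-n kernel matrix.
ref_idx = rng.choice(len(X_train), size=16, replace=False)
X_ref = X_train[ref_idx]

K_train = rbf_kernel(X_train, X_ref)
thresholds = fit_hash_thresholds(K_train)
train_codes = hash_codes(K_train, thresholds)
test_codes = hash_codes(rbf_kernel(X_test, X_ref), thresholds)

# Hamming distance between codes approximates kernel dissimilarity,
# enabling cheap nearest-neighbor lookups over hashed representations.
hamming = (test_codes[:, None, :] != train_codes[None, :, :]).sum(-1)
nearest = hamming.argmin(axis=1)
print(nearest[:5])

In the same spirit, learned hashcode representations can serve as compact features for downstream classifiers, which is one way such representations become useful for tasks like information extraction.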

Bio: Sahil Garg is a PhD candidate in the Computer Science Department of USC, advised by Prof. Aram Galstyan.
