Seminars and Events

ISI Natural Language Seminar

NL Seminar: 1. Leveraging Abstract Meaning Representations to Amplify the Semantic Information Captured in Transformer Models; 2. Improving a Multilingual Encoder with a Contrastive Objective and Luna

Event Details

1. Shira Wein – Abstract:
Though state-of-the-art language models perform well on a variety of natural language processing tasks, these models are not exposed to explicit semantic information. We propose that language models’ ability to capture semantic information can be improved through the inclusion of explicit semantic information in the form of meaning representations, thus improving performance on select downstream tasks. We discuss potential ways to incorporate meaning representations and present our preliminary results.

2. Leo Zeyu Liu – Abstract:
Transformers have been successfully adapted to multilingual pretraining. With only token-level losses such as masked language modeling, a transformer encoder can produce good token and sentence representations. We propose to explicitly impose sentence-level objectives using contrastive learning to further improve the multilingual encoder. Furthermore, we also propose to combine this modification with what a new transformer architecture, Luna, offers: disentanglement between token and sentence representations. We will also discuss ways to evaluate the models and present our experimental progress.
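The abstract does not specify the exact form of the sentence-level contrastive objective; a common choice for parallel-text pretraining is an in-batch InfoNCE loss, where each sentence pair is pulled together and all other pairs in the batch act as negatives. The sketch below (function name, temperature value, and NumPy setting are illustrative assumptions, not the speaker's implementation) shows the idea:

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Illustrative in-batch InfoNCE contrastive loss over sentence embeddings.

    anchors[i] and positives[i] are embeddings of an aligned sentence pair;
    every other row in the batch serves as an in-batch negative.
    """
    # L2-normalize so dot products are cosine similarities.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature  # (batch, batch) similarity matrix

    # Cross-entropy with the matching pair (the diagonal) as the target class.
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
loss_aligned = info_nce_loss(x, x)                       # identical pairs: low loss
loss_random = info_nce_loss(x, rng.normal(size=(4, 8)))  # unrelated pairs: higher loss
```

Minimizing such a loss encourages sentence representations of translations to be close while pushing apart representations of unrelated sentences, which is the sentence-level signal the token-level masked language modeling loss alone does not provide.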

REMINDER: Meeting hosts only admit guests that they know to the Zoom meeting. Hence, you’re highly encouraged to use your USC account to sign into Zoom. If you’re an outside visitor, please inform nlg DASH seminar DASH admin2 AT isi.edu beforehand so we’ll be aware of your attendance and let you in.

Speaker Bios

1. Shira Wein - Bio:

Shira Wein is an intern at ISI and a third-year Ph.D. student at Georgetown University, working on semantic representations and multilingual/cross-lingual applications. Her previous work centers on L2 corpora, Abstract Meaning Representations, and information extraction from design documents, the last of which she published on while interning at the Jet Propulsion Lab. Prior to starting her Ph.D., Shira was an undergraduate at Lafayette College, where she received a B.S. in Computer Science and a B.A. in Spanish.

2. Leo Zeyu Liu - Bio:

Leo Zeyu Liu is a Master’s student in Computer Science at the University of Washington, advised by Noah A. Smith and Shane Steinert-Threlkeld. His research focuses on interpretability, pretraining, and the intersection of NLP and linguistics. He completed his bachelor’s degree in Computer Science at the University of Washington.