Seminars and Events

ISI Natural Language Seminar

Enhancing Machine Translation with Large Language Models via Optimizing In-Context Examples and Dictionary-Based Prompting

Event Details

REMINDER: *This talk will be a live broadcast only. It will not be recorded.*

Meeting hosts will only admit guests they know to the Zoom meeting, so you are highly encouraged to sign into Zoom with your USC account.

If you are an outside visitor, please inform us beforehand at nlg-seminar-host(at)isi.edu so we are aware of your attendance and can let you in.

In-person attendance is limited to USC/ISI faculty, staff, and students. The talk is open to the public virtually via the Zoom link.

For more information on the NL Seminar series and upcoming talks, please visit:

https://nlg.isi.edu/nl-seminar/ 

Large language models (LLMs) have revolutionized natural language processing by demonstrating impressive abilities to perform a wide range of tasks, including machine translation (MT). However, the quality and domain of the in-context examples used to prompt these models can significantly impact their performance on specific tasks. In this talk, I will discuss two recent papers that propose to optimize in-context examples and leverage bilingual dictionaries to enhance the quality and controllability of MT with LLMs. First, I will explore the impact of in-context examples on the translation quality of LLMs and highlight the challenges of selecting good examples in both in-domain and out-of-domain settings. Then, I will discuss how we can leverage bilingual dictionaries to provide fine-grained phrase-level control hints in the prompts of LLMs.
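As a rough illustration of the dictionary-based prompting idea described in the abstract, the sketch below assembles a translation prompt that combines a few in-context example pairs with phrase-level dictionary hints. The function name, example sentences, hint wording, and prompt layout are illustrative assumptions, not the specific method from the papers.

```python
# Illustrative sketch (not from the papers): building an MT prompt that combines
# in-context translation examples with phrase-level dictionary hints.

def build_prompt(examples, dictionary_hints, source_sentence,
                 src_lang="German", tgt_lang="English"):
    """Assemble a translation prompt for an LLM.

    examples: list of (source, target) demonstration pairs (in-context examples).
    dictionary_hints: dict mapping source phrases to preferred target phrases.
    """
    lines = []
    for src, tgt in examples:
        lines.append(f"{src_lang}: {src}")
        lines.append(f"{tgt_lang}: {tgt}")
        lines.append("")

    # Phrase-level control hints drawn from a bilingual dictionary.
    if dictionary_hints:
        hints = "; ".join(f'"{s}" means "{t}"' for s, t in dictionary_hints.items())
        lines.append(f"Hints: {hints}")
        lines.append("")

    lines.append(f"{src_lang}: {source_sentence}")
    lines.append(f"{tgt_lang}:")
    return "\n".join(lines)


if __name__ == "__main__":
    demos = [
        ("Das Wetter ist heute schön.", "The weather is nice today."),
        ("Ich habe den Bericht gelesen.", "I have read the report."),
    ]
    hints = {"Vertrag": "contract"}
    print(build_prompt(demos, hints, "Der Vertrag wurde gestern unterschrieben."))
```

The resulting prompt string would then be sent to an LLM, with the choice of demonstration pairs and dictionary entries being exactly the selection and control problems the talk addresses.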

Speaker Bio

Marjan Ghazvininejad is a senior research scientist at Facebook AI Research. She received her Ph.D. from the University of Southern California, working on neural creative language generation. Her research interests include text representation, language generation, and machine translation. Her recent research has focused on how to optimize the use of large language models in various applications.