The Natural Language Group

Summer Internships

Summer 2021 Internships in Natural Language Processing

We are looking for interested and qualified students (graduate and undergraduate) to spend the summer working with ongoing research projects at USC/ISI on natural language processing, machine learning, statistical modeling, machine translation, creative language generation, and other areas.

These are paid internships, available for a three-month (12-week) period during the summer of 2021. The internships will, if possible, be held in Marina del Rey; however, due to COVID-19 restrictions, they may be conducted virtually. If virtual, interns must nevertheless reside in the United States during the internship.

Good programming skills are required, but prior experience in natural language processing is not. We will provide tutorials on relevant topics at the beginning of the summer.

Important dates

  • 2021 Jan 29 Applications due (Beware: this date may be earlier than the date listed on the general ISI internship page).
  • 2021 Feb 27 (approx.) First acceptance notifications. The notification process may continue until the end of March. We are unable to respond to requests for status updates.
  • 2021 June 1 Internships begin

How to Apply

Please follow this link. You will be required to submit a statement and provide email addresses of up to three people who will write letters of recommendation. 


Project Areas of Interest

  1. Low Resource Neural Machine Translation. Can we use transfer learning, curriculum learning, and active learning to reach high machine translation quality with volumes of training data comparable to what a human second-language learner observes? We will build on our previous work as well as the latest and greatest approaches.
  2. Creative Dialogue. Following on from our recent work, can we use inspiration from improv comedy, soap operas, and other sources to improve creativity, grounding, and fluency in dialogue systems?  
  3. Commonsense Reasoning.  A simple story: Janice got into her car and sped off.  Question: Did she press on the accelerator?  An AI system needs to know a lot to answer simple questions like this.  Can an AI system obtain such knowledge by reading text?
  4. Information Extraction. Abundant knowledge is carried in the exponentially expanding corpora of natural language text. Yet this knowledge is mostly inaccessible to computers and overwhelming for human experts to absorb. Building on a strong foundation, we want to construct knowledge graphs that dramatically increase the accessibility of knowledge through search engines, interactive AI agents, and medical research tools.
  5. Robust Interlingual Representations. Encoding various languages into a unified semantic meaning space, a.k.a. an interlingual representation, is a promising direction for breaking language barriers in NLP technologies. We'd like to develop a framework that incorporates advances in multilingual machine translation, pre-trained language modeling, and robust representation learning to learn interlingual representations that are robust to the variety of languages, imbalanced data distributions, and language-specific data biases.
  6. Meta-Learning for Cross-Lingual Model Transfer. A number of technologies in large-scale pre-trained language modeling, such as BERT, mBERT, XLNet and XLM, have led to impressive successes in (interlingual) representation learning. However, how to effectively apply the learned interlingual representations to downstream tasks has remained largely under-explored. We'd like to investigate applying meta-learning methods to transfer the learned interlingual representations to low-resource languages to enhance multilingual NLP on a broad range of applications, including named entity recognition, coreference resolution, and machine translation. In particular, we are interested in transferring models/representations to new languages and/or domains in few-shot or even zero-shot scenarios.
  7. Understanding event processes. Natural language frequently communicates about events, and events often connect into processes directed at some central goal. Given the event process "fulfilling course requirements" -> "passing qualification exams" -> "publishing papers" -> "doing internships" -> "defending the dissertation", does a machine understand that it leads to the central goal of "earning a degree"? And how do we efficiently teach the machine to understand the salience of events, i.e., that "defending the dissertation" is much more important than "doing internships"? Does such knowledge help downstream tasks like summarization?
  8. Open book question answering. Open book question answering challenges machines because it requires finding the specific answer to a question anywhere in an entire book. Can we instead refine the knowledge in the book? For example, can we use a really good summarization model to summarize the chapters of the book and preserve only the salient information, so that the machine can find the answer in significantly less content? In the same context, how do we teach machines to recognize when a question has no answer?


Research Environment

Summer internship projects are supervised by Jonathan May, Xuezhe Ma, and Muhao Chen. Interns also interact and collaborate closely with the rest of ISI's Natural Language Group. Our group's research environment includes weekly seminars and reading groups, opportunities for teaching and advising, an active program for summer students, large quantities of linguistic resources, and a supercomputing cluster completely dedicated to natural language research at USC/ISI.

USC/ISI is an academic research institute that is part of USC's Viterbi School of Engineering; many USC/ISI scientists hold research faculty positions in the Computer Science Department. The Natural Language Group is part of USC/ISI's Artificial Intelligence Division which carries out a wide range of artificial intelligence research.

USC/ISI is located in Marina del Rey on the Southern California coast, an excellent location convenient to beaches, restaurants, boating, bike paths, and shopping. Note: we are not located on the main campus of USC, which is near downtown LA.

Past Interns

Our summer program is well established! Past students are listed below. Several students (marked *) interned twice, and several (marked ^) joined ISI later as a PhD student, visiting PhD student, or research scientist.

  • 2020: Omar Shaikh (Georgia Tech), Ugur Yavuz (Dartmouth College), Weiqiu You (University of Pennsylvania), Naitian Zhou (University of Michigan)
  • 2019: Justin Cho^ (Hong Kong Univ. of Science and Technology), Denis Emelin (Edinburgh), Zhifeng Hu (Fudan Univ.), Angelina McMillan-Major (UW), Prince Wang (UCSB), Shufan Wang (UMass)
  • 2018: Ronald Cardenas (Charles University in Prague), Mozhdeh Gheini (USC), Xiaolei Huang (Univ. of Colorado), Allison Limke (Wartburg College), James Mullenbach (Georgia Tech), Xinyu Wang (CMU)
  • 2017: Yining Chen (Dartmouth), Leon Cheung (UCSD), Sorcha Gilroy (Edinburgh), Nelson Liu (UW), Alexandra (Sasha) Mayn (Carleton College)
  • 2016: Nada Aldarrab^ (USC), Angeliki Lazaridou (U. Trento), Xiang Li (U. Chicago), Sebastian Mielke (Dresden Univ. of Technology), Ke Tran (U. Amsterdam)
  • 2015: Callum O'Shaughnessy (Queens University), Sudha Rao (Maryland), Wenduan Xu (Cambridge), Barret Zoph (USC)
  • 2014: Julian Schamper (Aachen), Eunsol Choi (Washington), Allen Schmaltz (Harvard), Matic Horvat (Cambridge)
  • 2013: Daniel Bauer* (Columbia), Fabienne Braune (Stuttgart), Jackie Lee (MIT), Elliot Meyerson (Wesleyan), Arvind Neelakantan (Columbia/UMass), Malte Nuhn (Aachen)
  • 2012: Jacob Andreas (Columbia), Daniel Bauer (Columbia), Karl Moritz Hermann (Oxford), Bevan Jones (Edinburgh/Macquarie), Nathan Schneider (CMU), Ada Wan (CUNY).
  • 2011: Licheng Fang (Rochester), Sravana Reddy* (Chicago), Xuchen Yao (JHU).
  • 2010: Yoav Goldberg (Ben Gurion, Israel), Ann Irvine (Hopkins), Sravana Reddy (Chicago), Alexander "Sasha" Rush (MIT).
  • 2009: Michael Auli (University of Edinburgh), Paramveer Dhillon (Penn), Erica Greene^ (Haverford), Adam Pauls (UC Berkeley)
  • 2008: Amittai Axelrod (University of Washington), John DeNero (UC Berkeley), Kyle Gorman (Penn Linguistics), Catalin Tirnauca (Universitat Rovira i Virgili)
  • 2007: Michael Bloodgood (Delaware), Jennifer Gillenwater (Rice University), Carmen Heger (Dresden), Wei Ho (Princeton).
  • 2006: Joseph Turian (NYU), Chenhai Xi (Pitt), Victoria Fossum*^ (Michigan), Liang Huang*^ (Penn), Jason Riesa*^ (JHU), Oana-Diana Postolache^ (Saarland).
  • 2005: Victoria Fossum (Michigan), Mark Hopkins* (UCLA), Liang Huang (Penn), Behrang Mohit (Pitt), Preslav Nakov (Berkeley), Jason Riesa (JHU), Hao Zhang (Rochester).
  • 2004: Madhur Ambastha (Rochester), Michel Galley* (Columbia), David Kauchak (UCSD).
  • 2003: Michel Galley (Columbia), Mark Hopkins (UCLA), Beata Klebanov (Hebrew University), Ana-Maria Popescu (University of Washington), Lara Taylor (UCSD).
  • 2002: Chris Ackerman (USC), Emil Ettelaie (USC), Yuling Hsueh (USC), John Lee (Waterloo/MIT), Bo Pang (Cornell)
  • 2001: Abdessamad Echihabi (USC), Hal Daume III^ (CMU), Michael Laszlo (Waterloo), Dragos Stefan Munteanu^ (Iowa), Rebecca Rees (BYU), Radu Soricut^ (Iowa)
  • 1994-2000: Estibaliz Amorrortu, Vasileios Hatzivassiloglou (Columbia), Michael Jahr (Stanford), Larry Kite (USC), Magdalena Romera (USC), Maki Watanabe (USC).

Intern Publications

We always aim to solve interesting and novel scientific problems, and to publish the results in the best conferences. Sample papers that have come from past student internships:

  • "Do Nuclear Submarines Have Nuclear Captains? A Challenge Dataset for Commonsense Reasoning over Adjectives and Objects", (J. Mullenbach, J. Gordon, N. Peng and J. May), Proc. EMNLP, 2019.
  • "What Matters for Neural Cross-Lingual Named Entity Recognition: An Empirical Analysis", (X. Huang, J. May and N. Peng), Proc. EMNLP, 2019.
  • "A Grounded Unsupervised Universal Part-of-Speech Tagger for Low-Resource Languages", (R. Cardenas, Y. Lin, H. Ji, J. May), Proc. NAACL, 2019.
  • "Recurrent Neural Networks as Weighted Language Recognizers" (Y. Chen, S. Gilroy, A. Maletti, J. May, and K. Knight), Proc. NAACL, 2018. Outstanding Paper Award.
  • "Biomedical Event Extraction using Abstract Meaning Representation" (S. Rao, D. Marcu, K. Knight, and H. Daume), Proc. BioNLP Workshop, ACL, 2017.
  • "Unsupervised Neural Hidden Markov Models" (K. Tran, Y. Bisk, A. Vaswani, D. Marcu, and K. Knight), Proceedings of the EMNLP Workshop on Structured Prediction, 2016.
  • "Multi-Source Neural Translation" (B. Zoph and K. Knight), Proceedings of NAACL 2016.
  • "Extracting Structured Scholarly Information from the Machine Translation Literature" (E. Choi, M. Horvat, J. May, K. Knight, D. Marcu), Proceedings of LREC 2016.
  • "Cipher Type Detection" (Malte Nuhn and Kevin Knight), Proceedings of EMNLP 2014.
  • "Mapping between English Strings and Reentrant Semantic Graphs" (F. Braune, D. Bauer, and K. Knight), Proceedings of LREC 2014.
  • "Parsing Graphs with Hyperedge Replacement Grammars" (D. Chiang, J. Andreas, D. Bauer, K.-M. Hermann, B. Jones and K. Knight), Proceedings of ACL 2013.
  • "Learning Whom to Trust with MACE" (D. Hovy, T. Berg-Kirkpatrick, A. Vaswani, and E. Hovy), Proceedings of NAACL 2013.
  • "Semantics-Based Machine Translation with Hyperedge Replacement Grammars" (B. Jones, J. Andreas, D. Bauer, K.-M. Hermann, K. Knight), Proceedings of COLING 2012.
  • "Feature-Rich Language-Independent Syntax-Based Alignment for Statistical Machine Translation" (J. Riesa, A. Irvine, D. Marcu), Proceedings of EMNLP 2011.
  • "Language-independent parsing with empty elements" (S. Cai, D. Chiang, Y. Goldberg), Proceedings of ACL 2011.
  • "Automatic Analysis of Rhythmic Poetry with Applications to Generation and Translation" (E. Greene, T. Bodrumlu, K. Knight), Proceedings of EMNLP 2010.
  • "Efficient optimization of an MDL-inspired objective function for unsupervised part-of-speech tagging" (A. Vaswani, A. Pauls, D. Chiang), Proceedings of ACL 2010.
  • "Unsupervised Syntactic Alignment with Inversion Transduction Grammars" (A. Pauls, D. Klein, D. Chiang, K. Knight), Proceedings of NAACL 2010.
  • "Bayesian Inference for Finite-State Transducers" (D. Chiang, J. Graehl, K. Knight, A. Pauls, S. Ravi), Proceedings of NAACL 2010.
  • "Binarization of Synchronous Context-Free Grammars" (L. Huang, H. Zhang, D. Gildea, K. Knight), Computational Linguistics, 2009.
  • "Fast Consensus Decoding over Translation Forests" (J. DeNero, D. Chiang, and K. Knight). Proceedings of ACL 2009.
  • "Forest Rescoring: Faster Decoding with Integrated Language Models" (L. Huang and D. Chiang), Proceedings of ACL 2007.
  • "Scalable Inference and Training of Context-Rich Syntactic Models" (M. Galley, J. Graehl, K. Knight, D. Marcu, S. DeNeefe, W. Wang, and I. Thayer), Proceedings of ACL 2006, poster session.
  • "Synchronous Binarization for Machine Translation" (H. Zhang, L. Huang, D. Gildea, K. Knight), Proceedings of NAACL 2006.
  • "Statistical Syntax-Directed Translation with Extended Domain of Locality" (L. Huang, K. Knight, A. Joshi), Proceedings of the Conference of the Association for Machine Translation in the Americas (AMTA-06).
  • "Building an English-Iraqi Arabic Machine Translation System for Spoken Utterances with Limited Resources" (J. Riesa, B. Mohit, K. Knight, D. Marcu), Proceedings of Interspeech 2006.
  • "Text Simplification for Information Seeking Applications" (B. Beigman Klebanov, K. Knight, D. Marcu), In: On the Move to Meaningful Internet Systems, eds. R. Meersman and Z. Tari, Lecture Notes in Computer Science (3290), Springer-Verlag, 2004.
  • "What's in a Translation Rule?" (M. Galley, M. Hopkins, K. Knight, D. Marcu), Proceedings of NAACL 2004.
  • "Syntax-based Alignment of Multiple Translations: Extracting Paraphrases and Generating New Sentences" (B. Pang, K. Knight, and D. Marcu), Proceedings of NAACL 2003.
  • "Using a Large Monolingual Corpus to Improve Translation Accuracy" (R. Soricut, K. Knight, and D. Marcu), Proceedings of the 6th Association for Machine Translation in the Americas Conference (AMTA-2002).
  • "Processing Comparable Corpora With Bilingual Suffix Trees" (D. Munteanu and D. Marcu), Proceedings of EMNLP 2002.
  • "A Noisy-Channel Model for Document Compression" (H. Daume III and D. Marcu), Proceedings of ACL 2002.
  • "An Unsupervised Approach to Recognizing Discourse Relations" (D. Marcu and A. Echihabi), Proceedings of ACL 2002.
  • "Fast Decoding and Optimal Decoding for Machine Translation" (U. Germann, M. Jahr, K. Knight, D. Marcu, and K. Yamada), Proceedings of ACL 2001. ACL Best Paper award.
  • "An Empirical Study in Multilingual Natural Language Generation: What Should a Text Planner Do?" (D. Marcu, L. Carlson, and M. Watanabe), The 1st International Conference on Natural Language Generation INLG'2000, Mitzpe Ramon, Israel, 2000.
  • "Experiments in Constructing a Corpus of Discourse Trees" (D. Marcu, E. Amorrortu, and M. Romera), ACL'99 Workshop on Standards and Tools for Discourse Tagging, Univ. Maryland, 1999.
  • "Two-Level, Many-Paths Generation," (K. Knight and V. Hatzivassiloglou), Proceedings of ACL 1995.

Frequently Asked Questions

Q: Should I include anything in my application in addition to a statement of purpose and CV (e.g. sample publications, awards, certificates, etc.)?

A: No. We will discard, unread, any supplemental material. We only read your statement of purpose, CV, and letters sent by your recommenders.

Q: Is the salary enough for a decent life in westside LA? What will the exact salary be?

A: Yes, of course! Our internship compensation is competitive with industrial internships. Housing is generally expensive in this area (because it's safe, beautiful, and close to the ocean), but definitely affordable on the salary we offer. The exact amount is yet to be determined (and will be stated in the offer letter), but again, it will be enough for a decent life for three months.

Q: Where will I live during the internship?

A: Apartments in Marina del Rey proper can be very expensive but short-term rentals are often available in Palms, Culver City, Del Rey, Venice, and Santa Monica (these are names of neighborhoods and/or towns nearby and will help your search). You might want to consider teaming up with other interns; we will put you in touch. Increasingly, interns find housing near USC's main campus and take the free daily shuttle in to ISI. The method of finding housing changes over time; as of this writing it is frequently done via social media platforms: try looking for 'USC housing' groups on Facebook or WeChat.  

Q: During the internship, can I go to a conference for a week or so? Or a short vacation?

A: Conferences are definitely OK, especially when you have a paper there, but in any case there should be at least 12 weeks of work here (otherwise it's hard to get anything sizable done). We generally discourage vacations of more than a week during the internship.

Q: My summer break does not line up with your schedule. Will you still consider my application?

A: We can accommodate early/late arrivals/departures of no more than two weeks, as long as you complete 12 weeks of work here (see above). This should be sufficient to accommodate US semester and quarter systems.

Q: Can I keep working on the projects after going back to my own school?

A: In general yes, especially when you are writing up a paper on the topic. Most likely you will be logging in remotely to work on our machines.

Q: Can I survive without a car here?

A: For three months, definitely yes. Many of our past interns did not own a car while here; they either biked or took a bus to ISI. Unlike some other parts of LA, this area has reliable bus service. The famous Santa Monica "Big Blue" buses serve UCLA, Santa Monica, Palms, Venice, ISI, and LAX; Culver City bus lines serve Culver City, Venice, ISI, and LAX; and LA Metro buses and trains can take you to downtown LA and beyond. Additionally, a free shuttle runs on work days between USC's main campus and ISI; this is especially convenient because many interns find lodging there. Furthermore, LAX is very close to ISI (10 minutes by bus), so air travel is convenient.

Q: Are international students eligible to apply?

A: Yes, we do take on international students (see the past interns list). For international students currently studying in the United States (F-1 holders), we will help you obtain OPT or CPT status on top of your F-1, which is generally straightforward. CPT is generally preferred because it is approved much faster, but it requires you to register for (at least) one unit in the summer. OPT usually takes 2-3 months to be approved, but you don't need to register for any units. For details about CPT/OPT, please consult your school's international student office. For international students currently studying outside the United States, we will help you get a J-1 visa. However, if you do not already have a Social Security number, you should plan to come to the United States at least two weeks before you are to begin working in order to have enough time to obtain work authorization.

If we are forced to conduct the internship program remotely (e.g. due to COVID-19) you must still reside in the United States for the duration of the internship.

Q: I have other plans in the summer, so can I intern during Fall or Spring?

A: No, we only take summer interns (and they have to start within two weeks of our official start date).