Master of Science in Computational Linguistics

Courses & Degree Requirements

Degree Requirements

To earn their degree, students in the UW Master of Science in Computational Linguistics program complete nine courses and a master’s project, for a total of 43 credits. For the master's project, you can choose either a thesis or a six- to 10-week internship.

Six of the courses are required and three are electives. Of the required courses, two are linguistics courses and the other four focus on natural language processing.

Required Courses

LING 550: Introduction to Linguistic Phonetics

Credits: 5

This course provides an introduction to the articulatory and acoustic correlates of phonological features. We'll cover a variety of issues, including the mapping of dynamic events to static representations, phonetic evidence for phonological description, universal constraints on phonological structure and implications of psychological speech-sound categorization for phonological theory.

Prerequisites: LING 200: Introduction to Linguistic Thought or LING 400: Survey of Linguistic Method & Theory

LING 566: Introduction to Syntax for Computational Linguistics

Credits: 3

This course provides an introduction to syntactic analysis and concepts, including part-of-speech types; constituent structure; the syntax-semantics interface; and phenomena such as complementation, raising, control, and passive and long-distance dependencies. We'll place emphasis on formal, precise encoding of linguistic hypotheses and designing grammars so they can be scaled up for practical applications. You'll progressively build a consistent grammar for a fragment of English and will work with problem sets that introduce data and phenomena from other languages.

Prerequisites: LING 200: Introduction to Linguistic Thought or LING 400: Survey of Linguistic Method & Theory

LING 570: Shallow Processing Techniques for Natural Language Processing

Credits: 4

This course covers techniques and algorithms for associating relatively surface-level structures and information with natural language corpora. We'll cover a number of topics, including tokenization/word segmentation, part-of-speech tagging, morphological analysis, named-entity recognition, chunk parsing and word-sense disambiguation. You'll also be introduced to linguistic resources that can be leveraged for these tasks, such as the Penn Treebank and WordNet.

Prerequisites:
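As a toy illustration of the kind of shallow processing this course covers, here is a minimal regex-based tokenizer in Python. This is an illustrative sketch, not course material; the pattern and its behavior are assumptions for demonstration only.

```python
import re

def tokenize(text):
    """Split raw text into tokens: words (allowing an internal
    apostrophe, as in "didn't"), and standalone punctuation marks."""
    return re.findall(r"\w+(?:'\w+)?|[^\w\s]", text)

print(tokenize("Dr. Smith didn't arrive."))
# ['Dr', '.', 'Smith', "didn't", 'arrive', '.']
```

Even this tiny example surfaces a classic tokenization problem: the period after "Dr." is split off as if it ended a sentence, which is exactly the kind of ambiguity more sophisticated techniques address.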

LING 571: Deep Processing Techniques for Natural Language Processing

Credits: 4

This course covers algorithms for associating deep or elaborated linguistic structures with naturally occurring linguistic data, looking at syntax, semantics and discourse. It also explores algorithms that produce natural language strings from input semantic representations.

Prerequisites:
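One representative deep processing algorithm in this space is CKY chart parsing, which recovers constituent structure from a string. The sketch below is a toy recognizer over a made-up grammar and lexicon, offered purely as an illustration of the idea, not as material from the course.

```python
# Toy CKY recognizer for a grammar in Chomsky Normal Form.
# The grammar and lexicon here are invented for demonstration.
GRAMMAR = {         # (B, C) -> A  means rule A -> B C
    ('NP', 'VP'): 'S',
    ('Det', 'N'): 'NP',
    ('V', 'NP'): 'VP',
}
LEXICON = {'the': 'Det', 'a': 'Det', 'dog': 'N', 'cat': 'N', 'saw': 'V'}

def cky_recognize(words):
    n = len(words)
    # table[i][j] holds the nonterminals that can span words[i:j]
    table = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        table[i][i + 1].add(LEXICON[w])
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):          # try every split point
                for b in table[i][k]:
                    for c in table[k][j]:
                        if (b, c) in GRAMMAR:
                            table[i][j].add(GRAMMAR[(b, c)])
    return 'S' in table[0][n]

print(cky_recognize("the dog saw a cat".split()))  # True
```

A full parser would also store backpointers in each cell to reconstruct the tree, and a probabilistic version would keep scores instead of sets.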

LING 572: Advanced Statistical Methods in Natural Language Processing

Credits: 4

This course covers several important machine learning algorithms for natural language processing, including decision trees, k-nearest neighbors (kNN), Naive Bayes, transformation-based learning, support vector machines, maximum entropy and conditional random fields. You'll implement many of these algorithms and apply them to selected NLP tasks.

Prerequisites: LING 570
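To give a flavor of the simplest algorithm on that list, here is a minimal multinomial Naive Bayes text classifier with add-one smoothing, written in plain Python. It is a sketch for illustration only; the tiny training set is invented and nothing here reflects the course's actual assignments.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (token_list, label) pairs."""
    label_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in docs:
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return label_counts, word_counts, vocab, len(docs)

def predict_nb(model, tokens):
    label_counts, word_counts, vocab, n_docs = model
    best_label, best_score = None, float('-inf')
    for label in label_counts:
        # log prior + sum of add-one-smoothed log likelihoods
        score = math.log(label_counts[label] / n_docs)
        total = sum(word_counts[label].values())
        for w in tokens:
            if w in vocab:  # ignore words never seen in training
                score += math.log((word_counts[label][w] + 1)
                                  / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

docs = [("good great fun".split(), 'pos'),
        ("bad awful boring".split(), 'neg'),
        ("great movie".split(), 'pos')]
model = train_nb(docs)
print(predict_nb(model, "great fun".split()))  # 'pos'
```

Working in log space, as here, avoids the numerical underflow that multiplying many small probabilities would cause.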

LING 573: Language Processing Systems & Applications

Credits: 4

This course looks at building coherent NLP systems designed to tackle practical applications. You'll work in groups to build a working end-to-end system for one such application. The specific application varies by year, but examples include: machine (aided) translation, speech interfaces, information retrieval/extraction, natural language query systems, dialogue systems, augmentative and alternative communication, computer-assisted language learning, language documentation/linguistic hypothesis testing, spelling and grammar checking, optical character recognition, handwriting recognition and software localization.

Prerequisites: LING 570, LING 571, LING 572

Elective Courses

  • One 400- or 500-level linguistics course, such as phonology, morphology, syntax, semantics or sociolinguistics
  • One of the elective courses in computational linguistics listed below 
  • One additional course in computational linguistics or a related area

The elective course LING 575: Topics in Computational Linguistics is offered four or five times a year, with new topics offered annually. This course is taught by UW Department of Linguistics faculty as well as guest instructors from other departments and industry experts.

Below is a list of topics that have been covered in LING 575 in recent years. Students may also take LING 567: Knowledge Engineering for Deep Natural Language Processing; LING 574: Deep Learning for Natural Language Processing; or select courses in related fields, such as EE 516: Computer Speech Processing, as electives.

Note: The prerequisites for LING 575 vary by course topic, but typically are LING 570 or LING 571.

LING 575: Summarization

Instructor: Fei Xia
Credits: 3

Summarization is a large research area, and there has been a tremendous amount of work and many shared tasks in this area in the past two decades. In this seminar, we'll spend the first few weeks covering the key components of a summarization system, including content selection, information ordering, and surface realization. Then we'll move on to several varieties of summarization tasks, such as multi-document, guided, and query-oriented summarization. About 70% of class time will be lectures and the remaining 30% will be student presentations. You can choose to form teams or work independently. You'll give presentations, implement some components of summarization systems and write final reports.
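The content-selection step mentioned above can be illustrated with a classic frequency-based baseline: score each sentence by the average document frequency of its words and keep the top-scoring ones in their original order. This sketch is purely illustrative and is not drawn from the seminar.

```python
from collections import Counter

def select_content(sentences, k=2):
    """Frequency-based content selection: rank sentences by the
    average corpus frequency of their words, keep the top k,
    and return them in original document order."""
    words = [w.lower() for s in sentences for w in s.split()]
    freq = Counter(words)

    def score(s):
        toks = s.lower().split()
        return sum(freq[t] for t in toks) / len(toks)

    ranked = sorted(range(len(sentences)),
                    key=lambda i: score(sentences[i]), reverse=True)
    keep = sorted(ranked[:k])  # restore document order
    return [sentences[i] for i in keep]
```

Real systems layer redundancy control and the other components (information ordering, surface realization) on top of a selector like this.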

LING 575: Meaning Making with Artificial Agents

Instructor: Emily M. Bender 
Credits: 3

Humans make sense of language in context, bringing to bear their own understanding of the world, including their model of their interlocutor’s understanding of the world. In this course, we will explore risks that arise when we as humans bring this sense-making capacity to interactions with artificial interlocutors. What happens in conversations where one party has no (or extremely limited) access to meaning and all of the interpretative work rests with the other? How can linguistic pragmatics help us reason about these situations? What design choices are available in language technology to mitigate these risks?

LING 575: Speech Technology for Endangered Languages

Instructor: Gina-Anne Levow 
Credits: 3

The course will cover the theory and practice of speech technology and its application to endangered languages. It will include readings and lectures on general techniques and issues in speech technology and will use publicly available tools and toolkits to investigate the application of speech technology to endangered language data.

LING 575: Analyzing Neural Language Models

Instructor: Shane Steinert-Threlkeld 
Credits: 3

Two recent trends in NLP — the application of deep neural networks and the use of transfer learning — have resulted in many models that achieve high performance on important tasks but whose behavior on those tasks is difficult to interpret. In this seminar, we will look at methods inspired by linguistics and cognitive science for analyzing what large neural language models have in fact learned: diagnostic/probing classifiers, adversarial test sets, and artificial languages, among others. Particular attention will be paid to probing these models’ semantic knowledge, which has received far less attention than their syntactic knowledge. Students will acquire relevant skills and (in small groups) design and execute a linguistically informed analysis experiment, resulting in a report in the form of a publishable conference paper.

LING 575: NLP and the Real World

Instructor: Fei Xia 
Credits: 3

In the past decade, neural network approaches have dominated the NLP field, and many systems claim to achieve or surpass human performance. Meanwhile, some ML/NLP researchers have raised concerns about whether the field is chasing the wrong goal. One issue is that the field is dominated by “leaderboard chasing”: many researchers focus on building models that make incremental improvements and achieve the best results on benchmark data sets, while work on applying NLP techniques to the real world is marginalized. Another issue is that neural network systems can be hard to interpret. In this seminar, we will first look into these issues and discuss the consequences of marginalizing applications. We will then cover several important areas, including large language models, green NLP and reproducibility, ethics in NLP, and explainability for NLP.

LING 575: Societal Impacts of Language Technology

Instructor: Emily M. Bender 
Credits: 3

The goal of this course is to better understand the ethical considerations that arise in the deployment of NLP technology, including how to identify people likely to be impacted by the use of the technology (direct and indirect stakeholders), what kinds of risks the technology poses, and how to design systems in ways that better support stakeholder values. Through discussions of readings in the growing research literature on fairness, accountability, transparency and ethics (FATE) in NLP and allied fields, and value sensitive design, we will seek to answer the following questions:

  • What can go wrong when we use NLP systems, in terms of specific harms to people?
  • How can we fix/prevent/mitigate those harms?
  • What are our responsibilities as NLP researchers and developers in this regard?

Orientation and Industry Talks

Beyond your coursework, you're also expected to attend an orientation at the start of the program and periodic industry talks.

The two-day orientation is held on the UW Seattle campus. It's designed to give you an overview of the program and contextualize the coursework.

The talks are a year-long series by professional computational linguists on product- and research-focused topics. The orientation and talks are broadcast online if you cannot attend in person.