Single Course Enrollment
Interested in taking a single course, or even several courses, before you commit to the full Master of Science in Computational Linguistics? The courses below are open for single course enrollment through UW Continuum College.
LING 473: Basics for Computational Linguistics
Credits: 3
Computational linguistics builds on the theory and practice of multiple fields (linguistics, computer science and statistics) to design computer applications that automatically process natural language speech or text. This course reinforces the most important skills from those contributing disciplines to prepare students for further study in computational linguistics. We'll cover the following topics: UNIX and server cluster usage; probability and statistics (random variables and random vectors; conditional, joint and marginal probabilities; the chain rule; Bayes' rule; independence and conditional dependence); formal grammars and languages (the Chomsky hierarchy, regular expressions and regular languages, context-free grammars and other grammar formalisms); finite-state automata and transducers; and algorithms and data structures.
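To give a flavor of the formal-language material, here is a minimal sketch (an illustration only, not actual course code) of a two-state finite-state automaton in Python that accepts binary strings containing an even number of 1s:

```python
# A minimal sketch: simulating a deterministic finite-state automaton
# that accepts binary strings with an even number of 1s.

def accepts(s: str) -> bool:
    """Run a two-state DFA over the input string."""
    # States: "even" (accepting) and "odd"; reading a '1' flips the state.
    transitions = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    }
    state = "even"
    for ch in s:
        state = transitions[(state, ch)]
    return state == "even"

print(accepts("1011"))    # False: three 1s
print(accepts("110011"))  # True: four 1s
```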
This course is required only for the Certificate in Natural Language Technology. It is not required for the Master of Science in Computational Linguistics.
LING 550: Introduction to Linguistic Phonetics
Credits: 5
This course provides an introduction to the articulatory and acoustic correlates of phonological features. Issues covered include the mapping of dynamic events to static representations, phonetic evidence for phonological description, universal constraints on phonological structure and implications of psychological speech-sound categorization for phonological theory.
Prerequisites: LING 200: Introduction to Linguistic Thought or LING 400: Survey of Linguistic Method & Theory
LING 566: Introduction to Syntax for Computational Linguistics
Credits: 3
This course provides an introduction to syntactic analysis and concepts, including part-of-speech types, constituent structure, the syntax-semantics interface, and phenomena such as complementation, raising, control, passives and long-distance dependencies. Emphasis will be placed on formal, precise encoding of linguistic hypotheses and on designing grammars so they can be scaled up for practical applications. Students will progressively build a consistent grammar for a fragment of English and will work with problem sets that introduce data and phenomena from other languages.
Prerequisites: LING 200: Introduction to Linguistic Thought or LING 400: Survey of Linguistic Method & Theory
LING 570: Shallow Processing Techniques for Natural Language Processing
Credits: 4
This course covers techniques and algorithms for associating relatively surface-level structures and information with natural language corpora. Topics covered include tokenization/word segmentation, part-of-speech tagging, morphological analysis, named-entity recognition, chunk parsing and word-sense disambiguation. Students will also be introduced to linguistic resources that can be leveraged for these tasks, such as the Penn Treebank and WordNet.
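As a rough illustration of this kind of shallow processing (the tiny tagged corpus below is invented for the example), here is a sketch of regular-expression tokenization followed by a most-frequent-tag part-of-speech baseline in Python:

```python
# A minimal sketch of two "shallow" steps: regex tokenization and a
# most-frequent-tag POS baseline trained on a toy tagged corpus.
import re
from collections import Counter, defaultdict

def tokenize(text: str) -> list[str]:
    """Split text into word and punctuation tokens with a regular expression."""
    return re.findall(r"\w+|[^\w\s]", text)

# Toy training data: (word, tag) pairs standing in for a treebank.
tagged = [("the", "DT"), ("dog", "NN"), ("barks", "VBZ"),
          ("the", "DT"), ("bark", "NN"), ("falls", "VBZ")]

counts = defaultdict(Counter)
for word, tag_label in tagged:
    counts[word][tag_label] += 1

def tag(tokens: list[str]) -> list[tuple[str, str]]:
    """Assign each token its most frequent training tag, defaulting to NN."""
    return [(t, counts[t].most_common(1)[0][0] if t in counts else "NN")
            for t in tokens]

print(tag(tokenize("the dog barks.")))
```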
Prerequisites:
- CSE 373: Data Structures & Algorithms or equivalent
- MATH/STAT 394: Probability I or equivalent
- Basic knowledge of formal grammars, formal languages, finite state automata
- Programming experience in Perl, C, C++, Java or Python
- Experience with basic UNIX/Linux commands
LING 571: Deep Processing Techniques for Natural Language Processing
Credits: 4
This course covers algorithms for associating deep or elaborated linguistic structures with naturally occurring linguistic data, looking at syntax, semantics and discourse. It also explores algorithms that produce natural language strings from input semantic representations.
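For a sense of what deep processing can involve, here is a minimal sketch (toy grammar, illustration only) of CKY recognition, one classic algorithm for assigning constituent structure under a context-free grammar in Chomsky normal form:

```python
# A minimal sketch of CKY recognition over a toy grammar in Chomsky normal form.
from itertools import product

# Binary rules map a pair of child categories to possible parent categories.
binary_rules = {
    ("NP", "VP"): {"S"},
    ("DT", "NN"): {"NP"},
    ("VBZ", "NP"): {"VP"},
}
# Lexical rules map words to their possible categories.
lexical_rules = {
    "the": {"DT"}, "dog": {"NN"}, "cat": {"NN"}, "sees": {"VBZ"},
}

def cky_recognize(tokens: list[str]) -> bool:
    """Return True if the toy grammar derives the token sequence from S."""
    n = len(tokens)
    # chart[i][j] holds the nonterminals that span tokens i..j-1.
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, tok in enumerate(tokens):
        chart[i][i + 1] = set(lexical_rules.get(tok, set()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for left, right in product(chart[i][k], chart[k][j]):
                    chart[i][j] |= binary_rules.get((left, right), set())
    return "S" in chart[0][n]

print(cky_recognize("the dog sees the cat".split()))  # True
print(cky_recognize("dog the sees".split()))          # False
```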
Prerequisites:
- CSE 373: Data Structures & Algorithms or equivalent
- MATH/STAT 394: Probability I or equivalent
- Basic knowledge of formal grammars, formal languages, finite state automata
- Programming experience in Perl, C, C++, Java or Python
- Experience with basic UNIX/Linux commands
LING 572: Advanced Statistical Methods in Natural Language Processing
Credits: 4
This course covers several important machine learning algorithms for natural language processing, including decision trees, k-nearest neighbors (kNN), Naive Bayes, transformation-based learning, support vector machines, maximum entropy and conditional random fields. You'll implement many of these algorithms and apply them to selected NLP tasks.
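As an illustration of one of the algorithms named above (using invented toy data and add-one smoothing as a simplifying assumption), here is a minimal Naive Bayes text classifier sketch in Python:

```python
# A minimal sketch of Naive Bayes classification on a toy sentiment task.
import math
from collections import Counter, defaultdict

# Invented training examples: (tokens, label).
train = [
    (["great", "fun", "film"], "pos"),
    (["dull", "boring", "film"], "neg"),
    (["great", "cast"], "pos"),
    (["boring", "plot"], "neg"),
]

label_counts = Counter(label for _, label in train)
word_counts = defaultdict(Counter)
vocab = set()
for tokens, label in train:
    word_counts[label].update(tokens)
    vocab.update(tokens)

def predict(tokens: list[str]) -> str:
    """Pick the label maximizing log P(label) + sum of log P(word | label)."""
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / len(train))
        total = sum(word_counts[label].values())
        for tok in tokens:
            # Add-one (Laplace) smoothing over the training vocabulary.
            score += math.log((word_counts[label][tok] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(predict(["great", "film"]))   # "pos"
print(predict(["boring", "film"]))  # "neg"
```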
Prerequisites: LING 570
LING 573: Natural Language Processing Systems & Applications
Credits: 4
This course looks at building coherent NLP systems designed to tackle practical applications. You'll work in groups to build a working end-to-end system for some practical application. The specific application addressed varies by year, but examples include: machine (aided) translation, speech interfaces, information retrieval/extraction, natural language query systems, dialogue systems, augmentative and alternative communication, computer-assisted language learning, language documentation/linguistic hypothesis testing, spelling and grammar checking, optical character recognition, handwriting recognition and software localization.
How to Enroll
Because some courses require prior knowledge and experience in the field, those interested in taking an individual course need to submit the following materials at least two weeks before classes begin:
- A completed application for single course enrollment
- A resume listing your work-related experience, plus a letter of application (250-word maximum) describing your relevant knowledge and experience and how you acquired it
- Copies of your transcripts from all relevant institutions (unofficial transcripts are acceptable)
Note that you can’t apply courses taken through single course enrollment toward a degree unless you’ve secured graduate nonmatriculated status first (see below).
For students meeting the prerequisites, single course enrollment is granted on a space-available basis only. Space in online and in-person sections is determined separately. UW Continuum College reserves the right to cancel classes with low enrollment.
Graduate Nonmatriculated Status
Graduate nonmatriculated, or GNM, is a student status at the UW designed to provide access to graduate courses for qualified post-bachelor’s degree students who are not currently seeking a graduate degree, but who may later want to apply those courses toward a degree. Although GNM status does not guarantee admission to the Master of Science in Computational Linguistics, you may apply up to 12 course credits toward degree requirements if you are later admitted. Learn more about GNM status.
To apply to be a GNM student, you must submit an online application through the Graduate School site during the regular program application period. The deadline for applying for GNM status is the end of the first week of classes. You should wait until you are approved for single course enrollment before applying for GNM status, and if accepted, you must enroll in a class that quarter to maintain your status.