Offered: Fall 2025 (current)
- Text Representation: distributional semantics, vector semantics, co-occurrence matrices, static and dynamic word embeddings
- Syntactic Parsing: dependency parsing, constituency parsing
- Probabilistic Sequence Learning: part-of-speech tagging, named entity recognition, Markov models, the Viterbi algorithm
- Recurrent Neural Networks: background, history, uni- and bi-directional LSTMs and GRUs
- Seq2seq Models and Attention Mechanisms: translation, cross- and self-attention
- Transformers: multi-headed attention, byte-pair encoding and other subword tokenizations, positional encoding, query-key-value analysis, BERT
- Large Language Models: background, neural language models, transformer-based LLMs, state space models
- Generative NLP: applications, ethical considerations
- Ethical NLP: bias, fairness, explainability, energy considerations
The core objectives of this course are to:
- Teach students advanced text representation techniques for real-life applications
- Familiarize them with syntactic structures and parsing techniques
- Introduce them to the latest generative NLP techniques
- Teach them about ethical concerns in the field
- Familiarize them with NLP-related academic reading and writing techniques
1. To Be Added
| # | Description | Weight |
|---|---|---|