COMP53815: Natural Language Processing

Type: Tied
Level: 5
Credits: 15
Availability: Available in 2025/2026
Module Cap:
Location: Durham
Department: Computer Science

Prerequisites

  • None

Corequisites

  • None

Excluded Combinations of Modules

  • None

Aims

  • To introduce the students to computational linguistics.
  • To introduce the students to statistical and neural language models.
  • To help students gain experience in using advanced techniques to solve natural language tasks such as text parsing, understanding, classification, translation, and generation.

Content

  • Text pre-processing
  • Feature extraction
  • Statistical language models (see the sketch after this list)
  • Neural language models
  • Neural word embeddings
  • Recurrent Neural Networks (RNNs) for NLP tasks
  • Advanced variations of RNNs
  • Convolutional Neural Networks (CNNs) for NLP tasks
  • Sequence-to-sequence architectures
  • Attention and self-attention mechanisms
  • Transformers
  • Pretrained transformer models, e.g., BERT and GPT
  • Multitask learning
  • NLP ethics and fairness
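
As an illustration of the kind of material covered under "Statistical language models" above, the following is a minimal sketch of a bigram language model with add-one (Laplace) smoothing. The toy corpus, names, and output are illustrative only and do not represent the module's actual coursework or code.

    # A minimal, illustrative bigram language model with add-one (Laplace)
    # smoothing. The toy corpus below is made up for illustration only.
    from collections import Counter

    corpus = [
        "natural language processing is fun",
        "language models assign probabilities to text",
        "neural language models learn word embeddings",
    ]

    # Tokenise and pad each sentence with start/end markers.
    sentences = [["<s>"] + line.split() + ["</s>"] for line in corpus]

    unigram_counts = Counter(tok for sent in sentences for tok in sent)
    bigram_counts = Counter(
        (sent[i], sent[i + 1]) for sent in sentences for i in range(len(sent) - 1)
    )
    vocab_size = len(unigram_counts)

    def bigram_prob(prev, word):
        # P(word | prev) with add-one smoothing, so unseen bigrams still
        # receive a small non-zero probability.
        return (bigram_counts[(prev, word)] + 1) / (unigram_counts[prev] + vocab_size)

    def sentence_prob(text):
        # Score a sentence as the product of its bigram probabilities.
        tokens = ["<s>"] + text.split() + ["</s>"]
        prob = 1.0
        for prev, word in zip(tokens, tokens[1:]):
            prob *= bigram_prob(prev, word)
        return prob

    print(sentence_prob("language models are fun"))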

Learning Outcomes

Subject-specific Knowledge:

By the end of this module, students should be able to demonstrate:

  • an understanding of the fundamental concepts of language models.
  • an understanding of the mathematical basis of various deep-learning-based language models.
  • an understanding of the learning algorithms behind major NLP use cases, e.g., machine translation, multitask learning, text generation, and classification.
  • an understanding of the embedded bias in popular language models.

Subject-specific Skills:

By the end of this module, students should be able to demonstrate:

  • an ability to conduct independent research in the field of NLP.
  • an ability to handle textual data and extract representative features.
  • an ability to use state-of-the-art NLP techniques and models to solve real-world tasks.

Key Skills:

By the end of this module, students should be able to demonstrate:

  • an ability to design end-to-end solutions for real-world problems with textual input using state-of-the-art NLP techniques.
  • an ability to make informed decisions regarding neural architectures for NLP tasks.
  • an awareness of language models and their biases.

Modes of Teaching, Learning and Assessment and how these contribute to the learning outcomes of the module

  • Lectures enable students to learn new material on NLP concepts, word embeddings, and language models, as well as their applications.
  • Computer classes enable students to put the material from lectures into practice and strengthen their understanding through application.
  • The summative assignment assesses the application of methods and techniques as well as the understanding of core concepts. It consists of a coding exercise with an accompanying report.

Teaching Methods and Learning Hours

Activity | Number | Frequency | Duration | Total | Monitored
Lectures | 8 | 1 per week | 2 hours | 16 |
Lectures | 8 | 1 per week | 1 hour | 8 |
Computer Classes | 4 | 1 every other week (weeks 2, 4, 6, and 8) | 2 hours | 8 |
Preparation and Reading | | | | 118 |
Total | | | | 150 |

Summative Assessment

Component: Coursework | Component Weighting: 100%
Element | Length / Duration | Element Weighting | Resit Opportunity
Assignment | | 100 |

Formative Assessment

Via computer classes.
