Lab 0 - Understanding LLMs + Labs Overview
Lab 1 - Working with Text
Lab 2 - Attention Mechanisms
Lab 3 - Implementing a GPT Model from Scratch
Lab 4 - Pretraining on Unlabeled Data
Lab 5 - Finetuning for Text Classification
Lab 6 - Finetuning to Follow Instructions
Useful resources:
- The Annotated Transformer: a line-by-line implementation of the "Attention Is All You Need" paper.
- The Illustrated Transformer: Jay Alammar's visual walkthrough of the Transformer architecture.
- Let's build GPT: Andrej Karpathy's video lecture on building a GPT from scratch, in code.
- Stanford CS25: V2 I Introduction to Transformers: Stanford's lecture series on Transformer architectures.
These labs are based on Sebastian Raschka's book *Build a Large Language Model (From Scratch)* and its companion repository: https://github.com/rasbt/LLMs-from-scratch.