
# Building a Large Language Model From Scratch

This repository contains the code for developing, pretraining, and finetuning a GPT-like LLM.

You'll learn how large language models (LLMs) work from the inside out by coding them from the ground up, step by step. I'll guide you through creating your own LLM, explaining each stage with clear text, diagrams, and examples.

The method described here for developing and training your own small but functional model for educational purposes mirrors the approach used to create large-scale foundation models such as those behind ChatGPT. In addition, this repository includes code for loading the weights of larger pretrained models for finetuning.
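To give a sense of what loading pretrained weights involves, here is a minimal, illustrative sketch using the Hugging Face `transformers` library; the repository's own weight-loading utilities may be organized differently:

```python
# Illustrative only: one common way to fetch pretrained GPT-2 (124M) weights
# is via the Hugging Face `transformers` library; this repository's own
# loading code may differ.
from transformers import GPT2LMHeadModel

model = GPT2LMHeadModel.from_pretrained("gpt2")    # downloads and caches the weights
print(sum(p.numel() for p in model.parameters()))  # total parameter count
```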

To download a copy of this repository, run the following command in your terminal:

```bash
git clone --depth 1 https://github.com/Sangwan70/Building-an-LLM-From-Scratch.git
```
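The `--depth 1` flag performs a shallow clone that downloads only the latest commit rather than the full history, which keeps the download small; omit it if you want the complete commit history.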

## Table of Contents

> [!TIP]
> If you need guidance on installing Python and Python packages and setting up your code environment, see the README.md file in the setup directory.

| Part Title | Main Code (for Quick Access) | All Code + Supplementary |
|------------|------------------------------|--------------------------|
| Setup recommendations | - | - |
| Part 1: Working with Text Data | - Part 1: Working with Text<br>- dataloader.ipynb (summary)<br>- additional_examples.ipynb | ./part_1 |
| Part 2: Coding Attention Mechanisms | - Part 2: Coding Attention Mechanisms<br>- multihead-attention.ipynb (summary)<br>- additional_examples.ipynb | ./part_2 |
| Part 3: Implementing a GPT Model from Scratch | - Part 3: Implementing a GPT Model from Scratch<br>- gpt.py (summary)<br>- additional_examples.ipynb | ./part_3 |
| Part 4: Pretraining on Unlabeled Data | - Part 4: Pretraining on Unlabeled Data<br>- gpt_train.py (summary)<br>- gpt_generate.py (summary)<br>- additional_examples.ipynb | ./part_4 |
| Part 5: Finetuning for Text Classification | - Part 5: Finetuning for Text Classification<br>- gpt_class_finetune.py<br>- additional_examples.ipynb | ./part_5 |
| Part 6: Finetuning to Follow Instructions | - Part 6: Finetuning to Follow Instructions<br>- gpt_instruction_finetuning.py (summary)<br>- ollama_evaluate.py (summary)<br>- example-solutions.ipynb | ./part_6 |
| Appendix A: Introduction to PyTorch | - code-part1.ipynb<br>- code-part2.ipynb<br>- DDP-script.py<br>- additional_examples.ipynb | ./appendix-A |
| Appendix B: Adding Bells and Whistles to the Training Loop | - appendix-B.ipynb | ./appendix-B |
| Appendix C: Parameter-efficient Finetuning with LoRA | - appendix-C.ipynb | ./appendix-C |

## Hardware Requirements

The code in the main parts of this course is designed to run on conventional laptops within a reasonable timeframe and does not require specialized hardware. Additionally, the code automatically utilizes GPUs if they are available. (Please see the setup doc for additional recommendations.)
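In practice, "automatically utilizes GPUs" usually means the standard PyTorch device-selection pattern sketched below. This is a minimal illustration (the toy `nn.Linear` layer is only a placeholder); the exact device-handling code in the notebooks may differ:

```python
import torch
import torch.nn as nn

# Use a CUDA GPU when one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(4, 2).to(device)    # move the layer's parameters to the device
x = torch.randn(1, 4, device=device)  # create the input on the same device
print(model(x))
```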

## Additional Material

Several folders in this repository contain additional materials for interested readers.

## Questions, Feedback, and Contributing to This Repository

I welcome all kinds of feedback via GitHub Discussions. Likewise, if you have questions or just want to bounce ideas off others, please don't hesitate to post them in the forum as well.

 

## Citation

BibTeX entry:

```bibtex
@misc{sangwan-building-an-llm-from-scratch,
  author       = {Ram N Sangwan},
  title        = {Building An LLM From Scratch},
  github       = {https://github.com/Sangwan70/Building-an-LLM-From-Scratch}
}
```