yonas-g/Transformer
Transformer

Minimal PyTorch implementation of the Transformer architecture from Vaswani et al.'s 2017 paper, "Attention Is All You Need". This repository serves as a deep dive into the architecture and its nuances, covering both design decisions and training techniques.
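At the core of the architecture is scaled dot-product attention: queries are matched against keys, the scores are scaled by the square root of the key dimension, softmaxed, and used to weight the values. A minimal NumPy sketch of that computation (the repository itself is in PyTorch; these function names are illustrative, not the repo's API):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q: (..., seq_q, d_k), K: (..., seq_k, d_k), V: (..., seq_k, d_v).
    mask, if given, is a boolean array broadcastable to the score shape;
    False positions are blocked (e.g. future tokens in the decoder).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)
    weights = softmax(scores, axis=-1)
    return weights @ V, weights
```

In the decoder's self-attention, `mask` would be a lower-triangular matrix so each position can only attend to itself and earlier positions.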

Project Structure

  • Tokenizer.py
  • Modules/
    • Attention.py
      • MultiHeadAttentionBase
      • MultiHeadSelfAttention
      • MultiHeadCrossAttention
    • AddNorm.py
    • MLP.py
    • Encoder.py
      • EncoderLayer
      • Encoder
    • Decoder.py
      • DecoderLayer
      • Decoder
    • Transformer.py
      • Transformer
  • Config/
    • Config.py
  • BERT/
    • BERT.py
      • Bert
