Welcome to the TinyML Repository! This collection of notebooks provides insights and hands-on experience in deploying machine learning models on small, low-power devices, such as microcontrollers, sensors, and embedded systems. TinyML focuses on optimizing machine learning models to run efficiently on devices with limited computational resources.
TinyML refers to the deployment of machine learning models on small, energy-efficient devices. It enables real-time, on-device processing with minimal power consumption, making it ideal for battery-powered and edge computing applications. Because these devices have tight limits on memory, processing power, and energy, models must be optimized to fit within those constraints.
- Low Power Consumption: Optimized to run with minimal power, extending battery life.
- Limited Computational Resources: Simplified models fit within device constraints.
- Real-Time Inference: Models process data and make decisions in real-time.
- On-Device Processing: Models run directly on the device, reducing latency and improving privacy.
This repository contains a series of notebooks that explore various aspects of TinyML and modern machine learning techniques:
- Introduction_to_PyTorch.ipynb: Introduction to PyTorch, covering basics like tensors, automatic differentiation, and model training.
- Model_Training.ipynb: Training machine learning models, setting up datasets, and optimizing model performance.
- Transformer.ipynb: Overview of the Transformer architecture and its use in NLP tasks with self-attention mechanisms.
- Network_Pruning.ipynb: Techniques for reducing neural network size while maintaining performance for TinyML (see the pruning sketch after this list).
- Quantization.ipynb: Model quantization to reduce weight precision, making models more efficient for low-power devices.
- Dataset_Distillation.ipynb: Creating smaller, efficient datasets that maintain performance for TinyML applications.
- Scaling_Laws.ipynb: How model size, data size, and compute resources affect performance, and the resulting trade-offs for TinyML.
- Reinforcement_Learning_with_Human_Feedback_RLHF.ipynb: Using human feedback in reinforcement learning to improve TinyML applications.
- Diffusion_Models.ipynb: Introduction to diffusion models and their potential in generative tasks like image synthesis.
- Diffusion_Advanced.ipynb: Advanced concepts in diffusion models, including optimization and real-world applications.
- Mixture_of_Experts_MoE.ipynb: Exploring the Mixture of Experts (MoE) model, which uses multiple "expert" sub-models for scalability (see the gating sketch after this list).
- LoRA.ipynb: Low-Rank Adaptation (LoRA), a technique for fine-tuning pre-trained models with far fewer trainable parameters (see the LoRA sketch after this list).
- Evolutionary Algorithm: A deep dive into evolutionary algorithms, which optimize machine learning models by mimicking natural selection.
- Binary_Convolution.ipynb: Exploring binary convolutional networks, which reduce model size by using binary weights (see the binarization sketch after this list).
- Logical_XNOR_Network.ipynb: A discussion of XNOR networks, which use logical operations for efficient neural network design.
- Symmetric_&_Assymetric_Quantization.ipynb: An exploration of symmetric and asymmetric quantization techniques for optimizing model performance in TinyML (see the quantization sketch after this list).
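As a companion to Network_Pruning.ipynb, here is a minimal magnitude-pruning sketch using PyTorch's built-in torch.nn.utils.prune utilities; the layer size and the 50% sparsity target are arbitrary choices for illustration, not taken from the notebook.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(128, 64)

# Zero out the 50% of weights with the smallest L1 magnitude.
prune.l1_unstructured(layer, name="weight", amount=0.5)

# Make the pruning permanent (drops the mask and the saved original weights).
prune.remove(layer, "weight")

sparsity = (layer.weight == 0).float().mean().item()
print(f"Weight sparsity: {sparsity:.0%}")  # ~50%
```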
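For Mixture_of_Experts_MoE.ipynb, the sketch below shows the core routing idea with a hypothetical TinyMoE module (not the notebook's code): a learned router scores the experts, and only the top-1 expert runs for each input.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    def __init__(self, dim=32, num_experts=4):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):                      # x: (batch, dim)
        gate = F.softmax(self.router(x), dim=-1)
        top_w, top_idx = gate.max(dim=-1)      # pick the top-1 expert per example
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = top_idx == i
            if mask.any():                     # only the selected expert runs
                out[mask] = top_w[mask].unsqueeze(1) * expert(x[mask])
        return out

moe = TinyMoE()
print(moe(torch.randn(6, 32)).shape)  # torch.Size([6, 32])
```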
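For LoRA.ipynb, this is a minimal sketch of the low-rank adapter idea using a hypothetical LoRALinear wrapper (the notebook's implementation may differ): the pre-trained weights are frozen and only a rank-r update is trained.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 4, alpha: float = 8.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():          # freeze the pre-trained weights
            p.requires_grad = False
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x):
        # Frozen path plus low-rank correction; only lora_A and lora_B are trained.
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling

layer = LoRALinear(nn.Linear(256, 256), r=4)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(f"Trainable parameters: {trainable}")  # 2048 vs. 65,792 in the base layer
```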
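For Binary_Convolution.ipynb and Logical_XNOR_Network.ipynb, this sketch shows weight binarization with a scaling factor (as in XNOR-Net) and how a dot product of ±1 vectors reduces to XNOR plus a popcount; the helper names are illustrative, not the notebooks' code.

```python
import torch

def binarize(w: torch.Tensor) -> torch.Tensor:
    # Keep a per-tensor scaling factor (mean |w|) so the binary weights
    # approximate the real-valued ones.
    alpha = w.abs().mean()
    return torch.sign(w) * alpha

w_bin = binarize(torch.randn(4, 8))  # entries are ±alpha

# For {-1, +1} vectors, dot(a, b) = 2 * popcount(XNOR(bits(a), bits(b))) - n.
a = torch.tensor([1, -1, 1, 1, -1])
b = torch.tensor([1, 1, -1, 1, -1])
xnor = ~((a > 0) ^ (b > 0))               # True where the signs agree
dot_via_xnor = 2 * xnor.sum().item() - len(a)
print(dot_via_xnor, torch.dot(a.float(), b.float()).item())  # both equal 1
```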
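For Quantization.ipynb and Symmetric_&_Assymetric_Quantization.ipynb, here is a minimal int8 quantize/dequantize sketch contrasting the symmetric scheme (scale only) with the asymmetric scheme (scale plus zero-point); the functions are illustrative, not the notebooks' code.

```python
import torch

def quantize_symmetric(x: torch.Tensor):
    # Symmetric: range is [-max|x|, +max|x|], zero-point is implicitly 0.
    scale = x.abs().max() / 127.0
    q = torch.clamp(torch.round(x / scale), -127, 127).to(torch.int8)
    return q, scale

def quantize_asymmetric(x: torch.Tensor):
    # Asymmetric: the full [min, max] range maps onto [0, 255] via a zero-point.
    scale = (x.max() - x.min()) / 255.0
    zero_point = torch.round(-x.min() / scale).to(torch.int32)
    q = torch.clamp(torch.round(x / scale) + zero_point, 0, 255).to(torch.uint8)
    return q, scale, zero_point

x = torch.randn(4, 4)
q_sym, s_sym = quantize_symmetric(x)
q_asym, s_asym, zp = quantize_asymmetric(x)

# Dequantize and compare reconstruction error.
x_sym = q_sym.float() * s_sym
x_asym = (q_asym.float() - zp) * s_asym
print("symmetric error: ", (x - x_sym).abs().max().item())
print("asymmetric error:", (x - x_asym).abs().max().item())
```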
To use the notebooks in this repository, you will need to install the following dependencies:
```bash
pip install torch torchvision torchaudio
pip install numpy pandas matplotlib
pip install scikit-learn
pip install tqdm
```
Check individual notebooks for any additional dependencies.
Clone this repository to your local machine:
```bash
git clone https://github.com/Razaimam45/TinyML-Notebooks-by-Raza.git
```
Navigate to the directory and open the notebooks using Jupyter or any compatible viewer:
```bash
cd TinyML-Notebooks-by-Raza
jupyter notebook
```
Feel free to contribute by opening issues, submitting pull requests, or suggesting improvements. All contributions are welcome!
This repository is licensed under the MIT License. See the LICENSE file for more information.
We would like to acknowledge the TinyML and LLMs (ML819) course taught by Zhiqian Shen at MBZUAI, whose valuable collection of materials led to the creation of this repository.
Enjoy exploring TinyML and happy learning!