
fauxgrad

With awesome options like micrograd or tinygrad out there, why not write another small autodiff engine? ¯\_(ツ)_/¯


There are plenty of excellent (tinygrad) and minimalist (micrograd) built-from-scratch deep learning frameworks out there, so the goal of fauxgrad is to sacrifice some functionality and focus on the general ideas and building blocks you need to write your own.
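To give a feel for what those building blocks are, here is a minimal sketch of a micrograd-style scalar Value that records its parents and a local backward rule, and whose backward() walks the graph in reverse topological order. This is an illustrative sketch under those assumptions, not fauxgrad's actual implementation; the notebook below builds the real thing step by step.

class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None   # local rule: push self.grad onto parents

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad    # d(out)/d(self) = other.data
            other.grad += self.data * out.grad    # d(out)/d(other) = self.data
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the graph, then apply each local rule in reverse
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

Every other op (addition, log, sigmoid, ...) follows the same pattern: compute the forward value, record the parents, and define how gradients flow back.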

The walkthrough/tutorial can be found in this notebook.

Installation

pip install fauxgrad

Examples

Calculating some gradients:

from fauxgrad import Value

a = Value(2.3)              # wrap plain numbers so operations on them are tracked
b = Value(-1)
c = (-a * b).log()
l = -(c.sigmoid() + b) + a
l.backward()                # backpropagate from l through the recorded graph
print('dl/da:', a.grad)
>>> 0.91
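As a quick sanity check (a hand derivation, not something fauxgrad prints), the same number falls out of the chain rule: with b = -1 we get c = log(a) and l = a + 1 - sigmoid(log(a)), so dl/da = 1 - sigmoid(log a) * (1 - sigmoid(log a)) / a ≈ 0.91:

import math

a = 2.3
s = 1 / (1 + math.exp(-math.log(a)))   # sigmoid(log a)
print(round(1 - s * (1 - s) / a, 2))   # 0.91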

Plotting the backward pass graph:

from fauxgrad.utils import plot_graph
plot_graph(l) # green node is l, light blue nodes have no parents
