# fauxgrad

Open In Colab


There are plenty of excellent (tinygrad) and minimalist (micrograd) built-from-scratch deep learning frameworks out there, so fauxgrad's goal is to sacrifice some functionality and instead focus on the general ideas and building blocks you need to write your own.

The walkthrough/tutorial can be found in this notebook.
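To make "building blocks" concrete, here is a minimal sketch of the central idea: a scalar value that records how it was computed, so the chain rule can run backwards over the resulting graph. This is illustrative only; the names (`TinyValue`, `_parents`, `_backward`) are made up for the sketch and are not fauxgrad's actual implementation.

```python
class TinyValue:
    """Minimal scalar autograd node (illustrative sketch, not fauxgrad's code)."""
    def __init__(self, data, parents=(), backward_fn=lambda: None):
        self.data = data          # forward value
        self.grad = 0.0           # accumulated dL/d(self)
        self._parents = parents   # nodes this value was computed from
        self._backward = backward_fn

    def __mul__(self, other):
        out = TinyValue(self.data * other.data, parents=(self, other))
        def _backward():
            # chain rule: d(out)/d(self) = other.data, d(out)/d(other) = self.data
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological order, so each node's grad is complete before its parents run
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x, y = TinyValue(2.0), TinyValue(3.0)
z = x * y
z.backward()
print(x.grad, y.grad)  # 3.0 2.0
```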

## Installation

```
pip install fauxgrad
```

## Examples

Calculating some gradients:

```python
from fauxgrad import Value

a = Value(2.3)
b = Value(-1)
c = (-a * b).log()
l = -(c.sigmoid() + b) + a
l.backward()

print('The derivative that we computed before, dl/da:', a.grad)
# >>> 0.91
```
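As a sanity check (plain Python, not part of fauxgrad), a central finite-difference estimate of dl/da gives the same number:

```python
import math

def loss(a, b=-1.0):
    # the same computation as above, on plain floats
    c = math.log(-a * b)
    sig = 1 / (1 + math.exp(-c))
    return -(sig + b) + a

eps = 1e-6
numeric = (loss(2.3 + eps) - loss(2.3 - eps)) / (2 * eps)
print('finite-difference dl/da:', round(numeric, 2))
# >>> 0.91
```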

Plotting the backward pass graph:

```python
from fauxgrad.utils import plot_graph

plot_graph(l)  # green node is l, light blue nodes have no parents
```
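A slightly larger expression works the same way, reusing only the API shown above (the variable names here are just for illustration):

```python
from fauxgrad import Value
from fauxgrad.utils import plot_graph

x = Value(0.5)
y = Value(2.0)
z = (x * y + x).sigmoid()
z.backward()
plot_graph(z)  # z is the green node; the leaves x and y have no parents
```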