A little experiment with the simplest possible network, visualizing gradient descent
I was inspired by James Loy's NN tutorial and adapted it into an even simpler network with a single node that performs plain linear regression. The point, for me, was to visualize the loss surface over the node's weight and bias and to plot the model's learning steps across that surface.
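A minimal sketch of the idea: a one-node model `y_hat = w * x + b` trained with batch gradient descent on mean squared error, recording the `(w, b, loss)` trajectory that could then be plotted over the loss surface. The data and hyperparameters here are hypothetical, not from the original experiment.

```python
# Toy data lying exactly on y = 2x + 1 (hypothetical example data)
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]

def mse(w, b):
    """Mean squared error of the one-node model on the toy data."""
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

w, b, lr = 0.0, 0.0, 0.05
trajectory = [(w, b, mse(w, b))]  # learning steps, one (w, b, loss) per epoch

for _ in range(500):
    n = len(xs)
    # Analytic gradients of the MSE with respect to w and b
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * dw
    b -= lr * db
    trajectory.append((w, b, mse(w, b)))

print(round(w, 2), round(b, 2))  # approaches the true parameters 2.0 and 1.0
```

Each tuple in `trajectory` is one point on the loss surface, so scattering them over a contour plot of `mse(w, b)` shows the descent path.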