Gradient descent optimizer should find optimal coefficient values
The formula used for gradient descent (a code sketch follows the definitions below):
c_i = c_{i-1} - η * g
where:
- i - iteration number
- c - vector of coefficients
- η - learning rate
- g - gradient vector
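A minimal sketch of this update rule in Python. The function name `update_coefficients` and the plain-list vector representation are illustrative assumptions, not taken from the repository:

```python
def update_coefficients(prev_coeffs, gradient, learning_rate):
    """Apply one gradient descent step: c_i = c_{i-1} - η * g.

    prev_coeffs   -- coefficient vector from the previous iteration (c')
    gradient      -- precalculated gradient vector (g)
    learning_rate -- step size (η)
    """
    # Element-wise update: each coefficient moves against its gradient component.
    return [c - learning_rate * g for c, g in zip(prev_coeffs, gradient)]
```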
Iteration 1:
Precalculated gradient vector, g: (8, 8, 8)
Previously calculated coefficient vector, c': (0, 0, 0)
Learning rate, η: 2.0
c1 = c1' - η * g1 = 0 - 2 * 8 = -16
c2 = c2' - η * g2 = 0 - 2 * 8 = -16
c3 = c3' - η * g3 = 0 - 2 * 8 = -16
c = (-16, -16, -16)
Iteration 2:
Precalculated gradient vector, g: (8, 8, 8)
Previously calculated coefficient vector, c': (-16, -16, -16)
Learning rate, η: 2.0
c1 = c1' - η * g1 = -16 - 2 * 8 = -32
c2 = c2' - η * g2 = -16 - 2 * 8 = -32
c3 = c3' - η * g3 = -16 - 2 * 8 = -32
c = (-32, -32, -32)
Iteration 3:
Precalculated gradient vector, g: (8, 8, 8)
Previously calculated coefficient vector, c': (-32, -32, -32)
Learning rate, η: 2.0
c1 = c1' - η * g1 = -32 - 2 * 8 = -48
c2 = c2' - η * g2 = -32 - 2 * 8 = -48
c3 = c3' - η * g3 = -32 - 2 * 8 = -48
c = (-48, -48, -48)
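Reusing the `update_coefficients` sketch above, running three steps reproduces the three coefficient vectors worked out by hand (the gradient is held constant here because the example precalculates the same g for every iteration):

```python
coeffs = [0.0, 0.0, 0.0]        # starting coefficient vector c'
gradient = [8.0, 8.0, 8.0]      # precalculated gradient vector g
learning_rate = 2.0             # learning rate η

for i in range(1, 4):
    coeffs = update_coefficients(coeffs, gradient, learning_rate)
    print(f"iteration {i}: c = {coeffs}")

# iteration 1: c = [-16.0, -16.0, -16.0]
# iteration 2: c = [-32.0, -32.0, -32.0]
# iteration 3: c = [-48.0, -48.0, -48.0]
```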