
Listing 3.5: weights are not reinitialized, reg_lambda is ignored? #14

Open
mehaase opened this issue Jul 29, 2021 · 0 comments
mehaase commented Jul 29, 2021

In listing 3.5, I think the weights are not reinitialized before each run. If I add a line to print out model weights, we can see that the model weights are non-zero after the first time through the loop.

sess = tf.Session()
init = tf.global_variables_initializer()
sess.run(init)

for reg_lambda in np.linspace(0, 1, 100):
    w_val = sess.run(w); print(w_val) # <----------- ADDED THIS -------------
    for epoch in range(training_epochs):
        sess.run(train_op, feed_dict={X:x_train, Y:y_train})
    final_cost = sess.run(cost, feed_dict={X:x_test, Y:y_test})
    print('reg lambda', reg_lambda)
    print('final cost', final_cost)

sess.close()

This prints out:

[0. 0. 0. 0. 0. 0. 0. 0. 0.]
reg lambda 0.0
final cost 0.081058994
[ 2.5665667e-03  9.3173017e-05  1.7256942e-03 -1.4493844e-05
  1.2718090e-03 -5.8588204e-05  1.0161836e-03 -7.8277393e-05
  8.5132103e-04]
reg lambda 0.010101010101010102
final cost 0.0801239
[ 5.0976453e-03  1.8751422e-04  3.4367563e-03 -2.7644883e-05
  2.5339315e-03 -1.1590221e-04  2.0250683e-03 -1.5540000e-04
  1.6967654e-03]
reg lambda 0.020202020202020204
final cost 0.07920629
[ 7.5936671e-03  2.8300245e-04  5.1333504e-03 -3.9473085e-05
  3.7864731e-03 -1.7195975e-04  3.0267318e-03 -2.3138340e-04
  2.5363949e-03]
reg lambda 0.030303030303030304
final cost 0.0783058
[ 1.0055063e-02  3.7961689e-04  6.8156384e-03 -4.9998103e-05
  5.0295377e-03 -2.2677830e-04  4.0212511e-03 -3.0624296e-04
  3.3702708e-03]
...SNIP...

I think the correct behavior is to reinitialize the weights for each value of lambda. I also captured the cost for each lambda.

lambdas = []; costs = []                   # <-------- ADD THIS --------

for reg_lambda in np.linspace(0, 1, 100):
    sess.run(init)                         # <-------- ADD THIS --------
    for epoch in range(training_epochs):
        sess.run(train_op, feed_dict={X:x_train, Y:y_train})
    final_cost = sess.run(cost, feed_dict={X:x_test, Y:y_test})
    print('reg lambda', reg_lambda)
    print('final cost', final_cost)
    lambdas.append(reg_lambda); costs.append(final_cost) # <-------- ADD THIS --------

sess.close()

Now if I plot lambdas vs. costs, it's a horizontal line.

[attached plot: reg_lambda vs. final cost, a flat horizontal line]

It seems that changing reg_lambda has no effect at all, but that is hard to see when the weights are carried over from one iteration to the next.
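For contrast, here is a minimal stand-in (pure NumPy, not the book's TensorFlow listing; all names and data below are made up for illustration) showing what the lambda-vs-cost curve should look like when the L2 penalty is actually wired to the loop variable: the held-out cost varies with lambda instead of staying flat.

```python
import numpy as np

# Hypothetical stand-in for listing 3.5: closed-form ridge regression on
# synthetic polynomial data. Every value of lam gets fresh weights, and the
# penalty term really depends on lam, so the test cost should vary.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 101)
y = np.sin(np.pi * x) + rng.normal(0.0, 0.3, x.shape)  # noisy targets

X = np.vander(x, 9, increasing=True)  # degree-8 polynomial features
X_train, y_train = X[:70], y[:70]
X_test, y_test = X[70:], y[70:]

def fit_ridge(lam):
    # w = (X'X + lam*I)^{-1} X'y -- weights re-derived from scratch per lam
    A = X_train.T @ X_train + lam * np.eye(X.shape[1])
    return np.linalg.solve(A, X_train.T @ y_train)

costs = []
for lam in np.linspace(0, 1, 100):
    w = fit_ridge(lam)
    costs.append(np.mean((X_test @ w - y_test) ** 2))

# Unlike the flat line in the issue, the cost curve moves as lam changes.
print(max(costs) > min(costs))
```

A plausible explanation for the flat line (an assumption on my part, since the listing itself isn't reproduced here) is that reg_lambda is an ordinary Python float read once when the cost op is constructed, so reassigning it inside the loop never changes the graph; feeding it through a tf.placeholder in feed_dict would make the penalty respond to the loop variable.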
