Describe the bug
From my experiments it seems that the sign of Ranger's update step is inverted: all the other optimizers (including Ranger21) take steps in the opposite direction of Ranger's.
Note that I'm testing context-free step directions/magnitudes using a 'perfect' gradient (scaled by 4), so if Ranger somehow reverses course when gradients from different directions are accumulated, my test would miss that.
Hyperparameters: {'betas': (0.003344506587403595, 0.9685357345548955), 'lr': 0.4616639698903086} (found through a hyperparameter search, also done for the other optimizers), evaluated on the Ackley (dim=2) function.
(I didn't want to create a PR before discussing whether this might be intended.)
To Reproduce
OS: Linux
PyTorch version: 2
Python version: 3.11
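For context, here is a minimal, self-contained sketch of the kind of step-direction check described above, using a hand-written Ackley function and a plain gradient-descent step as the SGD baseline. The learning rate and starting point are illustrative, not the searched hyperparameters, and the actual Ranger implementation is not included here:

```python
import math

def ackley(x, y):
    """Ackley function (dim=2); global minimum is f(0, 0) = 0."""
    return (-20.0 * math.exp(-0.2 * math.sqrt(0.5 * (x * x + y * y)))
            - math.exp(0.5 * (math.cos(2 * math.pi * x) + math.cos(2 * math.pi * y)))
            + math.e + 20.0)

def num_grad(f, x, y, eps=1e-6):
    """Central-difference gradient, standing in for the 'perfect' gradient."""
    gx = (f(x + eps, y) - f(x - eps, y)) / (2 * eps)
    gy = (f(x, y + eps) - f(x, y - eps)) / (2 * eps)
    return gx, gy

def sgd_step(x, y, lr=0.01):
    """Plain gradient-descent step (the SGD baseline); lr is illustrative."""
    gx, gy = num_grad(ackley, x, y)
    return x - lr * gx, y - lr * gy, gx, gy

# The direction check: a well-behaved optimizer's step should have a
# negative dot product with the gradient at the current point.
x0, y0 = 1.0, 1.0
x1, y1, gx, gy = sgd_step(x0, y0)
dot = (x1 - x0) * gx + (y1 - y0) * gy
print(dot < 0)  # prints True: SGD moves against the gradient
```

The report claims that Ranger's step, run through the same check, has a positive dot product with the gradient, i.e. the opposite sign of every other optimizer tested.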
Log
Ranger:
For comparison SGD: