Initialising optimisers with constant parameters. #61
I cannot find a way to set constant optimiser parameters together with Scheduler. For example, I would like to set up an AdamW optimiser with an exponentially decaying learning rate, but also prescribe a constant decay of 1e-2. I would expect this to work, but I get an error. I also tried wrapping the value in a schedule, λ = ParameterSchedulers.Constant(1e-3), which results in another error. Using the keyword λ or decay, or leaving out the argument names completely, also did not help.
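A minimal sketch of the kind of calls described above, assuming the keyword form Scheduler(constructor; kwargs...) and Optimisers.jl's AdamW; the weight-decay field name (lambda vs. decay) and the Exp arguments are illustrative guesses, not taken from the report:

```julia
using Optimisers, ParameterSchedulers
using ParameterSchedulers: Scheduler

# Exponentially decaying learning rate together with a fixed weight decay.
# At the time of the report this errors, because the Scheduler treats
# every keyword argument as a callable schedule:
opt = Scheduler(AdamW; eta = Exp(1e-3, 0.8), lambda = 1e-2)

# The workaround attempted in the report, wrapping the constant in a
# schedule, also errored:
opt = Scheduler(AdamW; eta = Exp(1e-3, 0.8),
                lambda = ParameterSchedulers.Constant(1e-2))
```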
Comments

So I think this works if we use the parameter names (ParameterSchedulers.jl/src/scheduler.jl, line 48 in 7b3c081) to return constructor(; kwargs...). I'd open a PR for this, but I am unsure whether it would break anything.

This is a bug and can be fixed in a patch.

Yeah, we can replace plain numbers with Constant. This is also not breaking and can be released in a patch. Feel free to submit a PR!
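Putting the two comments together, here is a sketch of the proposed fix, using a hypothetical stand-in type (SchedulerSketch, to_schedule, and materialize are illustrative names, not the package's actual internals): plain numbers are promoted to Constant schedules on construction, and the optimiser is rebuilt each step via constructor(; kwargs...), so constants and schedules can be mixed by parameter name.

```julia
using ParameterSchedulers: Constant

# Hypothetical stand-in for the package's Scheduler; field names and
# layout are illustrative only.
struct SchedulerSketch{F,T<:NamedTuple}
    constructor::F
    schedules::T
end

# Promote plain numbers to Constant schedules so every stored value is
# callable at a step t; anything else is assumed to be a schedule already.
to_schedule(x::Number) = Constant(x)
to_schedule(s) = s

# Keyword constructor: remember each schedule under its parameter name.
SchedulerSketch(constructor; kwargs...) =
    SchedulerSketch(constructor, map(to_schedule, (; kwargs...)))

# Rebuild the optimiser at step t: evaluate each schedule, then forward
# the results by parameter name, i.e. constructor(; kwargs...).
materialize(s::SchedulerSketch, t) =
    s.constructor(; map(sched -> sched(t), s.schedules)...)
```

With that in place, a call like SchedulerSketch(AdamW; eta = Exp(1e-3, 0.8), lambda = 1e-2) would evaluate Exp at each step while passing the constant through unchanged, again assuming AdamW accepts those keyword names.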