
JOSS review: questions regarding optimization #243

Open
fabian-sp opened this issue Jul 8, 2022 · 0 comments
fabian-sp commented Jul 8, 2022

This is related to the review of FunFact for JOSS (see openjournals/joss-reviews#4502).

I have a few questions regarding your optimization procedures.

  • If I understand correctly, I could pass any torch.optim optimizer as the opt argument, because they all expose a step method, correct? As I am not very familiar with JAX, does the same hold for JAX optimizers? Could you provide a short example of this in the docs? (At least, I could not find one.) A sketch of the interface I have in mind follows this list.
  • Related to the point above, I was wondering why you reimplemented Adam and RMSProp in https://github.com/yhtang/FunFact/blob/4e5694f7c9881223fcb41fcb21e49007586aa779/funfact/optim.py. They are already included in PyTorch, so reimplementing them seems counterintuitive to me. Even though these algorithms are fairly simple, the reimplementation may be an unnecessary source of errors. Is there a reason not to use, by default, Adam from torch or one of the JAX optimizers, depending on the active backend?
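
To make the first question concrete, here is a minimal, purely illustrative sketch of the optimizer interface I have in mind. This is not FunFact-specific code; it only shows that any torch.optim optimizer exposes zero_grad()/step(), which is the role I assume the opt argument plays.

```python
import torch

# Stand-in for the factor tensors that a factorization would optimize.
x = torch.randn(3, requires_grad=True)

# Any torch.optim optimizer (Adam, RMSprop, SGD, ...) follows the same interface.
opt = torch.optim.Adam([x], lr=0.1)

for _ in range(100):
    loss = (x ** 2).sum()   # stand-in for the factorization loss
    opt.zero_grad()         # clear accumulated gradients
    loss.backward()         # backpropagate
    opt.step()              # apply the Adam update
```

On the JAX side, optax offers the same algorithms through its init/update/apply_updates pattern, which is why a backend-dependent default seemed natural to me.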

Thank you in advance for your help!
