
Merge the latest loss functions from upstream #5

Open
sadamov opened this issue Jan 26, 2024 · 0 comments
Labels
enhancement New feature or request

Comments

sadamov (Collaborator) commented Jan 26, 2024

Joel mentioned that he added new loss functions and that the negative log likelihood loss performs better than MSE/MAE.

  1. Merge the latest commit from upstream into a feature_branch
  2. This will give you a new metrics.py file that contains many additional loss functions
  3. Start a training with loss = negative log likelihood for 40 epochs (validation every 20 epochs); a sketch of such a loss is given below
  4. Compare the training loss with previous training loss curves on wandb
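
For orientation, here is a minimal sketch of what a negative log likelihood loss of this kind could look like, assuming a Gaussian parametrization in which the model predicts a mean and a standard deviation per target variable. This is an illustration only; the actual loss functions are the ones in the upstream metrics.py and may differ in parametrization, weighting and masking.

```python
# Minimal sketch of a Gaussian negative log likelihood loss (illustration only;
# the real implementation lives in the upstream metrics.py and may differ).
import torch


def gaussian_nll(mean: torch.Tensor, std: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """NLL of `target` under N(mean, std^2), averaged over all elements."""
    var = std ** 2
    return (0.5 * (torch.log(2 * torch.pi * var) + (target - mean) ** 2 / var)).mean()


# Roughly equivalent built-in (omits the constant 0.5*log(2*pi) term by default):
# torch.nn.GaussianNLLLoss()(mean, target, std ** 2)
```

Unlike MSE/MAE, this loss scales the squared error by the predicted variance, so the model is also rewarded for calibrated uncertainty estimates.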
sadamov added the enhancement label on Jan 26, 2024