
[TODO] support sample weights when training #256

Closed
L-M-Sherlock opened this issue Dec 8, 2024 · 3 comments · Fixed by #260

Comments

@L-M-Sherlock (Member) commented Dec 8, 2024

It will make these possible:

  • give more weight to post-lapse reviews to improve the accuracy of post-lapse stability.
  • give more weight to recent reviews (time decay) to fit the learner's current memory pattern and review habits.

fsrs-optimizer already supports it: open-spaced-repetition/fsrs-optimizer#152
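The idea above can be sketched as a sample-weighted log loss where older reviews receive exponentially smaller weight. This is a minimal illustration, not the fsrs-optimizer implementation; the function names, the half-life parameter, and the exponential decay form are all assumptions for the sketch.

```python
import math

def time_decay_weights(ages_days, half_life_days=365.0):
    """Hypothetical time-decay weighting: a review that is
    `half_life_days` old counts half as much as one from today."""
    return [0.5 ** (age / half_life_days) for age in ages_days]

def weighted_log_loss(y_true, y_pred, weights):
    """Weighted binary cross-entropy, normalized by the total weight
    so the value stays on the same scale as the unweighted mean loss."""
    eps = 1e-7
    total = sum(weights)
    loss = 0.0
    for y, p, w in zip(y_true, y_pred, weights):
        p = min(max(p, eps), 1 - eps)  # clamp to avoid log(0)
        loss += -w * (y * math.log(p) + (1 - y) * math.log(1 - p))
    return loss / total
```

With uniform weights this reduces to the ordinary mean log loss, which makes it a drop-in generalization during training.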

@brishtibheja commented

This is probably going to improve things for end users, but when we update Anki and optimise, this could result in a worse log loss, in which case Anki will just retain the previous parameters instead of updating them. Wdyt about that?

@L-M-Sherlock (Member, Author) commented

> This is probably going to improve things for end users, but when we update Anki and optimise, this could result in a worse log loss, in which case Anki will just retain the previous parameters instead of updating them. Wdyt about that?

Maybe we need to apply the weights in the evaluation, too.

@Expertium (Contributor) commented

I've said this before: if we apply weights during both evaluation and training, then loss values will effectively become arbitrary, since they depend on the choice of weighting function.
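Expertium's concern can be demonstrated numerically: with identical predictions and labels, the weighted loss changes with the weighting scheme, so absolute values are no longer comparable across weighting choices. This is an illustrative sketch with made-up numbers, not data from any real optimizer run.

```python
import math

def weighted_log_loss(y_true, y_pred, weights):
    """Weighted binary cross-entropy, normalized by total weight."""
    eps = 1e-7
    total = sum(weights)
    loss = 0.0
    for y, p, w in zip(y_true, y_pred, weights):
        p = min(max(p, eps), 1 - eps)
        loss += -w * (y * math.log(p) + (1 - y) * math.log(1 - p))
    return loss / total

# Same predictions, two different weighting functions.
y_true = [1, 1, 0, 1]
y_pred = [0.9, 0.6, 0.3, 0.8]
uniform = weighted_log_loss(y_true, y_pred, [1.0, 1.0, 1.0, 1.0])
recency = weighted_log_loss(y_true, y_pred, [0.1, 0.2, 0.3, 2.0])
# `uniform` and `recency` differ even though the model's predictions
# are identical, so the loss value depends on the weighting function.
```

Comparing an old unweighted loss against a new weighted one (as Anki's keep-or-replace check would do) is therefore not apples to apples.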
