Improve dense layer #27

Open · 2 of 7 tasks
theolepage opened this issue Dec 28, 2020 · 2 comments
Labels: feature (New feature), refactor (Refactoring)

theolepage (Owner) commented Dec 28, 2020

  • Batch training with mini-batches (see the loss/batch sketch after this list)
  • Implement cross-entropy cost function (see the loss/batch sketch after this list)
  • Cost/loss average over batch for progress display?
  • Regularization
  • Better initialization of weights and biases
  • Learning rate decay (optimizers, custom function)
  • Adam optimizer (see the optimizer sketch after this list)
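
A minimal sketch of the first three tasks, assuming a NumPy-based setup; `cross_entropy` and `minibatches` are placeholder names for illustration, not the project's API:

```python
import numpy as np

def cross_entropy(y_pred, y_true, eps=1e-12):
    """Categorical cross-entropy averaged over the mini-batch.

    y_pred: (batch_size, n_classes) softmax outputs.
    y_true: (batch_size, n_classes) one-hot targets.
    """
    y_pred = np.clip(y_pred, eps, 1.0 - eps)   # avoid log(0)
    per_sample = -np.sum(y_true * np.log(y_pred), axis=1)
    return per_sample.mean()                   # batch average, handy for progress display

def minibatches(x, y, batch_size, rng=None):
    """Yield shuffled (x, y) mini-batches of size batch_size."""
    rng = rng or np.random.default_rng()
    idx = rng.permutation(len(x))
    for start in range(0, len(x), batch_size):
        sel = idx[start:start + batch_size]
        yield x[sel], y[sel]
```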
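
And a hedged sketch of the Adam update rule (Kingma & Ba, 2015) combined with a simple inverse-time learning rate decay; again, the class and parameter names are illustrative assumptions, not the dense layer's actual interface:

```python
import numpy as np

class Adam:
    """Adam update rule with optional inverse-time learning rate decay."""

    def __init__(self, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, decay=0.0):
        self.lr, self.beta1, self.beta2, self.eps, self.decay = lr, beta1, beta2, eps, decay
        self.m, self.v, self.t = None, None, 0

    def update(self, w, grad):
        if self.m is None:
            self.m = np.zeros_like(w)
            self.v = np.zeros_like(w)
        self.t += 1
        lr = self.lr / (1.0 + self.decay * self.t)   # simple learning rate decay
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.v = self.beta2 * self.v + (1 - self.beta2) * grad ** 2
        m_hat = self.m / (1 - self.beta1 ** self.t)  # bias correction
        v_hat = self.v / (1 - self.beta2 ** self.t)
        return w - lr * m_hat / (np.sqrt(v_hat) + self.eps)
```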
theolepage self-assigned this on Dec 30, 2020
theolepage added the feature (New feature) and refactor (Refactoring) labels on Jan 1, 2021
Cc618 commented Feb 4, 2021

Hi!

For the "Better initialization of weights and biases" task, can I implement Xavier initialization with an initializer class?
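
A minimal sketch of what such an initializer class could look like, assuming NumPy; the class name and the way a dense layer would consume it are illustrative, not the project's design:

```python
import numpy as np

class XavierUniform:
    """Glorot/Xavier uniform initializer:
    W ~ U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out))."""

    def __call__(self, fan_in, fan_out, rng=None):
        rng = rng or np.random.default_rng()
        limit = np.sqrt(6.0 / (fan_in + fan_out))
        return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# A dense layer could then accept any such initializer object, e.g.:
# weights = XavierUniform()(n_inputs, n_units)
```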

theolepage (Owner, Author) commented Feb 6, 2021

Hey Celian.
Yes, sure, feel free to create a pull request based on develop, as this feature was planned.
Please note that the project is being fully refactored at the moment, and I am not sure how much time I will be able to dedicate to development.
