Doc/Adding basic usage example in each model docstring #1956
Conversation
Codecov Report: Patch has no changes to coverable lines.
Very nice, thanks a lot @madtoinou 🚀
Added a couple of suggestions, mainly to follow a common format
Some more points (this time with N for New ;) ):
- N1: I'd try to make it more obvious that the usage of future covariates is optional, not the encoding itself (here and everywhere else where covariates are used). E.g.
>>> # optionally, encode the holidays as a future covariates
to
>>> # optionally, use some future covariates; e.g. the value of the month encoded as a sine and cosine series
- N2: the air passengers series has a monthly frequency, so adding holidays as covariates is not the best choice for this frequency (here and in the other examples).
Maybe we could use the value of the month instead? E.g. (see also the fuller sketch after these points):
from darts.utils.timeseries_generation import datetime_attribute_timeseries
future_cov = datetime_attribute_timeseries(series, "month", cyclic=True, add_length=6)
- N3: the indentation should be 4 whitespaces and the last ")" should be at 0 indentation.
E.g.
# right now it looks like this:
>>> model = BlockRNNModel(
>>> input_chunk_length=12,
>>> output_chunk_length= 6,
>>> n_rnn_layers=2,
>>> n_epochs=50,
>>> )
# it should look like this
>>> model = BlockRNNModel(
>>>     input_chunk_length=12,
>>>     output_chunk_length=6,
>>>     n_rnn_layers=2,
>>>     n_epochs=50,
>>> )
- N4: the predicted values of the torch models are mostly really bad. We should explain that these are simple usage examples, and that for better performance users should transform the input data, train for more epochs, use a validation set, etc. Ideally we could reference a notebook or something.
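As a sketch of what N2 could look like end to end (RNNModel is used here only as an example of a model that accepts future covariates, and the hyper-parameters are placeholders):

from darts.datasets import AirPassengersDataset
from darts.models import RNNModel
from darts.utils.timeseries_generation import datetime_attribute_timeseries

series = AirPassengersDataset().load()
# optionally, use some future covariates; e.g. the value of the month encoded as a
# sine and cosine series, extended 6 steps past the end of the series so it also
# covers the forecast horizon
future_cov = datetime_attribute_timeseries(series, "month", cyclic=True, add_length=6)
model = RNNModel(
    model="LSTM",
    input_chunk_length=12,
    n_epochs=50,
)
model.fit(series, future_covariates=future_cov)
pred = model.predict(6, future_covariates=future_cov)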
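And for N4, a minimal sketch of the kind of preprocessing hint we could point users to (the scaler and epoch count below are only illustrative, not values to hard-code into the docstrings):

from darts.dataprocessing.transformers import Scaler
from darts.datasets import AirPassengersDataset
from darts.models import NBEATSModel

series = AirPassengersDataset().load()
# scaling the input usually helps the torch models quite a bit
scaler = Scaler()
series_scaled = scaler.fit_transform(series)
model = NBEATSModel(
    input_chunk_length=12,
    output_chunk_length=6,
    n_epochs=100,  # more epochs than in the short docstring examples
)
model.fit(series_scaled)
# map the forecast back to the original scale
pred = scaler.inverse_transform(model.predict(6))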
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
We have missing models:
sf_auto_arima.py
sf_auto_ces.py
sf_auto_ets.py
sf_auto_theta.py
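For reference, a sketch of the kind of usage example sf_auto_arima.py could get, assuming the common format above (season_length=12 is just a placeholder matching the monthly air passengers data):

from darts.datasets import AirPassengersDataset
from darts.models import StatsForecastAutoARIMA

series = AirPassengersDataset().load()
model = StatsForecastAutoARIMA(season_length=12)
model.fit(series)
pred = model.predict(6)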
There were already simple examples for these models; I did not change them, but I could, so that everything is homogeneous.
Looks great now, thanks @madtoinou
I updated the statsforecast model examples and some leftover lambda references in some models' add_encoders descriptions.
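For context, a sketch of what those add_encoders descriptions document, without lambdas (the model and encoder choice below are only illustrative):

from darts.models import NBEATSModel

# the encoders are generated on the fly by the model itself;
# here, a cyclic sine/cosine encoding of the month as past covariates
model = NBEATSModel(
    input_chunk_length=12,
    output_chunk_length=6,
    add_encoders={"cyclic": {"past": ["month"]}},
)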
Fixes #690.
Summary
Adding short examples to each model docstring, making use of covariates when supported and pointing out some interesting hyper-parameters when relevant. Let me know if I should extend this to additional methods.
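For illustration, the added examples roughly follow this shape (the model and values below are placeholders; the exact content varies per docstring):

Examples
--------
>>> from darts.datasets import AirPassengersDataset
>>> from darts.models import NBEATSModel
>>> series = AirPassengersDataset().load()
>>> model = NBEATSModel(
>>>     input_chunk_length=12,
>>>     output_chunk_length=6,
>>>     n_epochs=50,
>>> )
>>> model.fit(series)
>>> pred = model.predict(6)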
Other Information
Fixing an inconsistency for Prophet; this model supports future covariates but they were not stored during the call to fit(), forcing the user to provide them again at inference.
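A sketch of the pattern this fix enables (the cyclic month covariate is only illustrative):

from darts.datasets import AirPassengersDataset
from darts.models import Prophet
from darts.utils.timeseries_generation import datetime_attribute_timeseries

series = AirPassengersDataset().load()
# covariates extended past the end of the series so they cover the forecast horizon
future_cov = datetime_attribute_timeseries(series, "month", cyclic=True, add_length=6)
model = Prophet()
model.fit(series, future_covariates=future_cov)
# the covariates passed to fit() are now stored on the model,
# so they no longer need to be provided again at prediction time
pred = model.predict(6)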