Hi, it seems there is an issue with the signals example scripts repo; it does not work correctly. To be more specific, `example_data_pipeline.py` crashes. Here are the logs:
```
(fastai) synth@zeus:~/repos/numerai-signals(feature/initial-signal)$ python example_data_pipeline.py
/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/opensignals/data/provider.py:224: FutureWarning: iteritems is deprecated and will be removed in a future version. Use .items instead.
  for start, tickers in ticker_missing_grouped.iteritems():
1/1 [00:00<00:00, 11.09tickers/s]
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 8.26tickers/s]
⠧ Generating features
Traceback (most recent call last):
  File "/home/synth/repos/numerai-signals/example_data_pipeline.py", line 56, in <module>
    main(args.output_dir)
  File "/home/synth/repos/numerai-signals/example_data_pipeline.py", line 29, in main
    train, test, live, feature_names = yahoo.get_data(db_dir,
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/opensignals/data/provider.py", line 169, in get_data
    ticker_data, feature_names_aux = features_generator.generate_features(ticker_data, feature_prefix)
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/opensignals/features.py", line 120, in generate_features
    ticker_data[col] = date_groups[feature_prefix_name].transform(
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/pandas/core/groupby/generic.py", line 445, in transform
    return self._transform(
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/pandas/core/groupby/groupby.py", line 1823, in _transform
    return self._transform_general(func, *args, **kwargs)
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/pandas/core/groupby/generic.py", line 478, in _transform_general
    res = func(group, *args, **kwargs)
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/opensignals/features.py", line 121, in <lambda>
    lambda group: pd.qcut(group, 5, labels=False, duplicates='drop')
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/pandas/core/reshape/tile.py", line 377, in qcut
    bins = np.quantile(x_np, quantiles)
  File "<__array_function__ internals>", line 200, in quantile
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/numpy/lib/function_base.py", line 4461, in quantile
    return _quantile_unchecked(
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/numpy/lib/function_base.py", line 4473, in _quantile_unchecked
    return _ureduce(a,
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/numpy/lib/function_base.py", line 3752, in _ureduce
    r = func(a, **kwargs)
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/numpy/lib/function_base.py", line 4639, in _quantile_ureduce_func
    result = _quantile(arr,
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/numpy/lib/function_base.py", line 4745, in _quantile
    take(arr, indices=-1, axis=DATA_AXIS)
  File "<__array_function__ internals>", line 200, in take
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/numpy/core/fromnumeric.py", line 190, in take
    return _wrapfunc(a, 'take', indices, axis=axis, out=out, mode=mode)
  File "/home/synth/mambaforge/envs/fastai/lib/python3.10/site-packages/numpy/core/fromnumeric.py", line 57, in _wrapfunc
    return bound(*args, **kwds)
IndexError: cannot do a non-empty take from an empty axes.
(fastai) synth@zeus:~/repos/numerai-signals(feature/initial-signal)$
```
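If it helps with triage: the failing call is `pd.qcut(group, 5, labels=False, duplicates='drop')` in `opensignals/features.py`. `qcut` drops NaNs before computing quantiles, so a date group whose feature values are all NaN leaves `np.quantile` with an empty array, which raises exactly this `IndexError`. A minimal repro of that guess (not confirmed against the repo's actual data):

```python
import numpy as np
import pandas as pd

# qcut drops NaNs before binning; an all-NaN group leaves an empty array,
# and np.quantile on an empty array raises the IndexError seen above.
group = pd.Series([np.nan, np.nan, np.nan])
pd.qcut(group, 5, labels=False, duplicates='drop')
# IndexError: cannot do a non-empty take from an empty axes.
```

With only one ticker downloaded here (see the 1/1 progress bars), the early dates plausibly have all-NaN rolling features, which would trigger this.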
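A possible guard, replacing the bare `qcut` lambda in `generate_features` (hypothetical patch; `safe_qcut` is my name for it, and I have not run it through the full pipeline):

```python
import numpy as np
import pandas as pd

def safe_qcut(group: pd.Series) -> pd.Series:
    # Return all-NaN for groups with no valid values instead of letting
    # qcut hand np.quantile an empty array.
    if group.notna().any():
        return pd.qcut(group, 5, labels=False, duplicates='drop')
    return pd.Series(np.nan, index=group.index)

# in opensignals/features.py, generate_features:
# ticker_data[col] = date_groups[feature_prefix_name].transform(safe_qcut)
```

Separately, the `FutureWarning` at the top of the log points at `provider.py` line 224: `Series.iteritems` is deprecated and was removed in pandas 2.0, so that loop will also break outright on newer pandas. The fix should just be:

```python
# provider.py line 224: .items() instead of the removed .iteritems()
for start, tickers in ticker_missing_grouped.items():
```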