Hi,
I found an issue with the xgboost example: https://github.com/thesps/conifer/blob/master/examples/xgboost_to_hls.py
`y_hls` and `y_xgb` aren't close:

```python
y_hls = expit(model.decision_function(X_test))
y_xgb = bst.predict(dtest)
diff = y_xgb - y_hls
print(diff[abs(diff) > 0.05])
```

```
[-0.13502171  0.06955624 -0.1099674  -0.2427507  -0.14311438 -0.0606428
  0.08703702 -0.054607   -0.41907781 -0.12813512  0.28282228 -0.21637464
  0.31876776  0.26711339 -0.14989728 -0.05887845 -0.06809392  0.12303647
 -0.08492118 -0.07751923 -0.05739652 -0.11599926 -0.14425865 -0.08459726
 -0.12540119 -0.06227853 -0.27874367 -0.29141373  0.12563779 -0.22311496
 -0.13287621 -0.17924546 -0.10041202]
```
Since the output is normalized to 1, an absolute error of up to 0.31 seems too high for practical use. Is this a known issue?
Hi, these examples are not at all optimized in terms of the precision used, which can affect numerical accuracy. Did you try any tuning?
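For reference, here is a minimal sketch of what tuning the precision could look like, following the structure of `xgboost_to_hls.py`. The `auto_config()` helper, the `vivadohls` backend name, and the `'Precision'` config key are assumptions based on my reading of conifer and may differ between versions; `bst`, `dtest` and `X_test` are the objects already defined in the example script. Please double-check the names against the conifer version you have installed.

```python
# Sketch: widen the fixed-point precision used by the HLS model and
# re-run the comparison from the issue. Backend/config names are assumptions.
import conifer
from scipy.special import expit

# Start from the backend's default configuration (assumed helper).
cfg = conifer.backends.vivadohls.auto_config()

# Widen the fixed-point type used internally. The example's default is
# narrower, which is one likely source of the mismatch; <32,16> is a
# deliberately generous choice to test that hypothesis.
cfg['Precision'] = 'ap_fixed<32,16>'

# Rebuild the emulation model from the trained xgboost Booster (bst).
model = conifer.model(bst, conifer.converters.xgboost,
                      conifer.backends.vivadohls, cfg)
model.compile()

# Repeat the comparison; the residual differences should shrink
# (or the printout should be empty) with the wider precision.
y_hls = expit(model.decision_function(X_test))
y_xgb = bst.predict(dtest)
diff = y_xgb - y_hls
print(diff[abs(diff) > 0.05])
```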
Nope, just the vanilla example out of the box.