[BUG] cast(9.95 as decimal(3,1)), actual: 9.9, expected: 10.0 #10809
Scope: make casting of floats to decimals feature-flag-off by default, and update the documentation accordingly with this example. Ref: spark.rapids.sql.castFloatToDecimal.enabled. Check whether the supported operators in tools need to be updated.
Filed a cudf issue: rapidsai/cudf#15862
Also take a look at the discussion around #9682 (comment) and above. 9.95 is not representable as a double:

```
$ jshell
|  Welcome to JShell -- Version 21.0.2
|  For an introduction type: /help intro

jshell> new BigDecimal(9.95)
$8 ==> 9.949999999999999289457264239899814128875732421875

jshell> new BigDecimal(9.949999999999999289457264239899814128875732421875)
$9 ==> 9.949999999999999289457264239899814128875732421875

jshell> new BigDecimal(9.95).setScale(1, BigDecimal.ROUND_HALF_UP)
$11 ==> 9.9

jshell> new BigDecimal(String.valueOf(9.95)).setScale(1, BigDecimal.ROUND_HALF_UP)
$1 ==> 10.0

jshell> new BigDecimal("9.95").setScale(1, BigDecimal.ROUND_HALF_UP)
$12 ==> 10.0
```
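The jshell session above can be condensed into a small standalone sketch (class name `CastRepro` is illustrative). The key point: `new BigDecimal(double)` captures the exact binary64 value, which for 9.95 is slightly below 9.95, so half-up rounding at scale 1 goes down; routing through the shortest round-trip string first recovers "9.95", which rounds up.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class CastRepro {
    public static void main(String[] args) {
        // The double literal 9.95 is stored as the nearest binary64 value,
        // which is slightly below the decimal value 9.95.
        BigDecimal exact = new BigDecimal(9.95);
        System.out.println(exact);
        // 9.949999999999999289457264239899814128875732421875

        // Rounding the exact binary value HALF_UP at scale 1 rounds DOWN,
        // because the digit being dropped is 4, not 5.
        System.out.println(exact.setScale(1, RoundingMode.HALF_UP)); // 9.9

        // Going through the shortest round-trip string first yields "9.95",
        // which then rounds up to 10.0 as users expect.
        BigDecimal viaString = new BigDecimal(Double.toString(9.95));
        System.out.println(viaString.setScale(1, RoundingMode.HALF_UP)); // 10.0
    }
}
```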
Okay, after reading through issue #9682 I realize that this issue is also one instance of it.
I tried the float => string => decimal path (it is very easy to implement in the plugin). It can pass the Spark UTs, but there are still some differences stemming from the known limits of Ryu float-to-string formatting and a few very rare edge cases in string-to-decimal parsing (#10890). I will post a PR and share some results for review next week, but I'm not sure whether the diffs are acceptable or the original way can match better.
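The float => string => decimal path described above can be sketched as follows. This is a minimal illustration, not the plugin's actual implementation; the helper name `castDoubleToDecimal` is hypothetical, and the assumption (matching Spark's non-ANSI cast semantics) is that rounding is HALF_UP and overflow produces null.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class FloatToDecimalSketch {
    // Hypothetical helper: cast a double to decimal(precision, scale) by first
    // rendering the double as its shortest round-trip string (what
    // Double.toString produces), then parsing and rounding that exact decimal.
    // Returns null on NaN/Infinity or overflow, mirroring Spark's
    // ANSI-off behavior.
    static BigDecimal castDoubleToDecimal(double d, int precision, int scale) {
        if (Double.isNaN(d) || Double.isInfinite(d)) {
            return null;
        }
        BigDecimal rounded = new BigDecimal(Double.toString(d))
                .setScale(scale, RoundingMode.HALF_UP);
        // Reject values whose rounded form needs more digits than the target type.
        return rounded.precision() > precision ? null : rounded;
    }

    public static void main(String[] args) {
        System.out.println(castDoubleToDecimal(9.95, 3, 1));   // 10.0
        System.out.println(castDoubleToDecimal(999.95, 3, 1)); // null (overflow)
    }
}
```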
That sounds good. I've also found a way to implement this in C++ that is very efficient, but I'm not sure it will pass the integration tests. I'll verify that and post a PR too. If you have a chance, please list the related tests that I can run to verify.
@thirtiseven Please test #10917 to see if you get any test failures. On my local env, all the unit tests and integration tests passed, but I'm not sure if I missed anything.
Hi @ttnghia, rapidsai/cudf#15905 has been merged; can we now fix this issue using that new approach?
No, that is not what we want. What we actually need is something from pmattione-nvidia/cudf#2, which will be added in NVIDIA/spark-rapids-jni#2078. I will work on it soon this week.
Hi @ttnghia, can we close this issue now that your PR has fixed it?
Thanks @GaryShen2008. The fix in JNI will be picked up by spark-rapids in #10917, so this issue will be closed by that PR.
Repro
```
~/dist/spark-3.3.0-bin-hadoop3/bin/spark-shell \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin \
  --conf spark.rapids.sql.test.enabled=true \
  --conf spark.rapids.sql.explain=ALL \
  --jars dist/target/rapids-4-spark_2.12-24.06.0-SNAPSHOT-cuda11.jar
```
Related to #9682