He found the root cause: in a rare case, we did not set spark.sql.legacy.allowNegativeScaleOfDecimal=true when creating a literal scalar.
If the Spark session was already initialized with this config, the cases pass.
If no Spark session has been initialized, the config value defaults to false, and creating the literal scalar fails.
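For readers unfamiliar with negative scale: a decimal with scale -5 stores values in multiples of 10^5, which is why a Decimal(34, -5) literal trips Spark's default (non-legacy) check. A minimal sketch using Python's standard decimal module illustrates the convention (`precision_and_scale` is a hypothetical helper for illustration, not a Spark API):

```python
from decimal import Decimal

def precision_and_scale(d: Decimal) -> tuple[int, int]:
    # Spark-style convention: scale is the negated exponent, so a value
    # like 1.23E+7 (a multiple of 100000) has scale -5.
    sign, digits, exponent = d.as_tuple()
    return (len(digits), -exponent)

print(precision_and_scale(Decimal("1.23E+7")))    # (3, -5): negative scale
print(precision_and_scale(Decimal("0.0000007")))  # (1, 7): ordinary positive scale
```

By default Spark rejects decimal types whose scale is negative; the legacy flag restores the pre-3.0 behavior of allowing them.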
Describe the bug
Spark reports the following error when creating a lit scalar while generating Decimal(34, -5) data (precision 34, scale -5).
Steps/Code to reproduce bug
Case 1, failed. Update `test_greatest` to the following, then run on Spark 311.
Case 2, passed. Only modify the parameter of the test case to add a `DecimalGen(7, 7)`.
Case 3, failed. Only comment out the tail lines.
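For context, the failure can be masked in ad-hoc runs by setting the legacy flag when the session is created (a sketch only; the exact launch command depends on your environment, and per the root cause above the real fix is to set the config when the plugin creates the literal scalar):

```shell
# Enable negative-scale decimals at session creation (Spark 3.x).
# Without this, building a literal of type Decimal(34, -5) fails.
spark-shell --conf spark.sql.legacy.allowNegativeScaleOfDecimal=true
```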
The error is from:
Expected behavior
The test cases should pass: creating the literal scalar for negative-scale decimal data should succeed whether or not a Spark session with this config already exists.
Environment details (please complete the following information)
Additional context
The detailed error is: