forked from NVIDIA/spark-rapids
Commit: Fix collection_ops_tests for Spark 4.0 [databricks] (NVIDIA#11414)
* Fix collection_ops_tests for Spark 4.0. Fixes NVIDIA#11011.

  This commit fixes the failures in `collection_ops_tests` on Spark 4.0.

  On all versions of Spark, when a Sequence is collected with rows that exceed MAX_INT, an exception is thrown indicating that the collected Sequence/array is larger than permissible. The versions of Spark differ in the contents of the exception message. On Spark 4.0, the error message contains more information than in all prior versions, including:
  1. The name of the op causing the error
  2. The errant sequence size

  This commit introduces a shim to make this new information available in the exception. Note that this shim does not fit cleanly in RapidsErrorUtils, because there are differences within major Spark versions. For instance, Spark 3.4.0 and 3.4.1 have a different message compared to 3.4.2 and 3.4.3. Likewise for 3.5.0, 3.5.1, and 3.5.2.

  Signed-off-by: MithunR <[email protected]>

* Fixed formatting error.

* Review comments: moved the construction of the long-sequence error strings into RapidsErrorUtils. This involved introducing several new RapidsErrorUtils classes, and using mix-ins of concrete implementations for the error-string construction.

* Added missing shim tag for 3.5.2.

* Review comments: fixed code style.

* Reformatted, per project guideline.

* Fixed missed whitespace problem.

---------

Signed-off-by: MithunR <[email protected]>
Showing 17 changed files with 392 additions and 145 deletions.
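The mix-in approach described in the commit message can be sketched in miniature. The names below are simplified stand-ins, not the actual plugin classes, and the Spark 4.0 message wording is hypothetical; MAX_ROUNDED_ARRAY_LENGTH is inlined with the value Spark's ByteArrayMethods uses (Int.MaxValue - 15).

```scala
// Minimal, self-contained sketch of the mix-in pattern described above.
// Hypothetical names: SequenceSizeErrorBuilder, TerseErrorBuilder,
// VerboseErrorBuilder, ErrorUtilsForSpark330, ErrorUtilsForSpark400.
object SequenceErrorShimSketch {
  // Spark caps array allocations just below Int.MaxValue.
  val MaxRoundedArrayLength: Int = Int.MaxValue - 15

  trait SequenceSizeErrorBuilder {
    def getTooLongSequenceErrorString(sequenceSize: Int, functionName: String): String
  }

  // Older Spark versions: neither the op name nor the attempted size
  // appears in the message.
  trait TerseErrorBuilder extends SequenceSizeErrorBuilder {
    def getTooLongSequenceErrorString(sequenceSize: Int, functionName: String): String =
      s"Too long sequence found. Should be <= $MaxRoundedArrayLength"
  }

  // Spark 4.0 style (wording invented for illustration): includes the op
  // name and the errant sequence size, as the commit message describes.
  trait VerboseErrorBuilder extends SequenceSizeErrorBuilder {
    def getTooLongSequenceErrorString(sequenceSize: Int, functionName: String): String =
      s"Can't create array of $sequenceSize elements in $functionName: " +
        s"exceeds limit $MaxRoundedArrayLength"
  }

  // Each per-version shim object mixes in the builder whose wording
  // matches that Spark version.
  object ErrorUtilsForSpark330 extends TerseErrorBuilder
  object ErrorUtilsForSpark400 extends VerboseErrorBuilder
}
```

Because the wording differs even within a minor line (3.4.0/3.4.1 versus 3.4.2/3.4.3), the builder traits are keyed by message wording rather than by Spark version, and each shim object composes the one it needs.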
48 changes: 48 additions & 0 deletions
...c/main/spark320/scala/com/nvidia/spark/rapids/shims/SequenceSizeTooLongErrorBuilder.scala
@@ -0,0 +1,48 @@
/*
 * Copyright (c) 2024, NVIDIA CORPORATION.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

/*** spark-rapids-shim-json-lines
{"spark": "320"}
{"spark": "321"}
{"spark": "321cdh"}
{"spark": "322"}
{"spark": "323"}
{"spark": "324"}
{"spark": "330"}
{"spark": "330cdh"}
{"spark": "330db"}
{"spark": "331"}
{"spark": "332"}
{"spark": "332cdh"}
{"spark": "332db"}
{"spark": "333"}
{"spark": "340"}
{"spark": "341"}
{"spark": "341db"}
{"spark": "350"}
spark-rapids-shim-json-lines ***/
package org.apache.spark.sql.rapids.shims

import org.apache.spark.unsafe.array.ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH

trait SequenceSizeTooLongErrorBuilder {

  def getTooLongSequenceErrorString(sequenceSize: Int, functionName: String): String = {
    // For these Spark versions, the sequence length and function name
    // do not appear in the exception message.
    s"Too long sequence found. Should be <= $MAX_ROUNDED_ARRAY_LENGTH"
  }
}
84 changes: 84 additions & 0 deletions
.../main/spark330/scala/org/apache/spark/sql/rapids/shims/RapidsErrorUtils330To334Base.scala
@@ -0,0 +1,84 @@
/*
 * Copyright (c) 2024, NVIDIA CORPORATION.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

/*** spark-rapids-shim-json-lines
{"spark": "330"}
{"spark": "330cdh"}
{"spark": "331"}
{"spark": "332"}
{"spark": "332cdh"}
{"spark": "333"}
{"spark": "334"}
spark-rapids-shim-json-lines ***/
package org.apache.spark.sql.rapids.shims

import org.apache.spark.SparkDateTimeException
import org.apache.spark.sql.catalyst.trees.Origin
import org.apache.spark.sql.errors.QueryExecutionErrors
import org.apache.spark.sql.internal.SQLConf
import org.apache.spark.sql.types.{DataType, Decimal, DecimalType}

trait RapidsErrorUtils330To334Base extends RapidsErrorUtilsFor330plus with RapidsQueryErrorUtils {

  def mapKeyNotExistError(
      key: String,
      keyType: DataType,
      origin: Origin): NoSuchElementException = {
    QueryExecutionErrors.mapKeyNotExistError(key, keyType, origin.context)
  }

  def invalidArrayIndexError(index: Int, numElements: Int,
      isElementAtF: Boolean = false): ArrayIndexOutOfBoundsException = {
    if (isElementAtF) {
      QueryExecutionErrors.invalidElementAtIndexError(index, numElements)
    } else {
      QueryExecutionErrors.invalidArrayIndexError(index, numElements)
    }
  }

  def arithmeticOverflowError(
      message: String,
      hint: String = "",
      errorContext: String = ""): ArithmeticException = {
    QueryExecutionErrors.arithmeticOverflowError(message, hint, errorContext)
  }

  def cannotChangeDecimalPrecisionError(
      value: Decimal,
      toType: DecimalType,
      context: String = ""): ArithmeticException = {
    QueryExecutionErrors.cannotChangeDecimalPrecisionError(
      value, toType.precision, toType.scale, context
    )
  }

  def overflowInIntegralDivideError(context: String = ""): ArithmeticException = {
    QueryExecutionErrors.arithmeticOverflowError(
      "Overflow in integral divide", "try_divide", context
    )
  }

  def sparkDateTimeException(infOrNan: String): SparkDateTimeException = {
    // These are the arguments required by the SparkDateTimeException class to create the
    // error message.
    val errorClass = "CAST_INVALID_INPUT"
    val messageParameters = Array("DOUBLE", "TIMESTAMP", SQLConf.ANSI_ENABLED.key)
    new SparkDateTimeException(errorClass, Array(infOrNan) ++ messageParameters)
  }

  def sqlArrayIndexNotStartAtOneError(): RuntimeException = {
    new ArrayIndexOutOfBoundsException("SQL array indices start at 1")
  }
}
24 changes: 24 additions & 0 deletions
sql-plugin/src/main/spark334/scala/org/apache/spark/sql/rapids/shims/RapidsErrorUtils.scala
@@ -0,0 +1,24 @@
/*
 * Copyright (c) 2024, NVIDIA CORPORATION.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

/*** spark-rapids-shim-json-lines
{"spark": "334"}
spark-rapids-shim-json-lines ***/
package org.apache.spark.sql.rapids.shims

object RapidsErrorUtils extends RapidsErrorUtils330To334Base
  with SequenceSizeTooLongUnsuccessfulErrorBuilder
35 changes: 35 additions & 0 deletions
...scala/org/apache/spark/sql/rapids/shims/SequenceSizeTooLongUnsuccessfulErrorBuilder.scala
@@ -0,0 +1,35 @@
/*
 * Copyright (c) 2024, NVIDIA CORPORATION.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

/*** spark-rapids-shim-json-lines
{"spark": "334"}
{"spark": "342"}
{"spark": "343"}
{"spark": "351"}
{"spark": "352"}
spark-rapids-shim-json-lines ***/
package org.apache.spark.sql.rapids.shims

import org.apache.spark.unsafe.array.ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH

trait SequenceSizeTooLongUnsuccessfulErrorBuilder {
  def getTooLongSequenceErrorString(sequenceSize: Int, functionName: String): String = {
    // The errant function's name does not feature in the exception message
    // prior to Spark 4.0. Neither does the attempted allocation size.
    "Unsuccessful try to create array with elements exceeding the array " +
      s"size limit $MAX_ROUNDED_ARRAY_LENGTH"
  }
}