Fixed some of the failing parquet_tests [databricks] #11429
Conversation
Signed-off-by: Raza Jafri <[email protected]>
@@ -35,15 +35,19 @@ def read_parquet_df(data_path):
 def read_parquet_sql(data_path):
     return lambda spark : spark.sql('select * from parquet.`{}`'.format(data_path))

+datetimeRebaseModeInWriteKey = 'spark.sql.legacy.parquet.datetimeRebaseModeInWrite' if is_before_spark_400() else 'spark.sql.parquet.datetimeRebaseModeInWrite'
All of the non-legacy versions of these configs appear to have been added in 3.0.0. Is there a reason we are not just switching over to using them instead?
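The version-gated key selection under review can be sketched in plain Python. This is only an illustration of the pattern in the diff; the stubbed `is_before_spark_400` helper here takes an explicit version tuple, whereas the real helper in the integration-test suite inspects the running Spark version.

```python
# Stand-in for the integration-test helper; the real one checks the
# version of the Spark cluster the tests are running against.
def is_before_spark_400(spark_version):
    return spark_version < (4, 0, 0)

def datetime_rebase_write_key(spark_version):
    # Use the legacy config name on Spark < 4.0.0 and the
    # non-legacy name on 4.0.0+, matching the diff above.
    if is_before_spark_400(spark_version):
        return 'spark.sql.legacy.parquet.datetimeRebaseModeInWrite'
    return 'spark.sql.parquet.datetimeRebaseModeInWrite'
```

The reviewer's question stands: if the non-legacy names exist on every supported version, the branch collapses to always returning the non-legacy key.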
@@ -47,7 +47,7 @@ case class GpuBatchScanExec(
   @transient override lazy val batch: Batch = if (scan == null) null else scan.toBatch
   // TODO: unify the equal/hashCode implementation for all data source v2 query plans.
   override def equals(other: Any): Boolean = other match {
-    case other: GpuBatchScanExec =>
+    case other: BatchScanExec =>
Why is this being changed? This is a GpuBatchScanExec. We don't want to be equal to non-GPU versions, do we?
Right. While debugging, I wasn't sure what was causing a failure, and after looking at the 330 shim I changed this but didn't change it back before submitting this PR.
I am adding that fix to this PR as well.
build
The failure in CI seems unrelated.
The test only reads a CSV, sorts it, and does a row count. It passes locally.
build |
build |
This PR contributes towards fixing #11024