[WIP] Move timezone check to each operator [databricks] #9482

Closed

Changes from 2 commits (42 commits total)

Commits:
d8e77b2  Add test cases for timezone awarded operators (Oct 19, 2023)
3f781a4  Move timezone check to each operator (Oct 19, 2023)
d5a6d7a  Merge branch 23.12 (Oct 27, 2023)
b3fa3ee  Update (Oct 27, 2023)
c31b2e3  debug (Oct 27, 2023)
a7c8996  debug (Oct 27, 2023)
2878c5c  Add timezone test mark (Oct 27, 2023)
705f8b5  Minor update (Nov 1, 2023)
882b751  Fix failed cmp case on Spark311; Restore a python import; minor changes (Nov 1, 2023)
aec893c  Fix failure on Databricks (Nov 2, 2023)
7f81644  Update test cases for Databricks (Nov 2, 2023)
bcc1f5b  Update test cases for Databricks (Nov 2, 2023)
505b72e  Fix delta lake test cases. (Nov 3, 2023)
07942ea  Fix delta lake test cases. (Nov 3, 2023)
3033bc3  Remove the skip logic when time zone is not UTC (Nov 7, 2023)
a852455  Add time zone config to set non-UTC (Nov 7, 2023)
0358cd4  Add fallback case for cast_test.py (Nov 7, 2023)
f6ccadd  Add fallback case for cast_test.py (Nov 7, 2023)
21d5a69  Add fallback case for cast_test.py (Nov 8, 2023)
e2aa9da  Add fallback case for cast_test.py (Nov 8, 2023)
9eab476  Update split_list (Nov 8, 2023)
e231a80  Add fallback case for cast_test.py (Nov 8, 2023)
71928a0  Add fallback case for cast_test.py (Nov 8, 2023)
ca23932  Add fallback cases for cmp_test.py (Nov 9, 2023)
ee60bea  Add fallback tests for json_test.py (firestarman, Nov 9, 2023)
d403c59  add non_utc fallback for parquet_write qa_select and window_function … (thirtiseven, Nov 9, 2023)
dd5ad0b  Add fallback tests for conditionals_test.py (winningsix, Nov 9, 2023)
058e13e  Add fallback cases for collection_ops_test.py (Nov 9, 2023)
fc3a678  add fallback tests for date_time_test (thirtiseven, Nov 9, 2023)
938c649  clean up spark_session.py (thirtiseven, Nov 9, 2023)
befa39d  Add fallback tests for explain_test and csv_test (winningsix, Nov 9, 2023)
cf2c621  Update test case (Nov 9, 2023)
c298d5f  update test case (Nov 9, 2023)
09e772c  Add default value (Nov 10, 2023)
f43a8f9  Remove useless is_tz_utc (Nov 10, 2023)
5882cc3  Fix fallback cases (Nov 10, 2023)
7a53dc2  Add bottom check for time zone; Fix ORC check (Nov 13, 2023)
7bd9ef8  By default, ExecCheck do not check UTC time zone (Nov 13, 2023)
9817c4e  For common expr like AttributeReference, just skip the UTC checking (Nov 13, 2023)
f8505b7  For common expr like AttributeReference, just skip the UTC checking (Nov 13, 2023)
fa1c84d  For common expr like AttributeReference, just skip the UTC checking (Nov 13, 2023)
fbbbd5b  Update test cases (Nov 14, 2023)
88 changes: 86 additions & 2 deletions integration_tests/src/main/python/date_time_test.py
@@ -11,10 +11,10 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

Collaborator: nit: one line break after license header.

Author: done

import pytest
from asserts import assert_gpu_and_cpu_are_equal_collect, assert_gpu_fallback_collect, assert_gpu_and_cpu_error

from data_gen import *
from asserts import assert_gpu_and_cpu_are_equal_collect, assert_gpu_and_cpu_are_equal_sql, assert_gpu_fallback_collect, assert_gpu_and_cpu_error, assert_spark_exception, with_gpu_session
from datetime import date, datetime, timezone
from marks import ignore_order, incompat, allow_non_gpu
from pyspark.sql.types import *
@@ -558,3 +558,87 @@ def test_timestamp_millis_long_overflow():
def test_timestamp_micros(data_gen):
assert_gpu_and_cpu_are_equal_collect(
lambda spark : unary_op_df(spark, data_gen).selectExpr("timestamp_micros(a)"))


# used by timezone test cases
def get_timezone_df(spark):
schema = StructType([
StructField("ts_str_col", StringType()),
StructField("long_col", LongType()),
StructField("ts_col", TimestampType()),
StructField("date_col", DateType()),
StructField("date_str_col", StringType()),
])
data = [
('1970-01-01 00:00:00', 0, datetime(1970, 1, 1), date(1970, 1, 1), '1970-01-01'),
('1970-01-01 00:00:00', 0, datetime(1970, 1, 1), date(1970, 1, 1), '1970-01-01'),
]
return spark.createDataFrame(SparkContext.getOrCreate().parallelize(data), schema)

# used by timezone test cases; lists all the SQLs that are impacted by a non-UTC timezone
time_zone_sql_conf_pairs = [
Collaborator: nit: there are some timezone-related functions (not supported yet) mentioned on the Spark built-in functions page. We could add some comments mentioning them here:

convert_timezone
-- SELECT convert_timezone('Europe/Brussels', 'America/Los_Angeles', timestamp_ntz'2021-12-06 00:00:00');
current_timezone()
make_timestamp()
make_timestamp_ltz()

Author: For current_timezone, it just returns the session timezone, so we can ignore it for this PR; the Spark config "spark.sql.session.timeZone" sets this value.

For MakeTimestamp and ConvertTimezone, this is recorded in the follow-on issue: #9570
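
As a quick illustration of that point, a hypothetical spark-shell snippet (not part of this PR): current_timezone() simply echoes the session conf, so no GPU kernel is involved.

// Hypothetical spark-shell check: current_timezone() returns whatever
// spark.sql.session.timeZone is set to.
spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
spark.sql("SELECT current_timezone()").show(truncate = false)
// expected to print: America/Los_Angeles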

("select minute(ts_col) from tab", {}),
("select second(ts_col) from tab", {}),
("select hour(ts_col) from tab", {}),
("select date_col + (interval 10 days 3 seconds) from tab", {}),
("select date_format(ts_col, 'yyyy-MM-dd HH:mm:ss') from tab", {}),
("select unix_timestamp(ts_col) from tab", {"spark.rapids.sql.improvedTimeOps.enabled": "true"}),
("select to_unix_timestamp(ts_str_col) from tab", {"spark.rapids.sql.improvedTimeOps.enabled": "false"}),
("select to_unix_timestamp(ts_col) from tab", {"spark.rapids.sql.improvedTimeOps.enabled": "true"}),
("select to_date(date_str_col, 'yyyy-MM-dd') from tab", {}), # test GpuGetTimestamp
("select to_date(date_str_col) from tab", {}),
("select from_unixtime(long_col, 'yyyy-MM-dd HH:mm:ss') from tab", {}),
("select cast(ts_col as string) from tab", {}), # cast
("select cast(ts_col as date) from tab", {}), # cast
("select cast(date_col as TIMESTAMP) from tab", {}), # cast
("select to_timestamp(ts_str_col) from tab", {"spark.rapids.sql.improvedTimeOps.enabled": "false"}),
("select to_timestamp(ts_str_col) from tab", {"spark.rapids.sql.improvedTimeOps.enabled": "true"}),
]


@allow_non_gpu("ProjectExec")
@pytest.mark.parametrize('sql, extra_conf', time_zone_sql_conf_pairs)
def test_timezone_for_operators_with_non_utc(sql, extra_conf):
# timezone is non-UTC, so these should fall back to CPU
timezone_conf = {"spark.sql.session.timeZone": "+08:00",
Collaborator: Should we make the time zone string a param to the test? Just because I would like to test a few more time zones than just +08:00.

Author: Done

"spark.rapids.sql.hasExtendedYearValues": "false",
"spark.rapids.sql.castStringToTimestamp.enabled": "true"}
all_conf = copy_and_update(timezone_conf, extra_conf)
def gen_sql_df(spark):
df = get_timezone_df(spark)
df.createOrReplaceTempView("tab")
return spark.sql(sql)
assert_gpu_fallback_collect(gen_sql_df, "ProjectExec", all_conf)


@pytest.mark.parametrize('sql, conf', time_zone_sql_conf_pairs)
def test_timezone_for_operators_with_utc(sql, conf):
# timezone is UTC, so these should be supported on GPU
timezone_conf = {"spark.sql.session.timeZone": "UTC",
"spark.rapids.sql.hasExtendedYearValues": "false",
"spark.rapids.sql.castStringToTimestamp.enabled": "true",}
conf = copy_and_update(timezone_conf, conf)
def gen_sql_df(spark):
df = get_timezone_df(spark)
df.createOrReplaceTempView("tab")
return spark.sql(sql)
assert_gpu_and_cpu_are_equal_collect(gen_sql_df, conf)


@allow_non_gpu("ProjectExec")
def test_timezone_for_operator_from_utc_timestamp_with_non_utc():
# timezone is non-UTC, so this should fall back to CPU
def gen_sql_df(spark):
df = get_timezone_df(spark)
df.createOrReplaceTempView("tab")
return spark.sql("select from_utc_timestamp(ts_col, '+08:00') from tab")
assert_gpu_fallback_collect(gen_sql_df, "ProjectExec")


def test_timezone_for_operator_from_utc_timestamp_with_utc():
# timezone is UTC, so this should be supported on GPU
def gen_sql_df(spark):
df = get_timezone_df(spark)
df.createOrReplaceTempView("tab")
return spark.sql("select from_utc_timestamp(ts_col, '+00:00') from tab").collect()
with_gpu_session(gen_sql_df)
2 changes: 1 addition & 1 deletion pom.xml
@@ -1045,7 +1045,7 @@
<arg>-Yno-adapted-args</arg>
<arg>-Ywarn-unused:imports,locals,patvars,privates</arg>
<arg>-Xlint:missing-interpolator</arg>
<arg>-Xfatal-warnings</arg>
<!-- <arg>-Xfatal-warnings</arg> -->
Collaborator: Nit: Revert this back when we try to commit it.

</args>
<jvmArgs>
<jvmArg>-Xms1024m</jvmArg>
sql-plugin/src/main/scala/com/nvidia/spark/rapids/GpuOverrides.scala
@@ -669,9 +669,7 @@ object GpuOverrides extends Logging {
case FloatType => true
case DoubleType => true
case DateType => true
case TimestampType =>
TypeChecks.areTimestampsSupported(ZoneId.systemDefault()) &&
TypeChecks.areTimestampsSupported(SQLConf.get.sessionLocalTimeZone)
case TimestampType => true
Collaborator: Do we need to consider the timezone check for the scan and writer parts? AFAIK, when scanning data from Parquet, spark.sql.session.timeZone is supposed to be respected.

If it applies, we should add some Python tests as well.

Author: This check is used by InternalColumnarRddConverter and HostToGpuCoalesceIterator.
Coalesce can handle non-UTC timestamps. Not sure about InternalColumnarRddConverter, but it seems to be OK as well.

Collaborator: I think we will need to check these. For me, anything that does not have a test showing it works fully in at least one other time zone must fall back to the CPU if it sees a timestamp that is not UTC.

Parquet, for example, has the rebase mode for older timestamps, which requires knowing the timezone to handle properly.
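
To make the rebase concern concrete, a hypothetical spark-shell sketch (the conf names are standard Spark; the path is illustrative). Under LEGACY rebase, Spark shifts ancient timestamps between the Julian and Proleptic Gregorian calendars, and that shift needs a time zone, so a non-UTC setting can change the values read from old files:

// Hypothetical illustration of the rebase concern, not part of this PR.
spark.conf.set("spark.sql.session.timeZone", "Asia/Shanghai")
spark.conf.set("spark.sql.parquet.datetimeRebaseModeInRead", "LEGACY")
spark.read.parquet("/data/written-by-spark-2.x").show() // illustrative path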

case StringType => true
case dt: DecimalType if allowDecimal => dt.precision <= DType.DECIMAL64_MAX_PRECISION
case NullType => allowNull
@@ -1655,6 +1653,9 @@ object GpuOverrides extends Logging {
willNotWorkOnGpu("interval months isn't supported")
}
}

// needs timezone support: check the timezone here
checkTimeZoneId(dateAddInterval.zoneId)
}

override def convertToGpu(lhs: Expression, rhs: Expression): GpuExpression =
@@ -1668,6 +1669,12 @@
.withPsNote(TypeEnum.STRING, "A limited number of formats are supported"),
TypeSig.STRING)),
(a, conf, p, r) => new UnixTimeExprMeta[DateFormatClass](a, conf, p, r) {

override def tagExprForGpu(): Unit = {
// needs timezone support: check the timezone here
checkTimeZoneId(a.zoneId)
}

override def convertToGpu(lhs: Expression, rhs: Expression): GpuExpression =
GpuDateFormatClass(lhs, rhs, strfFormat)
}
@@ -1682,6 +1689,12 @@
.withPsNote(TypeEnum.STRING, "A limited number of formats are supported"),
TypeSig.STRING)),
(a, conf, p, r) => new UnixTimeExprMeta[ToUnixTimestamp](a, conf, p, r) {

override def tagExprForGpu(): Unit = {
// needs timezone support: check the timezone here
checkTimeZoneId(a.zoneId)
}

override def convertToGpu(lhs: Expression, rhs: Expression): GpuExpression = {
if (conf.isImprovedTimestampOpsEnabled) {
// passing the already converted strf string for a little optimization
@@ -1701,6 +1714,12 @@
.withPsNote(TypeEnum.STRING, "A limited number of formats are supported"),
TypeSig.STRING)),
(a, conf, p, r) => new UnixTimeExprMeta[UnixTimestamp](a, conf, p, r) {

override def tagExprForGpu(): Unit = {
// needs timezone support: check the timezone here
checkTimeZoneId(a.zoneId)
}

override def convertToGpu(lhs: Expression, rhs: Expression): GpuExpression = {
if (conf.isImprovedTimestampOpsEnabled) {
// passing the already converted strf string for a little optimization
@@ -1715,6 +1734,11 @@
ExprChecks.unaryProject(TypeSig.INT, TypeSig.INT,
TypeSig.TIMESTAMP, TypeSig.TIMESTAMP),
(hour, conf, p, r) => new UnaryExprMeta[Hour](hour, conf, p, r) {

override def tagExprForGpu(): Unit = {
// needs timezone support: check the timezone here
checkTimeZoneId(hour.zoneId)
}

override def convertToGpu(expr: Expression): GpuExpression = GpuHour(expr)
}),
TypeSig.TIMESTAMP, TypeSig.TIMESTAMP),
(minute, conf, p, r) => new UnaryExprMeta[Minute](minute, conf, p, r) {

override def tagExprForGpu(): Unit = {
// needs timezone support: check the timezone here
checkTimeZoneId(minute.zoneId)
}

override def convertToGpu(expr: Expression): GpuExpression =
GpuMinute(expr)
}),
TypeSig.TIMESTAMP, TypeSig.TIMESTAMP),
(second, conf, p, r) => new UnaryExprMeta[Second](second, conf, p, r) {

override def tagExprForGpu(): Unit = {
Collaborator: Could we try to have a TimeZoneAwareExprMeta, or something similar, that makes it super simple to do this? We might even be able to bake it into ExprMeta itself, just by checking whether the class it wraps is also TimeZoneAware.

Collaborator: I'm guessing the best approach is to put it directly in ExprMeta, since otherwise we would have to mix in the TimeZoneAwareExprMeta for the different functions. Functions requiring a timezone will likely span the gamut of Unary/Binary/Ternary/Quaternary/Agg/etc.

Collaborator: Maybe wrap the check in a method and override it whenever a function starts supporting alternate timezones.
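
A rough sketch of that idea (hypothetical, not part of this PR; wrapped and willNotWorkOnGpu stand in for hooks the real metas already have):

import java.time.ZoneId
import org.apache.spark.sql.catalyst.expressions.{Expression, TimeZoneAwareExpression}

// Hypothetical sketch: detect TimeZoneAwareExpression once in the shared
// meta instead of overriding tagExprForGpu in every operator meta.
trait TimeZoneCheckedMeta {
  def wrapped: Expression                    // assumed: the wrapped Catalyst expression
  def willNotWorkOnGpu(reason: String): Unit // assumed: tags the plan for CPU fallback

  // Override to true once an expression gains non-UTC support.
  def supportsNonUtcTimeZone: Boolean = false

  final def tagTimeZone(): Unit = wrapped match {
    case tz: TimeZoneAwareExpression if !supportsNonUtcTimeZone &&
        tz.zoneId.normalized() != ZoneId.of("UTC").normalized() =>
      willNotWorkOnGpu(s"only UTC time zone is supported, found ${tz.zoneId}")
    case _ => // not timezone-aware (e.g. AttributeReference): nothing to check
  }
}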

Collaborator: Looks like GpuCast will be a first exception to this idea: #6835

Author: I'm running all the existing test cases with a non-UTC time zone config added, to identify all the failing cases:

https://github.com/NVIDIA/spark-rapids/blob/v23.10.0/integration_tests/src/main/python/spark_session.py#L68-L74

def _set_all_confs(conf):
    _spark.conf.set("spark.sql.session.timeZone", "+08:00")

Then I'll update the failed cases.

// needs timezone support: check the timezone here
checkTimeZoneId(second.zoneId)
}

override def convertToGpu(expr: Expression): GpuExpression =
GpuSecond(expr)
}),
@@ -1767,6 +1801,12 @@
.withPsNote(TypeEnum.STRING, "Only a limited number of formats are supported"),
TypeSig.STRING)),
(a, conf, p, r) => new UnixTimeExprMeta[FromUnixTime](a, conf, p, r) {

override def tagExprForGpu(): Unit = {
// needs timezone support: check the timezone here
checkTimeZoneId(a.zoneId)
}

override def convertToGpu(lhs: Expression, rhs: Expression): GpuExpression =
// passing the already converted strf string for a little optimization
GpuFromUnixTime(lhs, rhs, strfFormat)
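Throughout the diff above, each operator meta calls checkTimeZoneId from tagExprForGpu. For readers without the rest of the tree, a plausible shape for that helper (an assumed sketch; the real implementation lives on the meta classes):

import java.time.ZoneId

trait UtcOnlyCheck {
  def willNotWorkOnGpu(reason: String): Unit // assumed tagging hook

  // Tag the wrapped expression for CPU fallback unless the zone
  // normalizes to UTC.
  def checkTimeZoneId(sessionZoneId: ZoneId): Unit = {
    if (sessionZoneId.normalized() != ZoneId.of("UTC").normalized()) {
      willNotWorkOnGpu(s"only UTC zone id is supported, actual: $sessionZoneId")
    }
  }
}
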
13 changes: 6 additions & 7 deletions sql-plugin/src/main/scala/com/nvidia/spark/rapids/Plugin.scala
@@ -374,13 +374,12 @@ class RapidsExecutorPlugin extends ExecutorPlugin with Logging {
case Some(value) => ZoneId.of(value)
case None => throw new RuntimeException(s"Driver time zone cannot be determined.")
}
if (TypeChecks.areTimestampsSupported(driverTimezone)) {
Collaborator: This may be off-topic. Considering the configuration spark.sql.session.timeZone, should both the driver and the executor respect it? And if so, do we still need the check for a driver/executor timezone mismatch?

res-life (Author, Oct 27, 2023):

> Considering the configuration spark.sql.session.timeZone, should both driver and executor respect it?

Here driverTimezone comes from the driver's ZoneId.systemDefault(), not from spark.sql.session.timeZone; refer to: PR. Spark itself does not have this kind of check, but for spark-rapids we check that the executor and the driver have the same JVM time zone.

> Then do we still need the check on driver and executor's timezone mismatch?

I think yes, because we want to avoid the issue:

val executorTimezone = ZoneId.systemDefault()
if (executorTimezone.normalized() != driverTimezone.normalized()) {
  throw new RuntimeException(s" Driver and executor timezone mismatch. " +
    s"Driver timezone is $driverTimezone and executor timezone is " +
    s"$executorTimezone. Set executor timezone to $driverTimezone.")
}

val executorTimezone = ZoneId.systemDefault()
if (executorTimezone.normalized() != driverTimezone.normalized()) {
throw new RuntimeException(s" Driver and executor timezone mismatch. " +
s"Driver timezone is $driverTimezone and executor timezone is " +
s"$executorTimezone. Set executor timezone to $driverTimezone.")
}

GpuCoreDumpHandler.executorInit(conf, pluginContext)
sql-plugin/src/main/scala/com/nvidia/spark/rapids/RapidsMeta.scala
@@ -1095,11 +1095,6 @@ abstract class BaseExprMeta[INPUT <: Expression](
}
rule.getChecks.foreach(_.tag(this))
tagExprForGpu()
wrapped match {
case tzAware: TimeZoneAwareExpression if needTimezoneTagging =>
checkTimeZoneId(tzAware.zoneId)
case _ => // do nothing
}
}

/**
20 changes: 16 additions & 4 deletions sql-plugin/src/main/scala/com/nvidia/spark/rapids/TypeChecks.scala
@@ -363,8 +363,7 @@ final class TypeSig private(
case FloatType => check.contains(TypeEnum.FLOAT)
case DoubleType => check.contains(TypeEnum.DOUBLE)
case DateType => check.contains(TypeEnum.DATE)
case TimestampType if check.contains(TypeEnum.TIMESTAMP) =>
TypeChecks.areTimestampsSupported()
res-life (Author, Oct 27, 2023): Originally this was invoked by the shuffle meta, FileFormatChecks, AST tagging, and others.

• Shuffle meta: it's safe to remove this check, because shuffle definitely supports non-UTC timezones.
• FileFormatChecks: Spark always writes Parquet with UTC timestamps, so it's safe. For ORC, Spark maps the ORC type "timestamp with local time zone" to the Spark type TIMESTAMP_NTZ (no time zone), and spark-rapids does not currently support TIMESTAMP_NTZ, so it's safe to remove the check. Refer to link.
• Tag AST: not sure whether removing this UTC check is OK; this needs investigation.
case TimestampType => check.contains(TypeEnum.TIMESTAMP)
case StringType => check.contains(TypeEnum.STRING)
case dt: DecimalType =>
check.contains(TypeEnum.DECIMAL) &&
@@ -840,7 +839,7 @@ object TypeChecks {
areTimestampsSupported(ZoneId.systemDefault()) &&
areTimestampsSupported(SQLConf.get.sessionLocalTimeZone)
}

Collaborator: nit: extra space.

def isTimezoneSensitiveType(dataType: DataType): Boolean = {
dataType == TimestampType
}
@@ -1502,7 +1501,20 @@ class CastChecks extends ExprChecks {

def gpuCanCast(from: DataType, to: DataType): Boolean = {
val (checks, _) = getChecksAndSigs(from)
checks.isSupportedByPlugin(to)
checks.isSupportedByPlugin(to) && gpuCanCastConsiderTimezone(from, to)
}

def gpuCanCastConsiderTimezone(from: DataType, to: DataType): Boolean = {
// these casts need timezone support: check the timezone here
(from, to) match {
case (_:StringType, _:TimestampType) => TypeChecks.areTimestampsSupported()
case (_:TimestampType, _:StringType) => TypeChecks.areTimestampsSupported()
case (_:StringType, _:DateType) => TypeChecks.areTimestampsSupported()
case (_:DateType, _:StringType) => TypeChecks.areTimestampsSupported()
case (_:TimestampType, _:DateType) => TypeChecks.areTimestampsSupported()
case (_:DateType, _:TimestampType) => TypeChecks.areTimestampsSupported()
case _ => true
}
}
}
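
Illustrative usage of the new gate (a sketch assuming CastChecks can be instantiated directly): under a non-UTC session or JVM default zone, TypeChecks.areTimestampsSupported() is false, so the timezone-sensitive pairs above report unsupported and the cast falls back to the CPU.

import org.apache.spark.sql.types._

val checks = new CastChecks()
checks.gpuCanCast(StringType, TimestampType) // false when the zone is not UTC
checks.gpuCanCast(DateType, StringType)      // false when the zone is not UTC
checks.gpuCanCast(IntegerType, LongType)     // unaffected by time zone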

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2021-2022, NVIDIA CORPORATION.
* Copyright (c) 2021-2023, NVIDIA CORPORATION.
Collaborator: Not touched?

*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -38,6 +38,12 @@ object TimeStamp {
.withPsNote(TypeEnum.STRING, "A limited number of formats are supported"),
TypeSig.STRING)),
(a, conf, p, r) => new UnixTimeExprMeta[GetTimestamp](a, conf, p, r) {

override def tagExprForGpu(): Unit = {
// needs timezone support: check the timezone here
checkTimeZoneId(a.zoneId)
}

override def convertToGpu(lhs: Expression, rhs: Expression): GpuExpression = {
GpuGetTimestamp(lhs, rhs, sparkFormat, strfFormat)
}
@@ -1045,6 +1045,7 @@ class FromUTCTimestampExprMeta(
extends BinaryExprMeta[FromUTCTimestamp](expr, conf, parent, rule) {

override def tagExprForGpu(): Unit = {
// needs timezone support: check the timezone here
extractStringLit(expr.right) match {
case None =>
willNotWorkOnGpu("timezone input must be a literal string")
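
For reference, from_utc_timestamp interprets its input as a UTC instant and renders it in the given zone, which is why the zone argument must be a literal string known at planning time (per the check above). A hypothetical spark-shell check:

// Hypothetical check, not part of this PR: the UTC instant rendered at +08:00.
spark.sql("SELECT from_utc_timestamp(timestamp'2023-01-01 00:00:00', '+08:00')").show()
// expected to print: 2023-01-01 08:00:00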