[Spark] Fix Utils.isTesting calls to use Delta implementation #4074

Open · wants to merge 1 commit into base: master
Conversation

@felipepessoto (Contributor) commented on Jan 18, 2025

Which Delta project/connector is this regarding?

  • Spark
  • Standalone
  • Flink
  • Kernel
  • Other (fill in here)

Description

Some classes were using org.apache.spark.util.Utils instead of org.apache.spark.sql.delta.util.Utils.

The Spark implementation checks the SPARK_TESTING environment variable, while the Delta implementation checks DELTA_TESTING.

Spark:
https://github.com/apache/spark/blob/51fb84a54982719209c19136b1d72d2ef44726ee/core/src/main/scala/org/apache/spark/util/Utils.scala#L1878

Delta:

System.getenv("DELTA_TESTING") != null

This means the unit tests are currently exercising the non-test code path, because we only set DELTA_TESTING:

In delta/build.sbt (line 466 at commit 221d95c):

    Test / envVars += ("DELTA_TESTING", "1"),

and similarly when running tests via Python:

    run_cmd(["python3", python_test_script], env={'DELTA_TESTING': '1'}, stream_output=True)

How was this patch tested?

Unit tests

Does this PR introduce any user-facing changes?

No

@felipepessoto changed the title from [Spark] Fix Utils.isTesting to use Delta implementation to [Spark] Fix Utils.isTesting calls to use Delta implementation on Jan 18, 2025