ion-elgreco changed the title from "Issue with reading timestamps from databricks Delta tables" to "Issue with reading timestamps from spark Delta tables" on Jan 23, 2025
@kejtos this is because Spark still writes timestamps with an old Parquet type (INT96), which is discouraged. You should set the proper Spark config so your tables don't end up with the wrong timestamp type.
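A sketch of the relevant setting (assuming the config the comment refers to is Spark's Parquet timestamp output type): `spark.sql.parquet.outputTimestampType` controls how Spark writes timestamps to Parquet, and setting it to `TIMESTAMP_MICROS` avoids the legacy INT96 encoding:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    # Write timestamps as TIMESTAMP_MICROS instead of the legacy INT96 type,
    # so delta-rs can read them without a lossy nanosecond cast.
    .config("spark.sql.parquet.outputTimestampType", "TIMESTAMP_MICROS")
    .getOrCreate()
)
```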
Environment
Delta-rs version: 0.24.0
Binding: Python
Environment:
Bug
What happened:
The following code on Databricks:
results in the following error:
```
ComputeError: ArrowInvalid: Casting from timestamp[ns] to timestamp[us, tz=UTC] would lose data: -number
```
What you expected to happen:
Reading the timestamp in its original format, or coercing the cast (I do not need such precision anyway).
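For illustration of what coercing the cast means here (a plain PyArrow sketch, not a delta-rs option confirmed by this issue): casting with `safe=False` truncates the extra nanosecond precision instead of raising `ArrowInvalid`:

```python
import pyarrow as pa

# A nanosecond-precision timestamp column, as Spark/Databricks tables may contain.
table = pa.table({"ts": pa.array([1234567890123456789], type=pa.timestamp("ns"))})

# An unsafe cast truncates sub-microsecond precision instead of raising ArrowInvalid.
coerced = table.cast(pa.schema([("ts", pa.timestamp("us", tz="UTC"))]), safe=False)
```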
How to reproduce it:
Run any code against a Databricks-written Delta table that reads timestamp[ns] columns, e.g. a plain read like the sketch below.
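A minimal reproduction along these lines (the table path is a hypothetical placeholder; either read is assumed to go through the same internal cast):

```python
import polars as pl
from deltalake import DeltaTable

# Hypothetical path to a Databricks-written Delta table containing INT96/nanosecond timestamps.
table_path = "/dbfs/path/to/delta_table"

# Either of these raises once a timestamp[ns] column must be cast to timestamp[us, tz=UTC].
df = pl.read_delta(table_path)
df = DeltaTable(table_path).to_pandas()
```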