I was using save_axs_table() to save the new ZTF data and passed the path= argument to write it to the new disk drives. Spark then deleted everything under the path I gave it, including a number of unrelated directories and all of the ZTF data I was trying to import.
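For reference, the call was along these lines (a rough sketch, not the exact invocation; the DataFrame variable, table name, and target directory below are placeholders, and the precise argument order for save_axs_table() may differ):

```python
from pyspark.sql import SparkSession
from axs import AxsCatalog

spark = SparkSession.builder.getOrCreate()
catalog = AxsCatalog(spark)

# ztf_df holds the newly imported ZTF data (placeholder name)
catalog.save_axs_table(ztf_df, "ztf_new", path="/data/new_drives/ztf")
# Spark cleared out everything under the directory passed via path=,
# not just the files it was about to write there.
```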
Let's leave this open until we have some kind of safeguard. I agree that this is how Spark works, but I lost a few days of work to this, and it could be much worse if someone specifies their home directory, for example. It's a very risky behavior to expose to the user.
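One possible safeguard, sketched below purely as an illustration (the helper name and behavior are hypothetical, not existing AXS code, and it assumes path= points at a local filesystem directory): refuse to save unless the target directory is absent or empty, so a wrong path can never wipe out unrelated data.

```python
import os

def require_empty_dir(path):
    """Hypothetical guard: raise if the target directory exists and is
    non-empty, so a save can never clobber unrelated data."""
    if os.path.isdir(path) and os.listdir(path):
        raise ValueError(
            f"Refusing to write to non-empty directory {path!r}; "
            "move or remove its contents first."
        )
    return path

# Usage sketch (catalog, ztf_df, and the directory are placeholders):
# catalog.save_axs_table(ztf_df, "ztf_new",
#                        path=require_empty_dir("/data/new_drives/ztf"))
```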