Update to xarray 2023.1.0 #303
Conversation
May help speed up copying.
xarray behaviour changed, see pydata/xarray#6999 and links from there.
Updates to xarray meant that 'index' values were not updated when 'coordinate' values were updated. Fixed by explicitly updating indexes.
This is the last release that supports Python 3.9.
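A minimal sketch of the issue (illustrative only, not the PR's actual code): since xarray's explicit-indexes refactor, re-assigning a dimension coordinate rebuilds its index, whereas mutating the coordinate's array in place can leave the index stale.

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(coords={"x": np.arange(4.0)})
new_x = ds["x"].values + 10.0

# Re-assigning the coordinate (rather than mutating its array in place)
# makes xarray rebuild the associated index, keeping .sel() consistent:
ds = ds.assign_coords(x=new_x)
assert list(ds.indexes["x"]) == list(new_x)
```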
Accessing e.g. `ds.dims['x']` produces:

```
FutureWarning: The return type of `Dataset.dims` will be changed to return a set of dimension names in future, in order to be more consistent with `DataArray.dims`. To access a mapping from dimension names to lengths, please use `Dataset.sizes`.
```

This commit changes `dims` to `sizes`.
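For illustration, the warning-free equivalent (a sketch assuming an xarray version where `Dataset.dims` is still a mapping):

```python
import numpy as np
import xarray as xr

ds = xr.Dataset({"n": (("x",), np.zeros(5))})

nx = ds.sizes["x"]  # warning-free replacement for ds.dims["x"]
assert nx == 5
```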
Xarray's open_mfdataset is very slow for BOUT++ datasets. Other datasets also have issues (see e.g. pydata/xarray#1385), though the cause may not be the same. Using an implementation that opens the datasets and concatenates them significantly speeds up this process; see the sketch below.
Using the same concatenation options as in the custom implementation gives essentially the same performance: much faster than the original.
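A rough sketch of the open-then-concatenate approach (not xbout's actual implementation; file pattern and concatenation dimension are hypothetical):

```python
import glob

import xarray as xr

# Hypothetical per-processor BOUT++ dump files
paths = sorted(glob.glob("BOUT.dmp.*.nc"))

# Open each file individually ...
datasets = [xr.open_dataset(p) for p in paths]

# ... then concatenate along one dimension ("x" here is illustrative).
# "minimal"/"override" skip the expensive per-variable compatibility
# checks that make a naive open_mfdataset call slow:
ds = xr.concat(
    datasets,
    dim="x",
    data_vars="minimal",
    coords="minimal",
    compat="override",
)
```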
- Unused import (numpy.unique)
- Duplicated keyword
- black formatting
Hello @bendudson! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found: There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2024-11-07 09:45:27 UTC
Mainly lines too long.
_BOUT_PER_PROC_VARIABLES are no longer used in xbout, since duplicated variables are just taken from the first processor.
Timing info is now taken from the first processor, rather than dropped.
By eye there are three subplots as required, but there are also some spurious axes. Unclear to me how to fix, so skipping for now.
xarray 2024.07.0 requires python >= 3.9
Tests pass locally, all failing on GitHub Actions. Investigating...
Reduce minimum xarray to 2023.1.0. Python 3.8 is EoL, but only just: Oct 7, 2024.
`_trim` no longer removes `wtime` timing info.
Previously these were scalar `xarray.DataArray`s, which caused `linspace` to fail due to missing dimensions.
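A hedged sketch of the kind of failure and fix (the PR's actual variables are not shown here): converting the scalar DataArrays to plain floats before calling linspace avoids the problem.

```python
import numpy as np
import xarray as xr

lower = xr.DataArray(0.0)  # 0-d (scalar) DataArray with no dimensions
upper = xr.DataArray(1.0)

# Extracting plain Python floats sidesteps the missing-dimensions problem:
grid = np.linspace(float(lower), float(upper), 11)
```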
Currently, 3.13 is hanging when running these tests
Last commit didn't rerun tests for some reason. 3.12 is failing with a segfault (!!):
Thanks @ZedThree! The segfault is strange. Something in NetCDF? Possibly related to file locking? I was also getting segfaults (on python 3.9) when passing
I've not checked which test it is exactly, but 3.13 hangs on
Tests seem to run about a factor of 2 slower than previously (e.g. https://github.com/boutproject/xBOUT/actions/runs/6983119192/usage about 30 mins). I think it's still worth merging so we can gradually address the performance issues.
Here is a log from valgrind and python 3.13. In this case it did not crash, and ran for 2 hours ... The first error is from netCDF:
There are also errors in matplotlib and numpy, but most are from netCDF. It might well be that netCDF does not initialise memory that numpy and matplotlib later use.
Any chance you could build netcdf locally with debug flags?
Let me try again to install debug symbols ...
I think I got the debug info installed; a new build is running:
I have debug symbols for netCDF working; if you want others I can also have them installed ...
And python3.13 symbols, in case they are helpful:
Here is another test run without LTO for python3-netCDF4, as well as with locking disabled for HDF5: that looks pretty much the same, so I think it is worth escalating to the netCDF team ...
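For reference, HDF5's file locking can be disabled via its documented environment variable; it must be set before the HDF5 library is first loaded. A minimal sketch:

```python
import os

# Must be set before netCDF4/h5py (and hence libhdf5) is first loaded:
os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE"

import netCDF4  # noqa: E402  deliberately imported after setting the variable
```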
I've removed the
A complete failure that kills python entirely:
An HDF5 error (via netCDF) that raises an exception:
and a bit later when reporting captured warnings:
And then maybe a memory issue in
I say maybe because the test doesn't actually seem to fail, and there are lots of these uninitialised value errors generated. Both the explicit errors are in
Next step really is to reproduce it locally somehow. Maybe start with the files it's using and see if there's anything weird with them.
Builds on @johnomotani's #276
I think this also replaces @mikekryjak's #289: any variables, including wtime variables, that are defined on multiple processors are just taken from processor 0.
Changing the options passed to `xarray.open_mfdataset` significantly speeds up opening datasets. Example: 6x faster (3 seconds vs 18):
Previously:
Now:
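The exact calls are not reproduced above; the following is a sketch of the kind of change, using the options xarray documents for speeding up `open_mfdataset` (assumed, not necessarily the PR's exact arguments; the file pattern and concatenation dimension are hypothetical):

```python
import xarray as xr

files = "BOUT.dmp.*.nc"  # hypothetical glob pattern

# Previously (slow): default options compare every variable across files
ds_slow = xr.open_mfdataset(files)

# Now (fast): skip per-variable compatibility checks, taking metadata
# from the first dataset (options from xarray's performance docs; the
# PR's exact arguments are not reproduced here)
ds_fast = xr.open_mfdataset(
    files,
    combine="nested",
    concat_dim="t",  # hypothetical concatenation dimension
    data_vars="minimal",
    coords="minimal",
    compat="override",
)
```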
Upgrades to the most recent version of xarray that still supports Python 3.8. Note: 3.8 is end of life as of Oct 7, 2024.