Commit
Merge branch 'main' into numpy2-array-api
keewis authored Apr 28, 2024
2 parents aa3dea8 + 214d941 commit 52c61ab
Showing 43 changed files with 917 additions and 1,252 deletions.
1 change: 0 additions & 1 deletion .binder/environment.yml
@@ -28,7 +28,6 @@ dependencies:
   - pip
   - pooch
   - pydap
-  - pynio
   - rasterio
   - scipy
   - seaborn
1 change: 1 addition & 0 deletions MANIFEST.in
@@ -1 +1,2 @@
+prune xarray/datatree_*
 recursive-include xarray/datatree_/datatree *.py
4 changes: 2 additions & 2 deletions ci/requirements/bare-minimum.yml
@@ -12,5 +12,5 @@ dependencies:
   - pytest-xdist
   - pytest-timeout
   - numpy=1.23
-  - packaging=22.0
-  - pandas=1.5
+  - packaging=23.1
+  - pandas=2.0
22 changes: 11 additions & 11 deletions ci/requirements/min-all-deps.yml
@@ -9,13 +9,13 @@ dependencies:
   # doc/user-guide/installing.rst, doc/user-guide/plotting.rst and setup.py.
   - python=3.9
   - array-api-strict=1.0  # dependency for testing the array api compat
-  - boto3=1.24
+  - boto3=1.26
   - bottleneck=1.3
   - cartopy=0.21
   - cftime=1.6
   - coveralls
-  - dask-core=2022.12
-  - distributed=2022.12
+  - dask-core=2023.4
+  - distributed=2023.4
   # Flox > 0.8 has a bug with numbagg versions
   # It will require numbagg > 0.6
   # so we should just skip that series eventually
@@ -25,24 +25,24 @@ dependencies:
   # h5py and hdf5 tend to cause conflicts
   # for e.g. hdf5 1.12 conflicts with h5py=3.1
   # prioritize bumping other packages instead
-  - h5py=3.7
+  - h5py=3.8
   - hdf5=1.12
   - hypothesis
   - iris=3.4
   - lxml=4.9  # Optional dep of pydap
-  - matplotlib-base=3.6
+  - matplotlib-base=3.7
   - nc-time-axis=1.4
   # netcdf follows a 1.major.minor[.patch] convention
   # (see https://github.com/Unidata/netcdf4-python/issues/1090)
   - netcdf4=1.6.0
   - numba=0.56
   - numbagg=0.2.1
   - numpy=1.23
-  - packaging=22.0
-  - pandas=1.5
+  - packaging=23.1
+  - pandas=2.0
   - pint=0.22
   - pip
-  - pydap=3.3
+  - pydap=3.4
   - pytest
   - pytest-cov
   - pytest-env
@@ -51,7 +51,7 @@ dependencies:
   - rasterio=1.3
   - scipy=1.10
   - seaborn=0.12
-  - sparse=0.13
+  - sparse=0.14
   - toolz=0.12
-  - typing_extensions=4.4
-  - zarr=2.13
+  - typing_extensions=4.5
+  - zarr=2.14
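The version floors raised in this file can be sanity-checked with `packaging.version` (the same `packaging` library whose own minimum is bumped here). A minimal sketch; the `bumps` mapping is illustrative, transcribed from the diff above:

```python
from packaging.version import Version

# Old -> new minimum versions, transcribed from min-all-deps.yml
bumps = {
    "boto3": ("1.24", "1.26"),
    "dask-core": ("2022.12", "2023.4"),
    "h5py": ("3.7", "3.8"),
    "pandas": ("1.5", "2.0"),
    "zarr": ("2.13", "2.14"),
}

for name, (old, new) in bumps.items():
    # PEP 440 comparison handles calendar versions like 2022.12 correctly
    assert Version(new) > Version(old), f"{name} floor did not move forward"
```

`Version` comparison avoids the string-comparison trap where `"3.10" < "3.8"` lexicographically.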
3 changes: 0 additions & 3 deletions doc/getting-started-guide/installing.rst
@@ -31,9 +31,6 @@ For netCDF and IO
 - `pydap <https://www.pydap.org>`__: used as a fallback for accessing OPeNDAP
 - `h5netcdf <https://github.com/h5netcdf/h5netcdf>`__: an alternative library for
   reading and writing netCDF4 files that does not use the netCDF-C libraries
-- `PyNIO <https://www.pyngl.ucar.edu/Nio.shtml>`__: for reading GRIB and other
-  geoscience specific file formats. Note that PyNIO is not available for Windows and
-  that the PyNIO backend may be moved outside of xarray in the future.
 - `zarr <https://zarr.readthedocs.io>`__: for chunked, compressed, N-dimensional arrays.
 - `cftime <https://unidata.github.io/cftime>`__: recommended if you
   want to encode/decode datetimes for non-standard calendars or dates before
21 changes: 0 additions & 21 deletions doc/user-guide/io.rst
@@ -1294,27 +1294,6 @@ We recommend installing cfgrib via conda::
 
 .. _cfgrib: https://github.com/ecmwf/cfgrib
 
-.. _io.pynio:
-
-Formats supported by PyNIO
---------------------------
-
-.. warning::
-
-    The `PyNIO backend is deprecated`_. `PyNIO is no longer maintained`_.
-
-Xarray can also read GRIB, HDF4 and other file formats supported by PyNIO_,
-if PyNIO is installed. To use PyNIO to read such files, supply
-``engine='pynio'`` to :py:func:`open_dataset`.
-
-We recommend installing PyNIO via conda::
-
-    conda install -c conda-forge pynio
-
-.. _PyNIO: https://www.pyngl.ucar.edu/Nio.shtml
-.. _PyNIO backend is deprecated: https://github.com/pydata/xarray/issues/4491
-.. _PyNIO is no longer maintained: https://github.com/NCAR/pynio/issues/53
-
 
 CSV and other formats supported by pandas
 -----------------------------------------
34 changes: 30 additions & 4 deletions doc/whats-new.rst
@@ -29,9 +29,31 @@ New Features
   for example, will retain the object. However, one cannot do operations that are not possible on the `ExtensionArray`
   then, such as broadcasting.
   By `Ilan Gold <https://github.com/ilan-gold>`_.
+- Added the option to avoid automatically creating 1D pandas indexes in :py:meth:`Dataset.expand_dims()`, by passing the new kwarg
+  `create_index=False`. (:pull:`8960`)
+  By `Tom Nicholas <https://github.com/TomNicholas>`_.
 
 Breaking changes
 ~~~~~~~~~~~~~~~~
+- The PyNIO backend has been deleted (:issue:`4491`, :pull:`7301`).
+  By `Deepak Cherian <https://github.com/dcherian>`_.
+
+- The minimum versions of some dependencies were changed, in particular our minimum supported pandas version is now Pandas 2.
+
+  ===================== ========= =======
+  Package               Old       New
+  ===================== ========= =======
+  dask-core             2022.12   2023.4
+  distributed           2022.12   2023.4
+  h5py                  3.7       3.8
+  matplotlib-base       3.6       3.7
+  packaging             22.0      23.1
+  pandas                1.5       2.0
+  pydap                 3.3       3.4
+  sparse                0.13      0.14
+  typing_extensions     4.4       4.5
+  zarr                  2.13      2.14
+  ===================== ========= =======
 
 
 Bug fixes
@@ -40,12 +62,17 @@ Bug fixes
 
 Internal Changes
 ~~~~~~~~~~~~~~~~
-- Migrates ``formatting_html`` functionality for `DataTree` into ``xarray/core`` (:pull: `8930`)
+- Migrates ``formatting_html`` functionality for ``DataTree`` into ``xarray/core`` (:pull: `8930`)
   By `Eni Awowale <https://github.com/eni-awowale>`_, `Julia Signell <https://github.com/jsignell>`_
   and `Tom Nicholas <https://github.com/TomNicholas>`_.
 - Migrates ``datatree_mapping`` functionality into ``xarray/core`` (:pull:`8948`)
   By `Matt Savoie <https://github.com/flamingbear>`_ `Owen Littlejohns
-  <https://github.com/owenlittlejohns>` and `Tom Nicholas <https://github.com/TomNicholas>`_.
+  <https://github.com/owenlittlejohns>`_ and `Tom Nicholas <https://github.com/TomNicholas>`_.
+- Migrates ``extensions``, ``formatting`` and ``datatree_render`` functionality for
+  ``DataTree`` into ``xarray/core``. Also migrates ``testing`` functionality into
+  ``xarray/testing/assertions`` for ``DataTree``. (:pull:`8967`)
+  By `Owen Littlejohns <https://github.com/owenlittlejohns>`_ and
+  `Tom Nicholas <https://github.com/TomNicholas>`_.
 
 
 .. _whats-new.2024.03.0:
@@ -6806,8 +6833,7 @@ Enhancements
   datasets with a MultiIndex to a netCDF file. User contributions in this
   area would be greatly appreciated.
 
-- Support for reading GRIB, HDF4 and other file formats via PyNIO_. See
-  :ref:`io.pynio` for more details.
+- Support for reading GRIB, HDF4 and other file formats via PyNIO_.
 - Better error message when a variable is supplied with the same name as
   one of its dimensions.
 - Plotting: more control on colormap parameters (:issue:`642`). ``vmin`` and
11 changes: 3 additions & 8 deletions pyproject.toml
@@ -24,8 +24,8 @@ requires-python = ">=3.9"
 
 dependencies = [
   "numpy>=1.23",
-  "packaging>=22",
-  "pandas>=1.5",
+  "packaging>=23.1",
+  "pandas>=2.0",
 ]
 
 [project.optional-dependencies]
@@ -88,7 +88,7 @@ exclude_lines = ["pragma: no cover", "if TYPE_CHECKING"]
 enable_error_code = "redundant-self"
 exclude = [
   'xarray/util/generate_.*\.py',
-  'xarray/datatree_/.*\.py',
+  'xarray/datatree_/doc/.*\.py',
 ]
 files = "xarray"
 show_error_codes = true
@@ -97,11 +97,6 @@ warn_redundant_casts = true
 warn_unused_configs = true
 warn_unused_ignores = true
 
-# Ignore mypy errors for modules imported from datatree_.
-[[tool.mypy.overrides]]
-ignore_errors = true
-module = "xarray.datatree_.*"
-
 # Much of the numerical computing stack doesn't have type annotations yet.
 [[tool.mypy.overrides]]
 ignore_missing_imports = true
2 changes: 0 additions & 2 deletions xarray/backends/__init__.py
@@ -15,7 +15,6 @@
 from xarray.backends.netCDF4_ import NetCDF4BackendEntrypoint, NetCDF4DataStore
 from xarray.backends.plugins import list_engines, refresh_engines
 from xarray.backends.pydap_ import PydapBackendEntrypoint, PydapDataStore
-from xarray.backends.pynio_ import NioDataStore
 from xarray.backends.scipy_ import ScipyBackendEntrypoint, ScipyDataStore
 from xarray.backends.store import StoreBackendEntrypoint
 from xarray.backends.zarr import ZarrBackendEntrypoint, ZarrStore
@@ -30,7 +29,6 @@
     "InMemoryDataStore",
     "NetCDF4DataStore",
     "PydapDataStore",
-    "NioDataStore",
     "ScipyDataStore",
     "H5NetCDFStore",
     "ZarrStore",
19 changes: 9 additions & 10 deletions xarray/backends/api.py
@@ -61,7 +61,7 @@
 T_NetcdfEngine = Literal["netcdf4", "scipy", "h5netcdf"]
 T_Engine = Union[
     T_NetcdfEngine,
-    Literal["pydap", "pynio", "zarr"],
+    Literal["pydap", "zarr"],
     type[BackendEntrypoint],
     str,  # no nice typing support for custom backends
     None,
@@ -79,7 +79,6 @@
     "scipy": backends.ScipyDataStore,
     "pydap": backends.PydapDataStore.open,
     "h5netcdf": backends.H5NetCDFStore.open,
-    "pynio": backends.NioDataStore,
     "zarr": backends.ZarrStore.open_group,
 }
 
@@ -420,8 +419,8 @@ def open_dataset(
         ends with .gz, in which case the file is gunzipped and opened with
         scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
         objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
-    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
-        "zarr", None}, installed backend \
+    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "zarr", None}\
+        , installed backend \
         or subclass of xarray.backends.BackendEntrypoint, optional
         Engine to use when reading files. If not provided, the default engine
         is chosen based on available dependencies, with a preference for
@@ -523,7 +522,7 @@ def open_dataset(
         relevant when using dask or another form of parallelism. By default,
         appropriate locks are chosen to safely read and write files with the
         currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
-        "scipy", "pynio".
+        "scipy".
 
     See engine open function for kwargs accepted by each specific engine.
@@ -627,8 +626,8 @@ def open_dataarray(
         ends with .gz, in which case the file is gunzipped and opened with
         scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
         objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
-    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
-        "zarr", None}, installed backend \
+    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "zarr", None}\
+        , installed backend \
         or subclass of xarray.backends.BackendEntrypoint, optional
         Engine to use when reading files. If not provided, the default engine
         is chosen based on available dependencies, with a preference for
@@ -728,7 +727,7 @@ def open_dataarray(
         relevant when using dask or another form of parallelism. By default,
         appropriate locks are chosen to safely read and write files with the
         currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
-        "scipy", "pynio".
+        "scipy".
 
     See engine open function for kwargs accepted by each specific engine.
@@ -897,8 +896,8 @@ def open_mfdataset(
         If provided, call this function on each dataset prior to concatenation.
         You can find the file-name from which each dataset was loaded in
         ``ds.encoding["source"]``.
-    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
-        "zarr", None}, installed backend \
+    engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "zarr", None}\
+        , installed backend \
         or subclass of xarray.backends.BackendEntrypoint, optional
         Engine to use when reading files. If not provided, the default engine
         is chosen based on available dependencies, with a preference for