Merge branch 'main' into normalize-axis-index
keewis committed Jan 12, 2024
2 parents be6918e + 08c8f9a commit 2e4a4c3
Showing 36 changed files with 972 additions and 359 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/benchmarks-last-release.yml
@@ -72,7 +72,7 @@ jobs:
cp benchmarks/README_CI.md benchmarks.log .asv/results/
working-directory: ${{ env.ASV_DIR }}

-  - uses: actions/upload-artifact@v3
+  - uses: actions/upload-artifact@v4
if: always()
with:
name: asv-benchmark-results-${{ runner.os }}
2 changes: 1 addition & 1 deletion .github/workflows/benchmarks.yml
@@ -67,7 +67,7 @@ jobs:
cp benchmarks/README_CI.md benchmarks.log .asv/results/
working-directory: ${{ env.ASV_DIR }}

-  - uses: actions/upload-artifact@v3
+  - uses: actions/upload-artifact@v4
if: always()
with:
name: asv-benchmark-results-${{ runner.os }}
13 changes: 6 additions & 7 deletions .github/workflows/ci-additional.yaml
@@ -320,11 +320,6 @@ jobs:
run:
shell: bash -l {0}

-  strategy:
-    matrix:
-      environment-file: ["bare-minimum", "min-all-deps"]
-    fail-fast: false

steps:
- uses: actions/checkout@v4
with:
@@ -340,6 +335,10 @@
conda
python-dateutil
-  - name: minimum versions policy
+  - name: All-deps minimum versions policy
+    run: |
+      python ci/min_deps_check.py ci/requirements/min-all-deps.yml
+  - name: Bare minimum versions policy
     run: |
-      python ci/min_deps_check.py ci/requirements/${{ matrix.environment-file }}.yml
+      python ci/min_deps_check.py ci/requirements/bare-minimum.yml
8 changes: 5 additions & 3 deletions .github/workflows/ci.yaml
@@ -34,6 +34,8 @@ jobs:
runs-on: ${{ matrix.os }}
needs: detect-ci-trigger
if: needs.detect-ci-trigger.outputs.triggered == 'false'
+  env:
+    ZARR_V3_EXPERIMENTAL_API: 1
defaults:
run:
shell: bash -l {0}
@@ -127,9 +129,9 @@

- name: Upload test results
if: always()
-  uses: actions/upload-artifact@v3
+  uses: actions/upload-artifact@v4
with:
-  name: Test results for ${{ runner.os }}-${{ matrix.python-version }}
+  name: Test results for ${{ runner.os }}-${{ matrix.python-version }} ${{ matrix.env }}
path: pytest.xml

- name: Upload code coverage to Codecov
@@ -147,7 +149,7 @@
if: github.repository == 'pydata/xarray'
steps:
- name: Upload
-  uses: actions/upload-artifact@v3
+  uses: actions/upload-artifact@v4
with:
name: Event File
path: ${{ github.event_path }}
8 changes: 4 additions & 4 deletions .github/workflows/pypi-release.yaml
@@ -41,7 +41,7 @@ jobs:
else
echo "✅ Looks good"
fi
-  - uses: actions/upload-artifact@v3
+  - uses: actions/upload-artifact@v4
with:
name: releases
path: dist
@@ -54,7 +54,7 @@
name: Install Python
with:
python-version: "3.11"
-  - uses: actions/download-artifact@v3
+  - uses: actions/download-artifact@v4
with:
name: releases
path: dist
@@ -82,7 +82,7 @@
id-token: write

steps:
-  - uses: actions/download-artifact@v3
+  - uses: actions/download-artifact@v4
with:
name: releases
path: dist
@@ -106,7 +106,7 @@
id-token: write

steps:
-  - uses: actions/download-artifact@v3
+  - uses: actions/download-artifact@v4
with:
name: releases
path: dist
3 changes: 2 additions & 1 deletion .github/workflows/upstream-dev-ci.yaml
@@ -37,6 +37,8 @@ jobs:
name: upstream-dev
runs-on: ubuntu-latest
needs: detect-ci-trigger
+  env:
+    ZARR_V3_EXPERIMENTAL_API: 1
if: |
always()
&& (
@@ -82,7 +84,6 @@
if: success()
id: status
run: |
-  export ZARR_V3_EXPERIMENTAL_API=1
python -m pytest --timeout=60 -rf \
--report-log output-${{ matrix.python-version }}-log.jsonl
- name: Generate and publish the report
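Both workflow changes above move the flag from a per-step shell ``export`` to a job-level ``env:`` entry. As a hedged illustration (not part of the diff), the equivalent when running the test suite locally is to put the variable in the environment before zarr consults it; zarr-python 2.x uses this flag to gate its experimental v3 store API:

```python
import os

# Set the flag first, mirroring the job-level ``env:`` entry in the workflows above.
os.environ["ZARR_V3_EXPERIMENTAL_API"] = "1"

import zarr  # noqa: E402  - zarr 2.x only enables its experimental v3 store API with the flag set

print(zarr.__version__)
```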
6 changes: 3 additions & 3 deletions ci/requirements/bare-minimum.yml
@@ -11,6 +11,6 @@ dependencies:
- pytest-env
- pytest-xdist
- pytest-timeout
-  - numpy=1.22
-  - packaging=21.3
-  - pandas=1.4
+  - numpy=1.23
+  - packaging=22.0
+  - pandas=1.5
35 changes: 19 additions & 16 deletions ci/requirements/min-all-deps.yml
@@ -10,30 +10,35 @@ dependencies:
- python=3.9
- boto3=1.24
- bottleneck=1.3
-  - cartopy=0.20
+  - cartopy=0.21
- cftime=1.6
- coveralls
-  - dask-core=2022.7
-  - distributed=2022.7
-  - flox=0.5
+  - dask-core=2022.12
+  - distributed=2022.12
+  # Flox > 0.8 has a bug with numbagg versions
+  # It will require numbagg > 0.6
+  # so we should just skip that series eventually
+  # or keep flox pinned for longer than necessary
+  - flox=0.7
- h5netcdf=1.1
# h5py and hdf5 tend to cause conflicts
# for e.g. hdf5 1.12 conflicts with h5py=3.1
# prioritize bumping other packages instead
- h5py=3.7
- hdf5=1.12
- hypothesis
-  - iris=3.2
+  - iris=3.4
- lxml=4.9 # Optional dep of pydap
-  - matplotlib-base=3.5
+  - matplotlib-base=3.6
- nc-time-axis=1.4
# netcdf follows a 1.major.minor[.patch] convention
# (see https://github.com/Unidata/netcdf4-python/issues/1090)
- netcdf4=1.6.0
-  - numba=0.55
-  - numpy=1.22
-  - packaging=21.3
-  - pandas=1.4
+  - numba=0.56
+  - numbagg=0.2.1
+  - numpy=1.23
+  - packaging=22.0
+  - pandas=1.5
- pint=0.22
- pip
- pydap=3.3
@@ -43,11 +48,9 @@
- pytest-xdist
- pytest-timeout
- rasterio=1.3
-  - scipy=1.8
-  - seaborn=0.11
+  - scipy=1.10
+  - seaborn=0.12
- sparse=0.13
- toolz=0.12
-  - typing_extensions=4.3
-  - zarr=2.12
-  - pip:
-      - numbagg==0.2.1
+  - typing_extensions=4.4
+  - zarr=2.13
3 changes: 1 addition & 2 deletions doc/ecosystem.rst
@@ -98,7 +98,6 @@ Visualization
Non-Python projects
~~~~~~~~~~~~~~~~~~~
- `xframe <https://github.com/xtensor-stack/xframe>`_: C++ data structures inspired by xarray.
-  - `AxisArrays <https://github.com/JuliaArrays/AxisArrays.jl>`_ and
-    `NamedArrays <https://github.com/davidavdav/NamedArrays.jl>`_: similar data structures for Julia.
+  - `AxisArrays <https://github.com/JuliaArrays/AxisArrays.jl>`_, `NamedArrays <https://github.com/davidavdav/NamedArrays.jl>`_ and `YAXArrays.jl <https://github.com/JuliaDataCubes/YAXArrays.jl>`_: similar data structures for Julia.

More projects can be found at the `"xarray" Github topic <https://github.com/topics/xarray>`_.
22 changes: 6 additions & 16 deletions doc/user-guide/groupby.rst
@@ -177,28 +177,18 @@ This last line is roughly equivalent to the following::
results.append(group - alt.sel(letters=label))
xr.concat(results, dim='x')

-  Squeezing
-  ~~~~~~~~~
+  Iterating and Squeezing
+  ~~~~~~~~~~~~~~~~~~~~~~~

-  When grouping over a dimension, you can control whether the dimension is
-  squeezed out or if it should remain with length one on each group by using
-  the ``squeeze`` parameter:

-  .. ipython:: python
-      next(iter(arr.groupby("x")))
+  Previously, Xarray defaulted to squeezing out dimensions of size one when iterating over
+  a GroupBy object. This behaviour is being removed.
+  You can always squeeze explicitly later with the Dataset or DataArray
+  :py:meth:`~xarray.DataArray.squeeze` methods.

-  .. ipython:: python
-      next(iter(arr.groupby("x", squeeze=False)))
-  Although xarray will attempt to automatically
-  :py:attr:`~xarray.DataArray.transpose` dimensions back into their original order
-  when you use apply, it is sometimes useful to set ``squeeze=False`` to
-  guarantee that all original dimensions remain unchanged.

-  You can always squeeze explicitly later with the Dataset or DataArray
-  :py:meth:`~xarray.DataArray.squeeze` methods.
.. _groupby.multidim:

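To make the documentation change above concrete, here is a minimal sketch (not part of the diff) of the behaviour the new text describes. The ``arr`` below is a hypothetical stand-in for the DataArray defined earlier in that doc page; the pattern is to iterate with ``squeeze=False`` and drop the length-one dimension explicitly afterwards:

```python
import numpy as np
import xarray as xr

# Hypothetical stand-in for the ``arr`` used in doc/user-guide/groupby.rst.
arr = xr.DataArray(
    np.arange(6).reshape(2, 3),
    dims=("x", "y"),
    coords={"x": ["a", "b"]},
)

# With squeeze=False each group keeps its length-one "x" dimension ...
label, group = next(iter(arr.groupby("x", squeeze=False)))
print(group.dims)  # ('x', 'y')

# ... and you can always drop it explicitly afterwards.
print(group.squeeze("x").dims)  # ('y',)
```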
31 changes: 28 additions & 3 deletions doc/whats-new.rst
@@ -34,16 +34,39 @@ New Features
Breaking changes
~~~~~~~~~~~~~~~~

+  - The minimum versions of some dependencies were changed (:pull:`8586`):
+
+    ===================== ========= ========
+    Package               Old       New
+    ===================== ========= ========
+    cartopy               0.20      0.21
+    dask-core             2022.7    2022.12
+    distributed           2022.7    2022.12
+    flox                  0.5       0.7
+    iris                  3.2       3.4
+    matplotlib-base       3.5       3.6
+    numpy                 1.22      1.23
+    numba                 0.55      0.56
+    packaging             21.3      22.0
+    seaborn               0.11      0.12
+    scipy                 1.8       1.10
+    typing_extensions     4.3       4.4
+    zarr                  2.12      2.13
+    ===================== ========= ========


Deprecations
~~~~~~~~~~~~

+  - The `squeeze` kwarg to GroupBy is now deprecated. (:issue:`2157`, :pull:`8507`)
+    By `Deepak Cherian <https://github.com/dcherian>`_.

Bug fixes
~~~~~~~~~

- Reverse index output of bottleneck's rolling move_argmax/move_argmin functions (:issue:`8541`, :pull:`8552`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
- Vendor `SerializableLock` from dask and use as default lock for netcdf4 backends (:issue:`8442`, :pull:`8571`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.


Documentation
@@ -52,7 +75,10 @@

Internal Changes
~~~~~~~~~~~~~~~~

+  - The implementation of :py:func:`map_blocks` has changed to minimize graph size and duplication of data.
+    This should be a strict improvement even though the graphs are not always embarrassingly parallel any more.
+    Please open an issue if you spot a regression. (:pull:`8412`, :issue:`8409`).
+    By `Deepak Cherian <https://github.com/dcherian>`_.
- Remove null values before plotting. (:pull:`8535`).
By `Jimmy Westling <https://github.com/illviljan>`_.

@@ -116,7 +142,6 @@

Deprecations
~~~~~~~~~~~~

- As part of an effort to standardize the API, we're renaming the ``dims``
keyword arg to ``dim`` for the minority of functions which currently use
``dims``. This started with :py:func:`xarray.dot` & :py:meth:`DataArray.dot`
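The :py:func:`map_blocks` entry in the changelog above concerns internal task-graph construction rather than the public interface. As a reminder of the API it refers to, this is a minimal, self-contained usage sketch (it does not demonstrate the graph-size change itself, and it assumes dask is installed):

```python
import numpy as np
import xarray as xr

# A tiny chunked dataset; map_blocks applies ``double`` to each chunk lazily.
ds = xr.Dataset({"a": ("x", np.arange(10))}).chunk({"x": 5})


def double(block):
    return block * 2


result = xr.map_blocks(double, ds)
print(result.compute())  # materializes the dask graph built by map_blocks
```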
7 changes: 4 additions & 3 deletions pyproject.toml
@@ -22,9 +22,9 @@ readme = "README.md"
requires-python = ">=3.9"

dependencies = [
"numpy>=1.22",
"packaging>=21.3",
"pandas>=1.4",
"numpy>=1.23",
"packaging>=22",
"pandas>=1.5",
]

[project.urls]
@@ -91,6 +91,7 @@ module = [
"cf_units.*",
"cfgrib.*",
"cftime.*",
"cloudpickle.*",
"cubed.*",
"cupy.*",
"dask.types.*",
(Diffs for the remaining changed files were not loaded in this view.)
