Merge branch 'main' into engine-flow
JessicaS11 authored Jun 25, 2024
2 parents da9e806 + e4061f2 commit 5870819
Showing 23 changed files with 423 additions and 370 deletions.
2 changes: 1 addition & 1 deletion .devcontainer/Dockerfile
@@ -1 +1 @@
-FROM pangeo/base-image:2024.06.02
+FROM pangeo/base-image:2024.06.24
1 change: 1 addition & 0 deletions .github/dependabot.yml
@@ -1,3 +1,4 @@
+# Regularly update Docker tags and Actions steps
 version: 2
 updates:
   - package-ecosystem: "docker"
46 changes: 29 additions & 17 deletions .github/workflows/main.yaml
@@ -1,10 +1,10 @@
-name: CI
+name: Deploy Website to GitHub Pages
 
 on:
   push:
     branches: main
-  pull_request:
-    branches: main
+    paths-ignore:
+      - ".devcontainer/**"
 
 # Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages
 permissions:
@@ -18,19 +18,19 @@ concurrency:
   cancel-in-progress: true
 
 jobs:
-  build-and-deploy:
+  build:
     runs-on: ubuntu-latest
 
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4
 
       - name: Setup JupyterBook Cache
-        uses: actions/cache@v3
+        uses: actions/cache@v4
         with:
           path: _build
           # NOTE: change key to "jupyterbook-DATE" to force rebuilding cache
-          key: jupyterbook-20230707
+          key: jupyterbook-20240517
 
       - name: Install Conda environment with Micromamba
         uses: mamba-org/setup-micromamba@v1
@@ -50,19 +50,31 @@ jobs:
         run: |
           if (test -a _build/html/reports/*log); then cat _build/html/reports/*log ; fi
-      - name: Save Build
+      - name: Save Build Folder
         if: always()
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         with:
           name: build
           path: _build/
 
-      - name: Publish to GitHub Pages
-        if: github.ref == 'refs/heads/main'
-        uses: peaceiris/actions-gh-pages@v3
+      - name: Upload Pages Artifact
+        uses: actions/upload-pages-artifact@v3
         with:
-          github_token: ${{ secrets.GITHUB_TOKEN }}
-          publish_dir: _build/html
-          publish_branch: gh-pages
-          cname: tutorial.xarray.dev
-          enable_jekyll: false
+          path: _build/html
+
+  # Publish Website to GitHub Pages if built successfully
+  deploy:
+    needs: build
+    if: github.ref == 'refs/heads/main'
+    runs-on: ubuntu-latest
+    environment:
+      name: github-pages
+      url: ${{ steps.deployment.outputs.page_url }}
+
+    steps:
+      - name: Setup Pages
+        uses: actions/configure-pages@v5
+
+      - name: Deploy to GitHub Pages
+        id: deployment
+        uses: actions/deploy-pages@v4
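Design note: this commit swaps the third-party peaceiris/actions-gh-pages action (which pushed the built HTML to a gh-pages branch) for GitHub's official Pages actions, split into two jobs: `build` uploads the site with upload-pages-artifact, and `deploy`, which runs only on main and targets the github-pages environment, publishes it with deploy-pages. The old `cname`/`publish_branch` inputs have no equivalent here; presumably the tutorial.xarray.dev custom domain now lives in the repository's Pages settings, since the official actions take no such inputs.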
64 changes: 0 additions & 64 deletions .github/workflows/preview.yaml

This file was deleted.

55 changes: 55 additions & 0 deletions .github/workflows/pull_request.yaml
@@ -0,0 +1,55 @@
+name: Pull Request Build
+
+on:
+  pull_request:
+    types: [opened, synchronize, reopened, closed]
+    paths-ignore:
+      - ".devcontainer/**"
+
+concurrency:
+  group: ${{ github.workflow }}-${{ github.ref }}
+  cancel-in-progress: true
+
+jobs:
+  preview:
+    runs-on: ubuntu-latest
+    defaults:
+      run:
+        shell: bash -el {0}
+    steps:
+      - name: Checkout repository
+        if: github.event.action != 'closed'
+        uses: actions/checkout@v4
+
+      - name: Setup JupyterBook Cache
+        if: github.event.action != 'closed'
+        uses: actions/cache@v4
+        with:
+          path: _build
+          # NOTE: change key to "jupyterbook-DATE" to force rebuilding cache
+          key: jupyterbook-20240517
+
+      - name: Install Conda environment with Micromamba
+        if: github.event.action != 'closed'
+        uses: mamba-org/setup-micromamba@v1
+        with:
+          environment-file: conda/conda-lock.yml
+          environment-name: xarray-tutorial
+          cache-environment: true
+
+      - name: Build JupyterBook
+        if: github.event.action != 'closed'
+        run: |
+          jupyter-book build ./ --warningiserror --keep-going
+
+      - name: Dump Build Logs
+        if: github.event.action != 'closed'
+        run: |
+          if (test -a _build/html/reports/*log); then cat _build/html/reports/*log ; fi
+
+      - name: Upload artifact
+        if: github.event.action != 'closed'
+        uses: actions/upload-artifact@v4
+        with:
+          name: html
+          path: _build/html
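Note the repeated `if: github.event.action != 'closed'` guards: the workflow intentionally also triggers when a pull request is closed, so a run (with every step skipped) still completes successfully and kicks off the downstream preview workflow, which can then tear down the Surge deployment for that PR. This reading is inferred from the `teardown: true` setting and the 'closed' comments in surge_preview.yml below.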
4 changes: 3 additions & 1 deletion .github/workflows/qaqc.yaml
@@ -4,6 +4,8 @@ on:
   pull_request:
     branches:
       - main
+    paths-ignore:
+      - ".devcontainer/**"
 
 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
@@ -17,7 +19,7 @@ jobs:
       shell: bash -el {0}
 
     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
 
       - name: Install Conda environment with Micromamba
         uses: mamba-org/setup-micromamba@v1
42 changes: 42 additions & 0 deletions .github/workflows/surge_preview.yml
@@ -0,0 +1,42 @@
+name: Pull Request Preview
+
+on:
+  workflow_run:
+    workflows: ["Pull Request Build"]
+    types:
+      - completed
+
+permissions:
+  pull-requests: write # allow surge-preview to create/update PR comments
+
+concurrency:
+  group: ${{ github.workflow }}-${{ github.event.workflow_run.id }}
+  cancel-in-progress: true
+
+jobs:
+  # NOTE: match job name in pull_request.yaml
+  preview:
+    runs-on: ubuntu-latest
+    if: ${{ github.event.workflow_run.event == 'pull_request' && github.event.workflow_run.conclusion == 'success' }}
+
+    steps:
+      # Ensure folder exists for PR 'closed' case
+      - run: mkdir html
+
+      # Download built HTML from PR Build workflow
+      - uses: actions/download-artifact@v4
+        continue-on-error: true
+        with:
+          github-token: ${{ github.token }}
+          run-id: ${{ github.event.workflow_run.id }}
+
+      - name: Manage Surge.sh Deployment
+        id: preview_step
+        uses: afc163/surge-preview@v1
+        with:
+          surge_token: ${{ secrets.SURGE_TOKEN }}
+          github_token: ${{ secrets.GITHUB_TOKEN }}
+          build: echo 'Uploading html/ folder contents to Surge.sh...'
+          dist: html # NOTE: match upload_artifact name in pull_request.yaml
+          failOnError: true
+          teardown: true
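Design note: splitting the untrusted PR build (pull_request.yaml, which needs no secrets) from this deployment workflow (workflow_run, which executes in the base repository with access to SURGE_TOKEN) is the standard pattern for safely previewing pull requests from forks: code from the PR never runs in a context that can read secrets. The download-artifact step passes explicit `github-token` and `run-id` inputs because, as of v4, downloading an artifact from a different workflow run requires them.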
2 changes: 1 addition & 1 deletion .prettierignore
@@ -1 +1 @@
-conda/**
+conda/
13 changes: 6 additions & 7 deletions advanced/apply_ufunc/automatic-vectorizing-numpy.ipynb
@@ -63,18 +63,17 @@
 "    out[index, :] = np.interp(..., array[index, :], ...)\n",
 "```\n",
 "\n",
-"\n",
-"```{exercise}\n",
-":label: coreloopdims\n",
-"\n",
+"::::{admonition} Exercise\n",
+":class: tip\n",
 "Consider the example problem of interpolating a 2D array with dimensions `space` and `time` along the `time` dimension.\n",
 "Which dimension is the core dimension, and which is the \"loop dimension\"?\n",
-"```\n",
-"```{solution} coreloopdims\n",
+"\n",
+":::{admonition} Solution\n",
+":class: dropdown\n",
 "\n",
 "`time` is the core dimension, and `space` is the loop dimension.\n",
-"```\n",
+":::\n",
+"::::\n",
 "\n",
 "## Vectorization\n",
 "\n",
23 changes: 12 additions & 11 deletions advanced/apply_ufunc/complex-output-numpy.ipynb
@@ -138,19 +138,20 @@
 "tags": []
 },
 "source": [
-"```{exercise}\n",
-":label: newdim\n",
+"::::{admonition} Exercise\n",
+":class: tip\n",
 "\n",
 "Apply the following function using `apply_ufunc`. It adds a new dimension to the input array, let's call it `newdim`. Specify the new dimension using `output_core_dims`. Do you need any `input_core_dims`?\n",
 "\n",
 "```python\n",
 "def add_new_dim(array):\n",
 "    return np.expand_dims(array, axis=-1)\n",
 "```\n",
-"````{solution} newdim\n",
+"\n",
+":::{admonition} Solution\n",
 ":class: dropdown\n",
 "\n",
-"``` python\n",
+"```python\n",
 "def add_new_dim(array):\n",
 "    return np.expand_dims(array, axis=-1)\n",
@@ -161,7 +162,8 @@
 "    output_core_dims=[[\"newdim\"]],\n",
 ")\n",
 "```\n",
-"````"
+":::\n",
+"::::"
 ]
 },
 {
@@ -327,8 +329,8 @@
 "tags": []
 },
 "source": [
-"````{exercise}\n",
-":label: generalize\n",
+"::::{admonition} Exercise\n",
+":class: tip\n",
 "\n",
 "We presented the concept of \"core dimensions\" as the \"smallest unit of data the function could handle.\" Do you understand how the above use of `apply_ufunc` generalizes to an array with more than one dimension? \n",
 "\n",
@@ -337,9 +339,8 @@
 "air3d = xr.tutorial.load_dataset(\"air_temperature\").air)\n",
 "``` \n",
 "Your goal is to have a minimum and maximum value of temperature across all latitudes for a given time and longitude.\n",
-"````\n",
 "\n",
-"````{solution} generalize\n",
+":::{admonition} Solution\n",
 ":class: dropdown\n",
 "\n",
 "We want to use `minmax` to compute the minimum and maximum along the \"lat\" dimension always, regardless of how many dimensions are on the input. So we specify `input_core_dims=[[\"lat\"]]`. The output does not contain the \"lat\" dimension, but we expect two returned variables. So we pass an empty list `[]` for each returned array, so `output_core_dims=[[], []]` just as before.\n",
@@ -352,8 +353,8 @@
 "    input_core_dims=[[\"lat\"]],\n",
 "    output_core_dims=[[],[]],\n",
 ")\n",
 "```\n",
-"````"
+":::\n",
+"::::"
 ]
 }
 ],
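For readers skimming the diff, here is a runnable version of the solution above (a sketch with made-up data; in the notebook, `minmax` is defined earlier and `air3d` comes from the tutorial dataset):

```python
import numpy as np
import xarray as xr

def minmax(array):
    # Reduce over the trailing axis, which apply_ufunc makes the core dimension
    return array.min(axis=-1), array.max(axis=-1)

# Stand-in for xr.tutorial.load_dataset("air_temperature").air
air3d = xr.DataArray(np.random.rand(4, 3, 5), dims=("time", "lon", "lat"))

minda, maxda = xr.apply_ufunc(
    minmax,
    air3d,
    input_core_dims=[["lat"]],  # always reduce over "lat"
    output_core_dims=[[], []],  # two outputs, neither with a core dimension
)
assert minda.dims == ("time", "lon")
```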
10 changes: 5 additions & 5 deletions advanced/apply_ufunc/core-dimensions.ipynb
@@ -335,13 +335,12 @@
 "tags": []
 },
 "source": [
-"```{exercise}\n",
-":label: trapezoid\n",
+"::::{admonition} Exercise\n",
+":class: tip\n",
 "\n",
 "Use `apply_ufunc` to apply `scipy.integrate.trapezoid` along the `time` axis.\n",
-"```\n",
 "\n",
-"````{solution} trapezoid\n",
+":::{admonition} Solution\n",
 ":class: dropdown\n",
 "\n",
 "```python\n",
@@ -350,7 +349,8 @@
 "\n",
 "xr.apply_ufunc(scipy.integrate.trapezoid, ds, input_core_dims=[[\"time\"]], kwargs={\"axis\": -1})\n",
 "```\n",
-"````"
+":::\n",
+"::::"
 ]
 }
 ],
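As above, a self-contained sketch of this solution (with a synthetic `ds`; the notebook builds its dataset earlier):

```python
import numpy as np
import scipy.integrate
import xarray as xr

# Synthetic stand-in for the notebook's dataset
ds = xr.Dataset({"air": (("lat", "time"), np.random.rand(3, 10))})

# apply_ufunc moves the core dimension "time" to the end, hence axis=-1
integrated = xr.apply_ufunc(
    scipy.integrate.trapezoid,
    ds,
    input_core_dims=[["time"]],
    kwargs={"axis": -1},
)
print(integrated)  # the dataset reduced over "time"
```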
