gh actions file
e-marshall committed Feb 17, 2024
1 parent 2c3d045 commit 5d9b8f3
Showing 2 changed files with 187 additions and 0 deletions.
62 changes: 62 additions & 0 deletions .github/workflows/deploy.yml
@@ -0,0 +1,62 @@
name: deploy-book

# Run this when the master or main branch changes
on:
  push:
    branches:
      - master
      - main
    # If your git repository has the Jupyter Book within some-subfolder next to
    # unrelated files, you can make this run only if a file within that specific
    # folder has been modified.
    #
    # paths:
    #   - some-subfolder/**

# This job installs dependencies, builds the book, and pushes it to `gh-pages`
jobs:
  deploy-book:
    runs-on: ubuntu-latest
    permissions:
      pages: write
      id-token: write
    steps:
      - uses: actions/checkout@v3

      # Install dependencies
      - name: Set up Python 3.11
        uses: actions/setup-python@v4
        with:
          python-version: 3.11

      - name: Install dependencies
        run: |
          pip install -r requirements.txt

      # (optional) Cache your executed notebooks between runs
      # if you have config:
      # execute:
      #   execute_notebooks: cache
      - name: cache executed notebooks
        uses: actions/cache@v3
        with:
          path: _build/.jupyter_cache
          key: jupyter-book-cache-${{ hashFiles('requirements.txt') }}

      # Build the book
      - name: Build the book
        run: |
          jupyter-book build .

      # Upload the book's HTML as an artifact
      - name: Upload artifact
        uses: actions/upload-pages-artifact@v2
        with:
          path: "_build/html"

      # Deploy the book's HTML to GitHub Pages
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v2
125 changes: 125 additions & 0 deletions software.ipynb
@@ -0,0 +1,125 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "71327c0e",
"metadata": {},
"source": [
"# Software and Data\n",
"\n",
"On this page you'll find information about the computing environment and datasets that we'll be using in this tutorial. \n",
"\n",
"\n",
"## Computing environment\n",
"\n",
"Below, you'lll see a list of the python libraries we'll be using in this example. This is the full list of libraries across all notebooks."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "446021e3",
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [],
"source": [
"import os\n",
"import json\n",
"import urllib.request\n",
"import numpy as np\n",
"import xarray as xr\n",
"import rioxarray as rxr\n",
"import geopandas as gpd\n",
"import pandas as pd\n",
"\n",
"import matplotlib.pyplot as plt\n",
"import matplotlib.ticker as mticker\n",
"\n",
"from shapely.geometry import Polygon\n",
"from shapely.geometry import Point\n",
"import cartopy.crs as ccrs\n",
"from cartopy.mpl.gridliner import LONGITUDE_FORMATTER, LATITUDE_FORMATTER\n",
"import cartopy\n",
"import cartopy.feature as cfeature\n",
"\n",
"from geocube.api.core import make_geocube\n",
"import flox\n",
"import s3fs\n"
]
},
{
"cell_type": "markdown",
"id": "d43c7d3b",
"metadata": {},
"source": [
"This tutorial also uses several functions that are stored in the script [`itslivetools.py`](https://github.com/e-marshall/itslive/blob/master/itslivetools.py). It is located in the github repo for this tutorial. If you clone the repo, it should be available to import to the tutorial notebooks. Otherwise, if you would like to use `itslivetools.py`, download the script and move it to your working directory. "
]
},
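{
"cell_type": "markdown",
"id": "itslivetools-import-note",
"metadata": {},
"source": [
"The cell below is a minimal, un-executed sketch of importing `itslivetools.py` once the script sits next to this notebook (after cloning the repo or downloading the file). It doesn't assume any particular function names; it just lists the helpers the module exposes.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "itslivetools-import-demo",
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [],
"source": [
"# Sketch only: import the helper module once itslivetools.py is in the\n",
"# working directory (e.g. after cloning the tutorial repo).\n",
"import itslivetools\n",
"\n",
"# List the helper functions the module makes available.\n",
"print([name for name in dir(itslivetools) if not name.startswith('_')])\n"
]
},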
{
"cell_type": "markdown",
"id": "35049ecb",
"metadata": {},
"source": [
"## Running tutorial on the cloud\n",
"\n",
"This link will launch a preconfigured jupyterlab environment on mybinder.org:\n",
"\n",
"[https://mybinder.org/v2/gh/e-marshall/itslive/HEAD?labpath=accessing_s3_data.ipynb](https://mybinder.org/v2/gh/e-marshall/itslive/HEAD?labpath=accessing_s3_data.ipynb)\n",
"\n",
"## Running tutorial material locally\n",
"\n",
"To run the notebooks contained in this tutorial on your local machine\n",
"\n",
"create the `itslivetools_env` conda environment (`conda env create -f environment-unpinned.yml`) based on the `environment.yml` file [here](https://github.com/e-marshall/mynewbook/blob/master/environment.yml). This should work on any platform (linux, osx, windows) and will install the latest versions of all dependencies.\n",
"\n",
"Alternatively, the code repository for this tutorial (https://github.com/e-marshall/itslive) also contains \"lock\" files for Linux (conda-linux-64.lock.yml) and MacOS (conda-osx-64.lock.yml) that pin exact versions of all required python packages for a [reproducible computing environment](https://mybinder.readthedocs.io/en/latest/tutorials/reproducibility.html)."
]
},
{
"cell_type": "markdown",
"id": "29b773e4",
"metadata": {},
"source": [
"## Data\n",
"\n",
"The velocity data that we'll be using is from the [ITS_LIVE dataset](https://its-live.jpl.nasa.gov/#access). This dataset contains global coverage of land ice velocity data at various temporal frequencies and in various formats. Follow the link to explore the data that's available for a particular region you may be interested in. **ITS_LIVE** has multiple options for data access; this example will focus on using zarr datacubes that are stored in s3 buckets on AWS.\n",
"\n",
"**ITS_LIVE** velocity data is accessed in a raster format and the data covers a large swath of terrain covering land that is glaciated and non-glaciated. We want to select just the pixels that cover glaciated surfaces; to do this, we use glacier outlines from the [Randolph Glacier Inventory](https://www.glims.org/RGI/). The RGI region used in this tutorial is made available as a [GeoParquet](https://geoparquet.org/) file in the tutorial [repository](https://github.com/e-marshall/itslive/blob/master/rgi7_region15_south_asia_east.parquet). \n",
"\n",
"Head to the next page to see how we start accessing and working with this data \n"
]
},
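{
"cell_type": "markdown",
"id": "data-access-sketch-note",
"metadata": {},
"source": [
"The cell below is a minimal, un-executed sketch of opening the two datasets with the libraries listed above. The S3 URL is a placeholder, not a real ITS_LIVE datacube path; finding and opening the actual datacubes is covered on the next page.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "data-access-sketch",
"metadata": {
"vscode": {
"languageId": "python"
}
},
"outputs": [],
"source": [
"# Sketch only: open the glacier outlines and an ITS_LIVE Zarr datacube.\n",
"import geopandas as gpd\n",
"import s3fs\n",
"import xarray as xr\n",
"\n",
"# RGI outlines shipped with the tutorial repo as a GeoParquet file.\n",
"rgi = gpd.read_parquet('rgi7_region15_south_asia_east.parquet')\n",
"\n",
"# ITS_LIVE velocity data lives in Zarr datacubes in public S3 buckets;\n",
"# the URL below is a placeholder for illustration only.\n",
"datacube_url = 's3://example-bucket/path/to/itslive_datacube.zarr'\n",
"fs = s3fs.S3FileSystem(anon=True)\n",
"dc = xr.open_zarr(fs.get_mapper(datacube_url))\n"
]
},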
{
"cell_type": "code",
"execution_count": null,
"id": "38108842-c077-4174-84c5-7cc44cc3c6f8",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
