
Development environment updates, io, and docs (#4)
* add function to api docs

* update env, add sliderule io

* optional imports, more docs

* matplotlib optional

* github actions updates

* github actions updates

* fix links

* temp disable pixi cache

* actually disable cache, add optional pkgs to docs

* better optional imports

* better optional imports

* try locked=false

* back to lockfile

* better wesm remote search

* folium for docs

* OGR aws sessions management

* set GDAL ENV via pyogrio

* disable pam
scottyhq authored Oct 31, 2024
1 parent dad5b63 commit 3960508
Showing 29 changed files with 3,853 additions and 4,683 deletions.
12 changes: 0 additions & 12 deletions .copier-answers.yml

This file was deleted.

3 changes: 0 additions & 3 deletions .git_archival.txt

This file was deleted.

3 changes: 0 additions & 3 deletions .gitattributes

This file was deleted.

12 changes: 5 additions & 7 deletions .github/workflows/ci.yml
Original file line number Diff line number Diff line change
Expand Up @@ -30,9 +30,6 @@ jobs:
- uses: pre-commit/[email protected]
with:
extra_args: --hook-stage manual --all-files
- name: Run PyLint
continue-on-error: true
run: pipx run nox -s pylint -- --output-format=github

checks:
name: Check Python ${{ matrix.python-version }} on ${{ matrix.runs-on }}
Expand Down Expand Up @@ -68,7 +65,8 @@ jobs:
python -m pytest -ra --cov --cov-report=xml --cov-report=term
--durations=20
- name: Upload coverage report
uses: codecov/[email protected]
with:
token: ${{ secrets.CODECOV_TOKEN }}
# NOTE: need an account for this...
# - name: Upload coverage report
# uses: codecov/[email protected]
# with:
# token: ${{ secrets.CODECOV_TOKEN }}
14 changes: 11 additions & 3 deletions .github/workflows/pixi.yml
Original file line number Diff line number Diff line change
Expand Up @@ -15,16 +15,24 @@ jobs:

- uses: prefix-dev/[email protected]
with:
pixi-version: v0.34.0
environments: dev
cache: true
cache-write:
${{ github.event_name == 'push' && github.ref_name == 'main' }}
manifest-path: pyproject.toml
cache: false
locked: false
#cache-write:
# ${{ github.event_name == 'push' && github.ref_name == 'main' }}

# NOTE: https://github.com/prefix-dev/setup-pixi/issues/136
- name: Ensure Dynamic version
run: |
pip install -e .
- name: Run Pylint
continue-on-error: true
run: |
pixi run lint
- name: Run Pytest
env:
MAXAR_API_KEY: ${{ secrets.MAXAR_API_KEY}}
Expand Down
48 changes: 48 additions & 0 deletions .github/workflows/upstream.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,48 @@
# Test against the latest versions of Python and other libraries
name: Upstream versions

on:
workflow_dispatch:

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true

env:
# Many color libraries just need this to be set to any value, but at least
# one distinguishes color depth, where "3" -> "256-bit color".
FORCE_COLOR: 3

jobs:
checks:
name: Check Python ${{ matrix.python-version }} on ${{ matrix.runs-on }}
runs-on: ${{ matrix.runs-on }}
strategy:
fail-fast: false
matrix:
python-version: ["3.13", "3.14"]
# windows-latest,
runs-on: [ubuntu-latest, macos-14]
#include:
# - python-version: "pypy-3.10"
# runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0

- uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
allow-prereleases: true

- name: Install package
run: python -m pip install .[dev]

- name: Test package
env:
MAXAR_API_KEY: ${{ secrets.MAXAR_API_KEY}}
run: >-
python -m pytest -ra --cov --cov-report=xml --cov-report=term
--durations=20
10 changes: 10 additions & 0 deletions .pre-commit-config.yaml
Original file line number Diff line number Diff line change
Expand Up @@ -87,3 +87,13 @@ repos:
- id: check-dependabot
- id: check-github-workflows
- id: check-readthedocs

# https://pylint.pycqa.org/en/latest/user_guide/installation/pre-commit-integration.html#pre-commit-integration
# - repo: local
# hooks:
# - id: pylint
# name: pylint
# entry: pylint
# language: system
# types: [python]
# require_serial: true
26 changes: 19 additions & 7 deletions README.md
Original file line number Diff line number Diff line change
Expand Up @@ -27,8 +27,10 @@ Such datasets are intended to be used by the NASA STV community for
calibration/validation, fusion algorithm development, and discipline-specific
scientific analysis.

See here for more information:
<https://science.nasa.gov/earth-science/decadal-surveys/decadal-stv/coincident-datasets>

**This tool is under active development; there are no stable releases yet!**
https://science.nasa.gov/earth-science/decadal-surveys/decadal-stv/coincident-datasets/

## Development

Expand All @@ -41,8 +43,9 @@ git checkout -b newfeature
pixi shell --environment dev # type `exit` to deactivate
pre-commit install

# Or run pre-configured environments and commands
pixi run test
# Or run pre-configured commands:
pixi run networktest # or 'test'
pixi run precommit # also runs automatically upon commits
pixi run lint
pixi run docs
```
Expand All @@ -54,12 +57,21 @@ authentication to _download_ data (NASA). `coincident` assumes you have the
following Environment Variables defined:

```bash
export EARTHDATA_USERNAME=xxxxx
export EARTHDATA_PASSWORD=yyyyy
export MAXAR_API_KEY=zzzzz
export EARTHDATA_USERNAME=aaaaa
export EARTHDATA_PASSWORD=bbbbb
export MAXAR_API_KEY=ccccc
export PC_SDK_SUBSCRIPTION_KEY=ddddd
```

Sign up for credentials at the following webpages:

- <https://urs.earthdata.nasa.gov>
- <https://developers.maxar.com/docs/authentication/guides/api-key>
- <https://planetarycomputer.developer.azure-api.net>
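
As a quick sanity check before searching, you can verify that these variables are set. This is a minimal sketch using only the standard library; the variable names come from the list above, but `coincident` itself does not ship this helper:

```python
import os

# Credentials listed in the README; coincident expects these to be set.
REQUIRED_VARS = [
    "EARTHDATA_USERNAME",
    "EARTHDATA_PASSWORD",
    "MAXAR_API_KEY",
    "PC_SDK_SUBSCRIPTION_KEY",
]


def missing_credentials(env=os.environ):
    """Return the names of any required credentials not present (or empty) in env."""
    return [name for name in REQUIRED_VARS if not env.get(name)]


if __name__ == "__main__":
    missing = missing_credentials()
    if missing:
        print("Missing credentials:", ", ".join(missing))
```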

### Acknowledgements

- Python packaging template provided by
https://github.com/scientific-python/cookie
<https://github.com/scientific-python/cookie>

- Funding for this effort was provided by NASA Grant 80NSSC22K1094
2 changes: 1 addition & 1 deletion docs/api.rst
Original file line number Diff line number Diff line change
Expand Up @@ -19,7 +19,6 @@ Search
wesm.get_swath_polygons



Overlaps
--------

Expand All @@ -29,6 +28,7 @@ Overlaps
:toctree: generated/

geographic_area
subset_by_minimum_area
subset_by_temporal_overlap


Expand Down
21 changes: 21 additions & 0 deletions docs/datasets.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,21 @@
# Supported datasets

Below we provide a short table summarizing datasets that are searchable with
`coincident`. Note that many of these datasets (or subsets) are available from
different providers; the _Source_ column identifies the provider of the data
used by this library.

| Dataset | Alias | Type | Start | End | Extent | Source |
| -------------- | ---------- | --------- | ---------- | ---------- | ------------- | --------------------------------------------------------------------------- |
| TanDEM-X | tdx | SAR | 2007-07-01 | | global | [NASA CSDAP](https://csdap.earthdata.nasa.gov/stac/collections/airbus) |
| Maxar Stereo | maxar | VHR | 2007-07-01 | | global | [Maxar](https://developers.maxar.com/docs/discovery/) |
| Copernicus DEM | cop30 | SAR | 2021-04-22 | | global | [Microsoft](https://planetarycomputer.microsoft.com/dataset/cop-dem-glo-30) |
| ICESat-2 ATL06 | atl06 | Altimeter | 2018-10-13 | | global | [NASA](https://nsidc.org/data/atl06) |
| GEDI L2A | gedi | Altimeter | 2019-04-04 | 2023-03-17 | mid-latitudes | [NASA](https://lpdaac.usgs.gov/products/gedi02_av002/) |
| 3DEP LiDAR | 3dep | LiDAR | 2000-12-01 | | CONUS | [USGS](https://www.usgs.gov/3d-elevation-program) |
| ESA WorldCover | worldcover | LULC | 2020-01-01 | 2021-12-31 | global | [Microsoft](https://planetarycomputer.microsoft.com/dataset/esa-worldcover) |

## Other data sources

If you are interested in working with additional data, feel free to open an
[issue](https://github.com/uw-cryo/coincident/issues).
9 changes: 9 additions & 0 deletions docs/index.md
Original file line number Diff line number Diff line change
Expand Up @@ -10,6 +10,15 @@
installation
introduction
datasets
```

```{toctree}
:maxdepth: 2
:hidden:
:caption: Examples
quickstart
```

```{toctree}
Expand Down
4 changes: 2 additions & 2 deletions docs/installation.md
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@ coincident directly from GitHub:
pip install git+https://github.com/uw-cryo/coincident.git@main
```

Alternatively, you can install a fresh locked environment with
[pixi.sh](https://pixi.sh/latest/):
Alternatively, you can install a fresh locked environment using the
[GitHub CLI](https://cli.github.com) and [pixi.sh](https://pixi.sh/latest/):

```bash
gh repo clone uw-cryo/coincident
Expand Down
27 changes: 21 additions & 6 deletions docs/introduction.md
Original file line number Diff line number Diff line change
@@ -1,27 +1,42 @@
# Introduction

`coincident` simplifies access to a curated set of datasets of relevance to NASA
STV studies
STV studies. It is designed to simplify working with disparate metadata for
aerial and satellite remote sensing datasets. `coincident` relies heavily on
[GeoPandas](https://geopandas.org/en/stable/index.html) in that metadata records
are always returned as GeoDataFrame objects, and most methods are written to
operate either on entire dataframes or single rows within a dataframe.

## Dataset aliases

```python
import coincident

coincident.datasets.aliases
```
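
A minimal illustration of what an alias table enables: mapping short names like `"3dep"` to full dataset descriptions. The mapping and `resolve` helper below are hypothetical, written only to show the idea; the real aliases live in `coincident.datasets.aliases`:

```python
# Hypothetical alias table, for illustration only; see
# coincident.datasets.aliases for the real, complete list.
ALIASES = {
    "tdx": "TanDEM-X",
    "maxar": "Maxar Stereo",
    "cop30": "Copernicus DEM",
    "atl06": "ICESat-2 ATL06",
    "gedi": "GEDI L2A",
    "3dep": "3DEP LiDAR",
    "worldcover": "ESA WorldCover",
}


def resolve(alias):
    """Map a short alias to a full dataset name, with a helpful error if unknown."""
    try:
        return ALIASES[alias]
    except KeyError:
        raise ValueError(f"Unknown dataset {alias!r}; options: {sorted(ALIASES)}")
```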

the `coincident` package provides a `search()` method that has the same syntax
regardless of which dataset you are searching. Behind the scenes, polygons
intersecting your area of interest are efficiently located and returned as a
geodataframe.
## Unified search function

The `coincident` package provides a [search()](#coincident.search.search) method
that has the same syntax regardless of which dataset you are searching. Behind
the scenes, polygons intersecting your area of interest are efficiently located
and returned as a geodataframe.

```python
aoi = gpd.read_file(
"https://raw.githubusercontent.com/unitedstates/districts/refs/heads/gh-pages/states/CO/shape.geojson"
)
gf = coincident.search.search(
gf = coincident.search(
dataset="3dep",
intersects=aoi,
datetime=["2018", "2024"],
)
gf.explore(column="workunit", popup=True)
```
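
The idea behind helpers such as `subset_by_temporal_overlap` (see the API docs) can be sketched in plain Python: two acquisition windows overlap exactly when each starts before the other ends. The function and example dates below are illustrative, not the library's implementation:

```python
from datetime import date


def temporal_overlap(start_a, end_a, start_b, end_b):
    """True if the closed intervals [start_a, end_a] and [start_b, end_b] overlap."""
    return start_a <= end_b and start_b <= end_a


# Example: a hypothetical 3DEP acquisition window vs the GEDI mission span.
lidar = (date(2019, 6, 1), date(2019, 8, 15))
gedi = (date(2019, 4, 4), date(2023, 3, 17))
print(temporal_overlap(*lidar, *gedi))  # → True
```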

## Convenience functions

`coincident` also provides a number of convenience functions, some of which only
pertain to specific datasets. For example, loading raster imagery via
[Xarray](https://docs.xarray.dev/en/stable) or creating visualizations of browse
imagery. Refer to [the API Docs](./api) for a listing of functions.
