Client library for NASA CMR and EDL APIs
A Python library to search and access NASA datasets.
Install the latest release:
conda install -c conda-forge earthdata
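Alternatively, if the package is also published on PyPI under the same name (an assumption, not confirmed here), pip works as well:
pip install earthdata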
Or you can clone the earthdata repository and get started locally:
# ensure you have Poetry installed
pip install --user poetry
# install all dependencies (including dev)
poetry install
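# optionally run the test suite (this assumes the project uses pytest, which is not stated here)
poetry run pytest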
# develop!
from earthdata import Auth, DataGranules, DataCollections, Store
auth = Auth().login(strategy="netrc") # if we want to access NASA data in the cloud
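# The Auth class may support other login strategies as well; the two below are
# assumptions based on typical Earthdata Login workflows and are not confirmed here:
# auth = Auth().login(strategy="interactive")   # prompt for EDL username and password
# auth = Auth().login(strategy="environment")   # read credentials from environment variables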
# To search for collections (datasets)
DatasetQuery = DataCollections().keyword('MODIS').bounding_box(-26.85,62.65,-11.86,67.08)
counts = DatasetQuery.hits()
collections = DatasetQuery.get()
# To search for granules (data files)
GranuleQuery = DataGranules().concept_id('C1711961296-LPCLOUD').bounding_box(-10,20,10,50)
# number of granules (data files) that matched our criteria
counts = GranuleQuery.hits()
# We get the metadata for the first 10 granules
granules = GranuleQuery.get(10)
# earthdata provides some convenience functions for each data granule
data_links = [granule.data_links(access="direct") for granule in granules]
# or if the data is an on-prem dataset
data_links = [granule.data_links(access="onprem") for granule in granules]
# The Store class allows us to download granules from on-prem locations with get()
# NOTE: Some datasets require users to accept a License Agreement before accessing them
store = Store(auth)
# This works with both on-prem and cloud-hosted collections
store.get(granules, local_path='./data')
# if you're in an AWS instance (us-west-2) you can use open() to get a fileset of S3 files!
fileset = store.open(granules)
# Given that this is gridded data (Level 3 or up) we could open it directly with xarray
import xarray
ds = xarray.open_mfdataset(fileset, combine='by_coords')
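Queries can generally be chained with additional filters before calling get(). The sketch below narrows the granule search to a date range; the temporal() method and its parameters are an assumption based on the underlying CMR query API and may differ in this library:
# hypothetical: restrict the granule search to a date range before fetching metadata
winter_query = DataGranules().concept_id('C1711961296-LPCLOUD').temporal('2020-01-01', '2020-03-31')
winter_granules = winter_query.get(10)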
For more examples see the Demo and EarthdataSearch notebooks.
Only Python 3.8+ is supported.
See Code of Conduct
This repository is not actively supported by NSIDC, but we welcome issue submissions and pull requests to foster community contribution.
Welcome! 😊👋
Please see the Contributing Guide.