diff --git a/doc/source/example_notebooks/IS2_cloud_data_access.ipynb b/doc/source/example_notebooks/IS2_cloud_data_access.ipynb
index fa0931c8a..aee3ca960 100644
--- a/doc/source/example_notebooks/IS2_cloud_data_access.ipynb
+++ b/doc/source/example_notebooks/IS2_cloud_data_access.ipynb
@@ -12,8 +12,16 @@
     "## Notes\n",
     "1. ICESat-2 data became publicly available on the cloud on 29 September 2022. Thus, access methods and example workflows are still being developed by NSIDC, and the underlying code in icepyx will need to be updated now that these data (and the associated metadata) are available. We appreciate your patience and contributions (e.g. reporting bugs, sharing your code, etc.) during this transition!\n",
     "2. This example and the code it describes are part of ongoing development. Current limitations to using these features are described throughout the example, as appropriate.\n",
-    "3. You **MUST** be working within an AWS instance. Otherwise, you will get a permissions error.\n",
-    "4. Cloud authentication is still more user-involved than we'd like. We're working to address this - let us know if you'd like to join the conversation!"
+    "3. You **MUST** be working within an AWS instance. Otherwise, you will get a permissions error."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "user_expressions": []
+   },
+   "source": [
+    "## Querying for data and finding s3 urls"
    ]
   },
   {
@@ -28,9 +36,11 @@
   },
   {
    "cell_type": "markdown",
-   "metadata": {},
+   "metadata": {
+    "user_expressions": []
+   },
    "source": [
-    "Create an icepyx Query object"
+    "We will start the way we often do: by creating an icepyx Query object."
    ]
   },
   {
    "cell_type": "code",
@@ -39,8 +49,6 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "# bounding box\n",
-    "# \"producerGranuleId\": \"ATL03_20191130221008_09930503_004_01.h5\",\n",
     "short_name = 'ATL03'\n",
     "spatial_extent = [-45, 58, -35, 75]\n",
     "date_range = ['2019-11-30','2019-11-30']"
    ]
   },
@@ -52,16 +60,19 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "reg=ipx.Query(short_name, spatial_extent, date_range)"
+    "reg = ipx.Query(short_name, spatial_extent, date_range)"
    ]
   },
   {
    "cell_type": "markdown",
-   "metadata": {},
+   "metadata": {
+    "tags": [],
+    "user_expressions": []
+   },
    "source": [
-    "## Get the granule s3 urls\n",
-    "You must specify `cloud=True` to get the needed s3 urls.\n",
-    "This function returns a list containing the list of the granule IDs and a list of the corresponding urls."
+    "### Get the granule s3 urls\n",
+    "\n",
+    "With this query object you can get a list of available granules. This function returns a list containing the list of the granule IDs and a list of the corresponding urls. Use `cloud=True` to get the needed s3 urls."
    ]
   },
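+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "user_expressions": []
+   },
+   "source": [
+    "(A quick illustrative sketch, not required for the rest of the example: `avail_granules` with `ids=True, cloud=True` is one way to request that nested list, so that `gran_ids[0]` holds the granule IDs and `gran_ids[1]` the corresponding s3 urls.)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Sketch: ask the query for granule IDs plus the matching s3 urls (cloud=True).\n",
+    "# `gran_ids` is the same name used with `gran_ids[1][0]` later in this example.\n",
+    "gran_ids = reg.avail_granules(ids=True, cloud=True)\n",
+    "gran_ids"
+   ]
+  },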
   {
    "cell_type": "code",
@@ -80,20 +91,29 @@
    "user_expressions": []
   },
   "source": [
-    "## Log in to Earthdata and generate an s3 token\n",
-    "You can use icepyx's existing login functionality to generate your s3 data access token, which will be valid for *one* hour. The icepyx module will renew the token for you after an hour, but if viewing your token over the course of several hours you may notice the values will change.\n",
+    "## Determining variables of interest\n",
     "\n",
-    "You can access your s3 credentials using:"
+    "**Note: If you get a PermissionDenied Error when trying to read in the data, you may not be sending your request from an AWS hub in us-west-2. We're currently working on how to alert users if they will not be able to access ICESat-2 data in the cloud for this reason.**"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {
     "user_expressions": []
    },
    "source": [
+    "There are several ways to view available variables. One is to use the existing Query object:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
-   "metadata": {},
+   "metadata": {
+    "tags": []
+   },
    "outputs": [],
    "source": [
-    "# uncommenting the line below will print your temporary login credentials\n",
-    "# reg.s3login_credentials"
+    "reg.order_vars.avail()"
    ]
   },
   {
    "cell_type": "markdown",
@@ -102,18 +122,74 @@
    "user_expressions": []
   },
   "source": [
-    "```{admonition} Important Authentication Update\n",
-    "Previously, icepyx required you to explicitly use the `.earthdata_login()` function to login. Running this function is no longer required, as icepyx will call the login function as needed. The user will still need to provide their credentials using one of the three methods decribed in the [ICESat-2 Data Access Notebook](https://icepyx.readthedocs.io/en/latest/example_notebooks/IS2_data_access.html) example. The `.earthdata_login()` function is still available for backwards compatibility.\n",
+    "Another way is to use the variables module:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
    "metadata": {
     "tags": []
    },
    "outputs": [],
    "source": [
+    "ipx.Variables(product=short_name).avail()"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {
     "user_expressions": []
    },
    "source": [
+    "We can also do this using a specific s3 filepath from the Query object:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
    "metadata": {
     "tags": []
    },
    "outputs": [],
    "source": [
+    "ipx.Variables(path=gran_ids[1][0]).avail()"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {
     "user_expressions": []
    },
    "source": [
+    "From any of these methods we can see that `h_li` is a variable for this data product, so we will read that variable in the next step."
    ]
   },
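+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "user_expressions": []
+   },
+   "source": [
+    "(An illustrative aside: because `avail()` returns plain path strings, a list comprehension is a quick way to check for a particular variable name without scrolling the full list.)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Sketch: filter the available-variable paths for the name we care about.\n",
+    "[v for v in reg.order_vars.avail() if 'h_li' in v]"
+   ]
+  },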
   {
    "cell_type": "markdown",
    "metadata": {
     "user_expressions": []
    },
    "source": [
+    "#### A Note on listing variables using s3 urls\n",
     "\n",
-    "If you are unable to remove `earthdata_login()` calls from your workflow, note that certain inputs, such as `earthdata_uid` and `email`, are no longer required. e.g. `region_a.earthdata_login(earthdata_uid, email)` becomes `region_a.earthdata_login()`\n",
-    "```"
+    "We can use the Variables module with an s3 url to explore available data variables the same way we do with local files. An important difference, however, is how the available variables list is created. When reading a local file, the variables module will traverse the entire file and search for variables that are present in that file. This method is too time intensive with the s3 data, so instead the product / version of the data product is read from the file and all possible variables associated with that product/version are reported as available. As long as you are using the NSIDC-provided s3 paths returned via Earthdata search and the Query object, these lists will be the same."
    ]
   },
   {
    "cell_type": "markdown",
-   "metadata": {},
+   "metadata": {
+    "tags": [],
+    "user_expressions": []
+   },
    "source": [
-    "## Set up your s3 file system using your credentials"
+    "#### A Note on authentication\n",
+    "\n",
+    "Notice that accessing cloud data requires two layers of authentication: 1) authenticating with your Earthdata Login, and 2) authenticating for cloud access. These both happen behind the scenes, without the need for users to provide any explicit commands.\n",
+    "\n",
+    "Icepyx uses earthaccess to generate your s3 data access token, which will be valid for *one* hour. The earthaccess module will also renew the token for you after an hour, but if you view your token over the course of several hours you may notice the values change.\n",
+    "\n",
+    "If you do want to see your s3 credentials, you can access them using:"
    ]
   },
   {
    "cell_type": "code",
@@ -122,17 +198,34 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "s3 = earthaccess.get_s3fs_session(daac='NSIDC', provider=reg.s3login_credentials)"
+    "# uncommenting the line below will print your temporary aws login credentials\n",
+    "# reg.auth._s3login_credentials"
    ]
   },
   {
    "cell_type": "markdown",
-   "metadata": {},
+   "metadata": {
+    "user_expressions": []
+   },
    "source": [
-    "## Select an s3 url and access the data\n",
-    "Data read in capabilities for cloud data are coming soon in icepyx (targeted Spring 2023). Stay tuned and we'd love for you to join us and contribute!\n",
+    "```{admonition} Important Authentication Update\n",
+    "Previously, icepyx required you to explicitly use the `.earthdata_login()` function to login. Running this function is no longer required, as icepyx will call the login function as needed. The user will still need to provide their credentials using one of the three methods described in the [ICESat-2 Data Access Notebook](https://icepyx.readthedocs.io/en/latest/example_notebooks/IS2_data_access.html) example. The `.earthdata_login()` function is still available for backwards compatibility.\n",
     "\n",
-    "**Note: If you get a PermissionDenied Error when trying to read in the data, you may not be sending your request from an AWS hub in us-west2. We're currently working on how to alert users if they will not be able to access ICESat-2 data in the cloud for this reason**"
+    "If you are unable to remove `earthdata_login()` calls from your workflow, note that certain inputs, such as `earthdata_uid` and `email`, are no longer required. e.g. `region_a.earthdata_login(earthdata_uid, email)` becomes `region_a.earthdata_login()`\n",
+    "```"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {
     "user_expressions": []
    },
    "source": [
+    "## Choose a data file and access the data\n",
+    "\n",
+    "**Note: If you get a PermissionDenied Error when trying to read in the data, you may not be sending your request from an AWS hub in us-west-2. We're currently working on how to alert users if they will not be able to access ICESat-2 data in the cloud for this reason.**\n",
+    "\n",
+    "We are ready to read our data! We do this by creating a reader object and using the s3 url returned from the Query object."
    ]
   },
   {
    "cell_type": "code",
@@ -147,40 +240,134 @@
     "# s3url = 's3://nsidc-cumulus-prod-protected/ATLAS/ATL03/004/2019/11/30/ATL03_20191130221008_09930503_004_01.h5'"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "tags": [],
+    "user_expressions": []
+   },
+   "source": [
+    "Create the Read object"
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": null,
-   "metadata": {},
+   "metadata": {
+    "tags": []
+   },
    "outputs": [],
    "source": [
-    "import h5py\n",
-    "import numpy as np"
+    "reader = ipx.Read(s3url)"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {
     "user_expressions": []
    },
    "source": [
+    "This reader object gives us yet another way to view available variables."
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
-   "metadata": {},
+   "metadata": {
+    "tags": []
+   },
    "outputs": [],
    "source": [
-    "%time f = h5py.File(s3.open(s3url,'rb'),'r')"
+    "reader.vars.avail()"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {
     "user_expressions": []
    },
    "source": [
+    "Next, we append our desired variable to the `wanted_vars` list:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
    "metadata": {
     "tags": []
    },
    "outputs": [],
    "source": [
+    "reader.vars.append(var_list=['h_li'])"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {
     "user_expressions": []
    },
    "source": [
+    "Finally, we load the data:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
    "metadata": {
     "tags": []
    },
    "outputs": [],
    "source": [
+    "%%time\n",
+    "\n",
+    "# This may take 5-10 minutes\n",
+    "reader.load()"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {
     "user_expressions": []
    },
    "source": [
+    "### Some important caveats\n",
+    "\n",
+    "While the cloud data reading is functional within icepyx, it is very slow. Approximate timing shows it takes ~6 minutes of load time per variable per file from s3. Because of this you will receive a warning if you try to load more than three variables or more than two files at once.\n",
+    "\n",
+    "The slow load speed is a demonstration of the many steps involved in making cloud data actionable - the data supply chain needs optimized source data, efficient low-level data readers, and high-level libraries that are able to use the fastest low-level readers. Not all of these pieces are fully developed right now, but the progress being made is exciting and there is lots of room for contribution!"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
    "metadata": {},
    "outputs": [],
    "source": []
   },
   {
    "cell_type": "markdown",
    "metadata": {
     "user_expressions": []
    },
    "source": [
     "#### Credits\n",
-    "* notebook by: Jessica Scheick\n",
+    "* notebook by: Jessica Scheick and Rachel Wegener\n",
     "* historic source material: [is2-nsidc-cloud.py](https://gist.github.com/bradlipovsky/80ab6a7aff3d3524b9616a9fc176065e#file-is2-nsidc-cloud-py-L28) by Brad Lipovsky"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
    "metadata": {},
    "outputs": [],
    "source": []
   }
  ],
  "metadata": {
   "kernelspec": {
-   "display_name": "Python 3 (ipykernel)",
+   "display_name": "icepyx-dev",
    "language": "python",
-   "name": "python3"
+   "name": "icepyx-dev"
   },
   "language_info": {
    "codemirror_mode": {
@@ -192,7 +379,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-   "version": "3.10.12"
+   "version": "3.11.4"
  }
 },
 "nbformat": 4,