diff --git a/CHANGELOG.md b/CHANGELOG.md
index c5ee95ac..2252ba95 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -39,6 +39,8 @@
([#508](https://github.com/nsidc/earthaccess/issues/508))
* Create destination directory prior to direct S3 downloads, if it doesn't
already exist ([#562](https://github.com/nsidc/earthaccess/issues/562))
+* Fix broken image link in sea level rise tutorial
+  ([#427](https://github.com/nsidc/earthaccess/issues/427))
## [v0.9.0] 2024-02-28
diff --git a/docs/contributing/maintainers-guide.md b/docs/contributing/maintainers-guide.md
index 1ac2fc79..2d54a354 100644
--- a/docs/contributing/maintainers-guide.md
+++ b/docs/contributing/maintainers-guide.md
@@ -53,6 +53,3 @@ The GitHub Actions CI services handle the project's building, testing, and manag
## Continuous Documentation
[ReadTheDocs](https://readthedocs.org/projects/earthaccess/) is used to generate and host [our documentation website](https://earthaccess.readthedocs.io/) as well as the preview for documentation changes made in pull requests. This service uses a configuration file in the root of the project, `.readthedocs.yml`.
-
-
-
diff --git a/docs/tutorials/SSL.ipynb b/docs/tutorials/SSL.ipynb
index 75182d4b..265f1fd7 100644
--- a/docs/tutorials/SSL.ipynb
+++ b/docs/tutorials/SSL.ipynb
@@ -10,7 +10,7 @@
"\n",
"### This notebook is entirely based on Jinbo Wang's [tutorial](https://github.com/betolink/the-coding-club/blob/main/notebooks/Earthdata_webinar_20220727.ipynb)\n",
"\n",
- "\n",
+ "\n",
"\n",
"--- \n",
"\n",
@@ -157,11 +157,11 @@
" * not fully tested with Dask distributed\n",
"* Data is gridded\n",
" * xarray works better with homogeneous coordinates, working with swath data will be cumbersome.\n",
- "* Data is chunked using reasonable large sizes(1MB or more)\n",
+ "* Data is chunked using reasonably large sizes (1 MB or more)\n",
" * If our files are chunked in small pieces the access time will be orders of magnitude bigger than just downloading the data and accessing it locally.\n",
" \n",
- "Opening a year of SSH (SEA_SURFACE_HEIGHT_ALT_GRIDS_L4_2SATS_5DAY_6THDEG_V_JPL1812) data (1.1 GB approx) can take up to 5 minutes streaming the data out of region(not in AWS)\n",
- "The reason for this is not that the data transfer is order of magintude slower but due the client libraries not fetching data concurrently and the metadata of the files in HDF is usually not consolidated like in Zaar, hence h5netcdf has to issue a lot of requests to get the info it needs.\n",
+ "Opening a year of SSH (SEA_SURFACE_HEIGHT_ALT_GRIDS_L4_2SATS_5DAY_6THDEG_V_JPL1812) data (approx. 1.1 GB) can take up to 5 minutes when streaming the data out of region (not in AWS).\n",
+ "The reason is not that the data transfer itself is orders of magnitude slower, but that the client libraries do not fetch data concurrently, and the metadata of HDF files is usually not consolidated as it is in Zarr, so h5netcdf has to issue many requests to get the information it needs.\n",
"\n",
"> Note: we are looping through each year and getting the metadata for the first granule in May"
]
@@ -195,7 +195,7 @@
"id": "8b63ca2f-c94c-4d4a-a620-a086ee66137f",
"metadata": {},
"source": [
- "### What `earthaccess.open()` do?\n",
+ "### What does `earthaccess.open()` do?\n",
"\n",
"`earthaccess.open()` takes a list of results from `earthaccess.search_data()` or a list of URLs and creates a list of Python file-like objects that can be used in our code as if the remote files were local. When executed in AWS, the file system used is [S3FS](https://github.com/fsspec/s3fs); when we open files outside of AWS, we get a regular HTTPS file session. \n"
]