Updates to re-pull code #85

Merged 1 commit on May 16, 2024
3 changes: 3 additions & 0 deletions .gitignore
@@ -16,6 +16,9 @@
reports/images/*
#all tmp folders
*/tmp/*

# pptx files (too big!)
*.pptx

# Local only folders
data_local/*

13 changes: 10 additions & 3 deletions README.md
@@ -1,9 +1,11 @@
# Superior-Plume-Bloom

Description: code to rasterize presence and absence of sediment plumes and algal
blooms over time for the western area of Lake Superior
Contact: B Steele (B dot Steele at colostate dot edu)

This repository is covered by the MIT use license. We request that all downstream
uses of this work be available to the public when possible.

Note: there are two methods of authentication used for the Earth Engine workflows.
A major authentication change arrived in December 2023 with v0.1.383; previously, authentication
@@ -12,6 +14,9 @@
was completed using `earthengine authenticate` at the command line. If you are
attempting to reproduce the code here, but have a version >= v0.1.383, you will
need to use `ee.Authenticate()` instead of the command line `earthengine authenticate`.
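For anyone reproducing the workflow, the version gate described above can be sketched as a small helper. This is illustrative only, not part of the repository; the cutoff version comes from the note above, and the `needs_python_auth` name is hypothetical.

```python
# Sketch: choose the Earth Engine auth path by installed earthengine-api version.
def needs_python_auth(installed: str, cutoff: str = "0.1.383") -> bool:
    """True when the installed earthengine-api version is >= the release
    that moved authentication into Python (`ee.Authenticate()`)."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(cutoff)

# Usage (assumes the earthengine-api package is installed):
# import ee
# if needs_python_auth(ee.__version__):
#     ee.Authenticate()   # >= v0.1.383: authenticate from Python
# # else: run `earthengine authenticate` at the command line
```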

The Methods_Results_Summary.Rmd and associated .html summarize the methodology
and results of the modeling effort.

## Folder Descriptions

*availability_checks*: scripts to document image availability for our AOI
@@ -22,8 +27,10 @@

*eePlumB*: scripts to develop the 'Earth Engine Plume and Bloom' (eePlumB) labeling modules

*modeling*: scripts to work through label data, apply GTB models in GEE, and
summarize output

*reports*: Rmd files for report generation



3 changes: 3 additions & 0 deletions Superior-Plume-Bloom.Rproj
@@ -11,3 +11,6 @@
Encoding: UTF-8

RnwWeave: Sweave
LaTeX: pdfLaTeX

MarkdownWrap: Column
MarkdownWrapAtColumn: 80
8,196 changes: 4,098 additions & 4,098 deletions data/labels/collated_label_data_v2023-07-20.csv

Large diffs are not rendered by default.

62 changes: 45 additions & 17 deletions eePlumB/2_data_ingestion/2_data_re_pull_LS5.Rmd
@@ -7,7 +7,9 @@
output: html_document

# Purpose

This script uses the volunteer point locations exported from
`1_data_download.Rmd`, which have been manually uploaded as a feature collection
to the `ee-ross-superior` project in Earth Engine.

## R/Python Setup

@@ -28,7 +30,8 @@
package_loader <- function(x) {
invisible(lapply(libs, package_loader))
```

Use the conda environment if the env folder is present, otherwise create the
environment.

```{r conda-env}
if (!dir.exists("env")) {
@@ -60,7 +63,8 @@
labels_5 = labels.filter(ee.Filter.eq("mission", "LS5"))
```

Get the unique dates of images from this list and assign the date to the
`system:time_start` parameter.

```{python}
dates_5 = labels_5.aggregate_array("date").distinct()
@@ -90,7 +94,7 @@
export_l5_meta = (ee.batch.Export.table.toDrive(
fileNamePrefix = 'LS5_image_metadata',
fileFormat = 'csv'))
# export_l5_meta.start()
```

@@ -100,6 +104,7 @@
And then apply the scaling factors to the stack
l5 = (l5
.map(rp.applyScaleFactors)
.map(rp.apply_radsat_mask)
.map(rp.flag_qa_conf)
.map(rp.addImageDate))
```
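The `rp.applyScaleFactors` helper is defined elsewhere in the repository. As a hedged sketch, Landsat Collection 2 Level-2 products are commonly rescaled with the constants below (from the USGS product guide); whether `rp` uses exactly these is an assumption.

```python
# Sketch of Landsat Collection 2 Level-2 rescaling (illustrative; the repo's
# rp.applyScaleFactors is assumed to wrap similar math in ee.Image operations).
def scale_sr(dn: float) -> float:
    """Digital number -> surface reflectance for SR_B* bands."""
    return dn * 0.0000275 - 0.2

def scale_st(dn: float) -> float:
    """Digital number -> surface temperature in Kelvin for the ST band."""
    return dn * 0.00341802 + 149.0
```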
@@ -113,24 +118,47 @@
for i in range(dates_5.length().getInfo()):
  print(one_date.getInfo())
  one_dt = ee.Date(one_date)
  dt_label = labels_5_dt.filterDate(one_dt, one_dt.advance(1, 'day'))
  one_image_SR_band = (l5
    .filterDate(one_dt, one_dt.advance(1, 'day'))
    .select(['SR_B1', 'SR_B2', 'SR_B3', 'SR_B4', 'SR_B5', 'SR_B7'])
    .mean())
  one_image_bits = (l5
    .filterDate(one_dt, one_dt.advance(1, 'day'))
    .select(['cirrus_conf', 'snowice_conf', 'cloudshad_conf',
             'cloud_conf', 'dialated_cloud', 'SR_ATMOS_OPACITY'])
    .max())
  one_image_cdist = (l5
    .filterDate(one_dt, one_dt.advance(1, 'day'))
    .select('ST_CDIST')
    .min())
  # define bands to extract and reduce regions
  bandsOut = (one_image_SR_band
    .addBands(one_image_bits)
    .addBands(one_image_cdist))
  combinedReducer = (ee.Reducer.median().unweighted()
    .forEachBand(bandsOut.select(['SR_B1', 'SR_B2', 'SR_B3',
                                  'SR_B4', 'SR_B5', 'SR_B7']))
    .combine(ee.Reducer.max().unweighted()
             .forEachBand(bandsOut.select(['cirrus_conf', 'snowice_conf',
                                           'cloudshad_conf', 'cloud_conf',
                                           'dialated_cloud', 'SR_ATMOS_OPACITY'])),
             sharedInputs = False)
    .combine(ee.Reducer.min().unweighted()
             .forEachBand(bandsOut.select('ST_CDIST')),
             sharedInputs = False))
  # Collect median reflectance values, worst-case QA flags,
  # and minimum distance to cloud at each labeled point
  data = (bandsOut
    .reduceRegions(
      collection = dt_label,
      reducer = combinedReducer,
      scale = 30,
      tileScale = 2,
      crs = one_image_SR_band.geometry().projection().crs()))
  image_date_export = (ee.batch.Export.table.toDrive(
    collection = data,
    description = 'LS5_' + one_date.getInfo(),
    folder = 'eePlumB_additional_band_data',
    fileNamePrefix = 'LS5_' + one_date.getInfo() + '_additional_vars_v2024-04-25',
    fileFormat = 'csv'))
  image_date_export.start()
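The `combinedReducer` above applies a different statistic to each band group: median for reflectance bands, max (worst case) for the QA-confidence bits, and min for distance-to-cloud. A plain-Python sketch of that per-band logic follows; it is illustrative only (the real reduction runs server-side in Earth Engine), and the band subset shown is abbreviated.

```python
from statistics import median

# Map each band to the statistic the combinedReducer assigns it.
BAND_REDUCERS = {
    "SR_B1": median, "SR_B2": median, "SR_B3": median,  # reflectance -> median
    "cloud_conf": max, "cirrus_conf": max,              # QA confidence -> max
    "ST_CDIST": min,                                    # distance to cloud -> min
}

def reduce_point(samples):
    """samples: list of {band: value} dicts sampled at one labeled point."""
    out = {}
    for band, stat in BAND_REDUCERS.items():
        vals = [s[band] for s in samples if band in s]
        if vals:
            out[band] = stat(vals)
    return out
```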
43 changes: 34 additions & 9 deletions eePlumB/2_data_ingestion/3_data_re_pull_LS7.Rmd
@@ -89,7 +89,7 @@
export_l7_meta = (ee.batch.Export.table.toDrive(
fileNamePrefix = 'LS7_image_metadata',
fileFormat = 'csv'))
# export_l7_meta.start()
```

@@ -99,6 +99,7 @@
And then apply the scaling factors to the stack
l7 = (l7
.map(rp.applyScaleFactors)
.map(rp.apply_radsat_mask)
.map(rp.flag_qa_conf)
.map(rp.addImageDate))
```
@@ -112,23 +113,47 @@
for i in range(dates_7.length().getInfo()):
  print(one_date.getInfo())
  one_dt = ee.Date(one_date)
  dt_label = labels_7_dt.filterDate(one_dt, one_dt.advance(1, 'day'))
  one_image_SR_band = (l7
    .filterDate(one_dt, one_dt.advance(1, 'day'))
    .select(['SR_B1', 'SR_B2', 'SR_B3', 'SR_B4', 'SR_B5', 'SR_B7'])
    .mean())
  one_image_bits = (l7
    .filterDate(one_dt, one_dt.advance(1, 'day'))
    .select(['cirrus_conf', 'snowice_conf', 'cloudshad_conf',
             'cloud_conf', 'dialated_cloud', 'SR_ATMOS_OPACITY'])
    .max())
  one_image_cdist = (l7
    .filterDate(one_dt, one_dt.advance(1, 'day'))
    .select('ST_CDIST')
    .min())
  # define bands to extract and reduce regions
  bandsOut = (one_image_SR_band
    .addBands(one_image_bits)
    .addBands(one_image_cdist))
  combinedReducer = (ee.Reducer.median().unweighted()
    .forEachBand(bandsOut.select(['SR_B1', 'SR_B2', 'SR_B3',
                                  'SR_B4', 'SR_B5', 'SR_B7']))
    .combine(ee.Reducer.max().unweighted()
             .forEachBand(bandsOut.select(['cirrus_conf', 'snowice_conf',
                                           'cloudshad_conf', 'cloud_conf',
                                           'dialated_cloud', 'SR_ATMOS_OPACITY'])),
             sharedInputs = False)
    .combine(ee.Reducer.min().unweighted()
             .forEachBand(bandsOut.select('ST_CDIST')),
             sharedInputs = False))
  # Collect median reflectance values, worst-case QA flags,
  # and minimum distance to cloud at each labeled point
  data = (bandsOut
    .reduceRegions(
      collection = dt_label,
      reducer = combinedReducer,
      scale = 30,
      tileScale = 2,
      crs = one_image_SR_band.geometry().projection().crs()))
  image_date_export = (ee.batch.Export.table.toDrive(
    collection = data,
    description = 'LS7_' + one_date.getInfo(),
    folder = 'eePlumB_additional_band_data',
    fileNamePrefix = 'LS7_' + one_date.getInfo() + '_additional_vars_v2024-04-25',
    fileFormat = 'csv'))
  image_date_export.start()
61 changes: 46 additions & 15 deletions eePlumB/2_data_ingestion/4_data_re_pull_LS8.Rmd
@@ -7,7 +7,9 @@
output: html_document

# Purpose

This script uses the volunteer point locations exported from
`1_data_download.Rmd`, which have been manually uploaded as a feature collection
to the `ee-ross-superior` project in Earth Engine.

## R/Python Setup

@@ -28,7 +30,8 @@
package_loader <- function(x) {
invisible(lapply(libs, package_loader))
```

Use the conda environment if the env folder is present, otherwise create the
environment.

```{r conda-env}
if (!dir.exists("env")) {
Expand Down Expand Up @@ -60,7 +63,8 @@ labels = ee.FeatureCollection("projects/ee-ross-superior/assets/labels/collated_
labels_8 = labels.filter(ee.Filter.eq("mission", "LS8"))
```

Get the unique dates of images from this list and assign the date to the
`system:time_start` parameter.

```{python}
dates_8 = labels_8.aggregate_array("date").distinct()
@@ -89,18 +93,21 @@
export_l8_meta = (ee.batch.Export.table.toDrive(
fileNamePrefix = 'LS8_image_metadata',
fileFormat = 'csv'))
# export_l8_meta.start()
```

And then apply the scaling factors to the stack

```{python}
l8 = (l8
  .map(rp.applyScaleFactors_89)
  .map(rp.apply_radsat_mask)
  .map(rp.flag_qa_conf)
  .map(rp.flag_high_aerosol)
  .map(rp.addImageDate))
```
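The `rp.flag_qa_conf` and `rp.flag_high_aerosol` helpers are assumed to unpack packed QA bit fields into the per-flag bands used below. A sketch of the underlying bit arithmetic, with bit positions per the Collection 2 QA_PIXEL layout (the exact bands these helpers emit is an assumption):

```python
def qa_bits(qa_value: int, start: int, width: int) -> int:
    """Extract `width` bits beginning at bit `start` from a packed QA value."""
    return (qa_value >> start) & ((1 << width) - 1)

# In Collection 2 QA_PIXEL, cloud confidence occupies bits 8-9 and
# cirrus confidence bits 14-15 (0 = none ... 3 = high).
def cloud_conf(qa_value: int) -> int:
    return qa_bits(qa_value, 8, 2)

def cirrus_conf(qa_value: int) -> int:
    return qa_bits(qa_value, 14, 2)
```

In Earth Engine the same extraction is typically done image-wise with `rightShift` and `bitwiseAnd`.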

@@ -113,23 +120,47 @@
for i in range(dates_8.length().getInfo()):
  print(one_date.getInfo())
  one_dt = ee.Date(one_date)
  dt_label = labels_8_dt.filterDate(one_dt, one_dt.advance(1, 'day'))
  one_image_SR_band = (l8
    .filterDate(one_dt, one_dt.advance(1, 'day'))
    .select(['SR_B1', 'SR_B2', 'SR_B3', 'SR_B4', 'SR_B5', 'SR_B6', 'SR_B7'])
    .mean())
  one_image_bits = (l8
    .filterDate(one_dt, one_dt.advance(1, 'day'))
    .select(['cirrus_conf', 'snowice_conf', 'cloudshad_conf',
             'cloud_conf', 'dialated_cloud', 'aero_level'])
    .max())
  one_image_cdist = (l8
    .filterDate(one_dt, one_dt.advance(1, 'day'))
    .select('ST_CDIST')
    .min())
  # define bands to extract and reduce regions
  bandsOut = (one_image_SR_band
    .addBands(one_image_bits)
    .addBands(one_image_cdist))
  combinedReducer = (ee.Reducer.median().unweighted()
    .forEachBand(bandsOut.select(['SR_B1', 'SR_B2', 'SR_B3',
                                  'SR_B4', 'SR_B5', 'SR_B6', 'SR_B7']))
    .combine(ee.Reducer.max().unweighted()
             .forEachBand(bandsOut.select(['cirrus_conf', 'snowice_conf',
                                           'cloudshad_conf', 'cloud_conf',
                                           'dialated_cloud', 'aero_level'])),
             sharedInputs = False)
    .combine(ee.Reducer.min().unweighted()
             .forEachBand(bandsOut.select('ST_CDIST')),
             sharedInputs = False))
  # Collect median reflectance values, worst-case QA flags,
  # and minimum distance to cloud at each labeled point
  data = (bandsOut
    .reduceRegions(
      collection = dt_label,
      reducer = combinedReducer,
      scale = 30,
      tileScale = 2,
      crs = one_image_SR_band.geometry().projection().crs()))
  image_date_export = (ee.batch.Export.table.toDrive(
    collection = data,
    description = 'LS8_' + one_date.getInfo(),
    folder = 'eePlumB_additional_band_data',
    fileNamePrefix = 'LS8_' + one_date.getInfo() + '_additional_vars_v2024-04-25',
    fileFormat = 'csv'))
  image_date_export.start()