Version 1.2.1 2020 TCL update (#12)
* Starting to work on updating flux model with 2020 tree cover loss. First, updating burned area.

* Fixing command for uploading hdf to s3.

* Trying again from step 2 of burned area.

* Updated C++ scripts for 2020 TCL and made some other 2020 updates. Ran into an issue at the end of step 4 of burned area, so trying that again.

* Ran individual model steps locally for test tile 00N_110E. Now going to try to change the emissions C++ code to use a constants header file.

* Ran individual model steps locally for test tile 00N_110E. Added constants.h for the emissions model, so the standard and sensitivity analyses get their constants and file patterns from the same source. That makes updating sensitivity analyses easier. Couldn't figure out how to make equations.cpp also use constants.h, so the model_years in equations.cpp needs to be updated separately for now. With constants.h, the code compiles and runs locally.

* Corrected the calls for checking for empty tiles in the annual and gross removals scripts.

* Corrected the calls for checking for empty tiles in the annual and gross removals scripts. Now, annual removals only uploads tiles with data. Gross removals should be able to skip tile_ids that don't have the necessary input tiles (gain year count and annual removals).
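
A minimal sketch of that skip logic (the file patterns and the print call below are placeholders, not the repo's actual constants or logging helpers):

    import os

    def gross_removals(tile_id):
        # Hypothetical tile name patterns; the real ones come from the constants module
        gain_year_count = '{}_gain_year_count.tif'.format(tile_id)
        annual_removals = '{}_annual_removal_factor.tif'.format(tile_id)

        # Skip the tile entirely if either required input is missing,
        # instead of failing partway through the calculation
        if not (os.path.exists(gain_year_count) and os.path.exists(annual_removals)):
            print('Skipping {}: required input tile(s) not found'.format(tile_id))
            return

        print('Calculating gross removals for {}'.format(tile_id))
        # ... cumulative removals calculation would go here ...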

* Still trying to figure out whether only annual removal factor tiles that have data are being copied to s3.

* Fixing the tile data check function.

* Correcting issue with tiles not existing for carbon pool creation.

* Changing output folder dates.

* Revised readme. Model ran on test tiles through aggregation. Need to fix supplementary output creation script error.

* Final test of supplementary output creation step.

* Going to run the full flux model with 2020 TCL data as model v1.2.1. Running from model_extent to emission-year carbon pool creation with today's date and everything after (emissions onwards) with date 20219999 because I'll need to rerun those steps with the updated drivers model. Tested on a few test tiles but this is my first run on all tiles with the 2020 update.

* I've gotten errors a few times during full model runs when the script tries to upload model logs. It seems to happen towards the end of a tile list and in stages that use more processors (though that could be coincidental). It also happens only intermittently: I ran the same script a few times and the error sometimes occurred and sometimes didn't. My guess is that different log uploads sometimes compete with each other and that causes the error.
So I'm reducing the calls for uploading the log by removing the upload when normal statements are printed (in print_log). Instead, I added upload_log() calls at the end of each tile being processed (in end_of_fx_summary), at the end of each model stage (in upload_final_set), and at the very end of the model (last line of run_full_model). Hopefully this'll reduce the conflicts between log uploads. Exceptions and subprocess commands still automatically trigger log uploads.
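
A rough sketch of where the upload_log() calls sit now; the function bodies and the s3 log path below are stand-ins, only the placement follows the description above.

    import subprocess

    def upload_log():
        # Pushes the local log to s3; the bucket/path here is illustrative
        subprocess.call(['aws', 's3', 'cp', 'flux_model_log.txt', 's3://example-bucket/model_logs/'])

    def print_log(*args):
        # Only writes locally now; no per-print upload to s3
        with open('flux_model_log.txt', 'a') as log:
            log.write(' '.join(str(a) for a in args) + '\n')

    def end_of_fx_summary(elapsed_time, tile_id, pattern):
        print_log('Elapsed time for {0} ({1}): {2}'.format(tile_id, pattern, elapsed_time))
        upload_log()    # once per tile processed

    def upload_final_set(upload_dir, pattern):
        print_log('Uploading {} tiles to {}'.format(pattern, upload_dir))
        upload_log()    # once per model stage

    def run_full_model():
        # ... all model stages run here ...
        upload_log()    # very end of the model run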

* Various small fixes to the model. It should run smoothly from model_extent until create_supplementary_outputs; ran into an error there with a gross removals tile not existing.

* Fixing the supplementary outputs step. There was an issue with which tile_id_list to use.

* Jimmy MacCarthy updated the TCL 2020 drivers, so ready to process the 2020 drivers and run emissions, net flux, and their aggregation/supplementary outputs for real now.

* During emissions metadata creation, the log upload function got overwhelmed. Removed the log upload call in the subprocess.check_call function.

* Need to rerun emissions onwards because I have the corrected 2020 TCL driver map now (the previous run used an erroneous driver map). This should be the final run of the 2020 emissions model (v1.2.1).

* Still got an error from uploading logs too quickly to s3, so I removed the log upload call from the subprocess call.

* For emissions and final steps using the final, corrected drivers for flux model v1.2.1 (2001-2020 update).

* Now with the correctly reprojected driver map.

* Ran into an issue with deleting extra tiles after creating supplementary outputs.

* Successfully ran main flux model with biomass_soil emissions, going to run soil_only gross emissions for model v1.2.1 (2001-2020) now.

* Successfully ran main flux model with biomass_soil emissions, going to run soil_only gross emissions for model v1.2.1 (2001-2020) now. Corrected the output soil_only emissions tile names.

* Successfully ran main flux model with biomass_soil emissions, going to run soil_only gross emissions for model v1.2.1 (2001-2020) now. Corrected the output soil_only emissions tile names... needed one final correction.
dagibbs22 authored Mar 29, 2021
1 parent a51a51c commit 28fc2b4
Showing 34 changed files with 1,226 additions and 775 deletions.
5 changes: 4 additions & 1 deletion Dockerfile
@@ -11,7 +11,7 @@ ENV SECRETS_PATH /usr/secrets
# set timezone for tzdata
RUN ln -fs /usr/share/zoneinfo/America/New_York /etc/localtime

# Install missing dependencies
# Install dependencies
RUN apt-get update -y && apt-get install -y \
make \
automake \
@@ -53,6 +53,9 @@ RUN cd /usr/include && ln -s ./ gdal
#https://www.continualintegration.com/miscellaneous-articles/all/how-do-you-troubleshoot-usr-bin-env-python-no-such-file-or-directory/
RUN ln -s /usr/bin/python3 /usr/bin/python

# Enable ec2 to interact with GitHub
RUN git config --global user.email [email protected]

## Check out the branch that I'm currently using for model development
#RUN git checkout model_v_1.2.0
#
6 changes: 3 additions & 3 deletions analyses/mp_aggregate_results_to_4_km.py
@@ -44,8 +44,8 @@ def mp_aggregate_results_to_4_km(sensit_type, thresh, tile_id_list, std_net_flux

# Files to download for this script
download_dict = {
cn.annual_gain_AGC_all_types_dir: [cn.pattern_annual_gain_AGC_all_types],
cn.cumul_gain_AGCO2_BGCO2_all_types_dir: [cn.pattern_cumul_gain_AGCO2_BGCO2_all_types],
# cn.annual_gain_AGC_all_types_dir: [cn.pattern_annual_gain_AGC_all_types],
# cn.cumul_gain_AGCO2_BGCO2_all_types_dir: [cn.pattern_cumul_gain_AGCO2_BGCO2_all_types],
cn.gross_emis_all_gases_all_drivers_biomass_soil_dir: [cn.pattern_gross_emis_all_gases_all_drivers_biomass_soil],
cn.net_flux_dir: [cn.pattern_net_flux]
}
@@ -219,7 +219,7 @@ def mp_aggregate_results_to_4_km(sensit_type, thresh, tile_id_list, std_net_flux

for tile_name in tile_list:
tile_id = uu.get_tile_id(tile_name)
os.remove('{0}_{1}.tif'.format(tile_id, pattern))
# os.remove('{0}_{1}.tif'.format(tile_id, pattern))
os.remove('{0}_{1}_rewindow.tif'.format(tile_id, pattern))
os.remove('{0}_{1}_0_4deg.tif'.format(tile_id, pattern))

8 changes: 5 additions & 3 deletions analyses/mp_create_supplementary_outputs.py
@@ -32,8 +32,10 @@ def mp_create_supplementary_outputs(sensit_type, tile_id_list, run_date = None):

os.chdir(cn.docker_base_dir)

tile_id_list_outer = tile_id_list

# If a full model run is specified, the correct set of tiles for the particular script is listed
if tile_id_list == 'all':
if tile_id_list_outer == 'all':
# List of tiles to run in the model
tile_id_list_outer = uu.tile_list_s3(cn.net_flux_dir, sensit_type)

@@ -43,7 +45,7 @@ def mp_create_supplementary_outputs(sensit_type, tile_id_list, run_date = None):

# Files to download for this script
download_dict = {
cn.cumul_gain_AGCO2_BGCO2_all_types_dir: [cn.pattern_cumul_gain_AGCO2_BGCO2_all_types],
# cn.cumul_gain_AGCO2_BGCO2_all_types_dir: [cn.pattern_cumul_gain_AGCO2_BGCO2_all_types],
cn.gross_emis_all_gases_all_drivers_biomass_soil_dir: [cn.pattern_gross_emis_all_gases_all_drivers_biomass_soil],
cn.net_flux_dir: [cn.pattern_net_flux]
}
@@ -107,7 +109,7 @@ def mp_create_supplementary_outputs(sensit_type, tile_id_list, run_date = None):
# List of tiles to run in the model
tile_id_list_input = uu.tile_list_s3(input_dir, sensit_type)
else:
tile_id_list_input = tile_id_list
tile_id_list_input = tile_id_list_outer

uu.print_log(tile_id_list_input)
uu.print_log("There are {} tiles to process".format(str(len(tile_id_list_input))) + "\n")
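
Reduced to a sketch, the fix above keeps the caller's tile_id_list intact and resolves 'all' into a separate outer variable, so each output type can still build its own input list. Everything below other than the variable names tile_id_list_outer / tile_id_list_input is a placeholder:

    def tiles_in_dir(s3_dir):
        # Stand-in for the s3 tile-listing utility
        return ['00N_110E']

    def create_supplementary_outputs(tile_id_list, input_dirs):
        tile_id_list_outer = tile_id_list      # don't overwrite the argument
        if tile_id_list_outer == 'all':
            tile_id_list_outer = tiles_in_dir('net_flux')

        for input_dir in input_dirs:
            if tile_id_list == 'all':
                # Each output type derives its own list from its own input directory
                tile_id_list_input = tiles_in_dir(input_dir)
            else:
                tile_id_list_input = tile_id_list_outer
            print(input_dir, tile_id_list_input)

    create_supplementary_outputs('all', ['gross_emissions', 'net_flux'])
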
8 changes: 4 additions & 4 deletions analyses/net_flux.py
@@ -29,19 +29,19 @@ def net_calc(tile_id, pattern, sensit_type):
kwargs = removals_src.meta
# Grabs the windows of the tile (stripes) so we can iterate over the entire tif without running out of memory
windows = removals_src.block_windows(1)
uu.print_log(" Gross removals tile {} found".format(removals_in))
uu.print_log(" Gross removals tile found for {}".format(removals_in))
except:
uu.print_log(" No gross removals tile {} found".format(removals_in))
uu.print_log(" No gross removals tile found for {}".format(removals_in))

try:
emissions_src = rasterio.open(emissions_in)
# Grabs metadata about the tif, like its location/projection/cellsize
kwargs = emissions_src.meta
# Grabs the windows of the tile (stripes) so we can iterate over the entire tif without running out of memory
windows = emissions_src.block_windows(1)
uu.print_log(" Gross emissions tile {} found".format(emissions_in))
uu.print_log(" Gross emissions tile found for {}".format(emissions_in))
except:
uu.print_log(" No gross emissions tile {} found".format(emissions_in))
uu.print_log(" No gross emissions tile found for {}".format(emissions_in))

# Skips the tile if there is neither a gross emissions nor a gross removals tile.
# This should only occur for biomass_swap sensitivity analysis, which gets its net flux tile list from
4 changes: 2 additions & 2 deletions burn_date/hansen_burnyear_final.py
@@ -23,7 +23,7 @@ def hansen_burnyear(tile_id):
# once metadata tags have been added.
out_tile_no_tag = '{0}_{1}_no_tag.tif'.format(tile_id, cn.pattern_burn_year)
out_tile = '{0}_{1}.tif'.format(tile_id, cn.pattern_burn_year)
loss = '{0}_{1}.tif'.format(cn.pattern_loss, tile_id)
loss = '{0}.tif'.format(tile_id)

# Does not continue processing tile if no loss (because there will not be any output)
if not os.path.exists(loss):
@@ -145,7 +145,7 @@ def hansen_burnyear(tile_id):
out_tile_tagged.update_tags(
units='year (2001, 2002, 2003...)')
out_tile_tagged.update_tags(
source='MODIS collection 6 burned area')
source='MODIS collection 6 burned area, https://modis-fire.umd.edu/files/MODIS_C6_BA_User_Guide_1.3.pdf')
out_tile_tagged.update_tags(
extent='global')

87 changes: 52 additions & 35 deletions burn_date/mp_burn_year.py
@@ -1,28 +1,23 @@
'''
Creates tiles of when tree cover loss coincides with burning.
There are kind of four steps to this: 1) acquire raw hdfs from MODIS burned area ftp; 2) make tifs of burned area for
each year in each MODIS h-v tile; 3) make annual Hansen tiles of burned area; 4) make tiles of where TCL and burning
coincided (same year or with 1 year lag).
Creates tiles of when tree cover loss coincides with burning or preceded burning by one year.
There are four steps to this: 1) acquire raw hdfs from MODIS burned area sftp; 2) make tifs of burned area for
each year in each MODIS h-v tile; 3) make annual Hansen-style (extent, res, etc.) tiles of burned area;
4) make tiles of where TCL and burning coincided (same year or with 1 year lag).
To update this, steps 1-3 can be run on only the latest year of MODIS burned area product. Only step 4 needs to be run
on the entire time series. That is, steps 1-3 operate on burned area products separately for each year, so adding
another year of data won't change steps 1-3 for preceding years.
When I ran this for the model v1.2.0 update, I ran it step by step, so I've never run this all in one go and don't know
if there are issues with doing that (storage, path names, etc.). However, any issues like that should be easy enough
to fix now that this is consolidated into one master script.
Step 4 takes many hours to run, mostly because it only uses five processors since each one requires so much memory.
The other three steps can also take a few hours, I believe. Point is-- updating burned area takes a while.
This is still basically as Sam Gibbes wrote it in early 2018, with file name changes and other cosmetic changes
by David Gibbs. The real processing code is still all by Sam.
NOTE: The step in which hdf files are downloaded from the MODIS burned area site using wget (step 1) requires
NOTE: The step in which hdf files are opened and converted to tifs (step 2) requires
osgeo/gdal:ubuntu-full-X.X.X Docker image. The 'small' Docker image doesn't have an hdf driver in gdal, so it can't read
the hdf files on the ftp site. The rest of the burned area analysis can be done with a 'small' version of the Docker image
(though that would require terminating the Docker container and restarting it, which would only make sense if the
analysis was being continued later).
Step 4 takes many hours to run, mostly because it only uses five processors since each one requires so much memory.
The other steps might take an hour or two to run.
This is still basically as Sam Gibbes wrote it in early 2018, with file name changes and other input/output changes
by David Gibbs. The real processing code is still all Sam's.
'''

import multiprocessing
@@ -61,19 +56,10 @@ def mp_burn_year(tile_id_list, run_date = None):
output_dir_list = [cn.burn_year_dir]
output_pattern_list = [cn.pattern_burn_year]

# Step 1:
# Downloads the latest year of raw burn area hdfs to the spot machine.
# This step requires using osgeo/gdal:ubuntu-full-X.X.X Docker image because the small image doesn't have an
# hdf driver in gdal.
file_name = "*.hdf"
raw_source = '{0}/20{1}'.format(cn.burn_area_raw_ftp, cn.loss_years)
cmd = ['wget', '-r', '--ftp-user=user', '--ftp-password=burnt_data', '--accept', file_name]
cmd += ['--no-directories', '--no-parent', raw_source]
uu.log_subprocess_output_full(cmd)

# Uploads the latest year of raw burn area hdfs to s3
cmd = ['aws', 's3', 'cp', '.', cn.burn_year_hdf_raw_dir, '--recursive', '--exclude', '*', '--include', '*hdf']
uu.log_subprocess_output_full(cmd)
# A date can optionally be provided.
# This replaces the date in constants_and_names.
if run_date is not None:
output_dir_list = uu.replace_output_dir_date(output_dir_list, run_date)

global_grid_hv = ["h00v08", "h00v09", "h00v10", "h01v07", "h01v08", "h01v09", "h01v10", "h01v11", "h02v06",
"h02v08", "h02v09", "h02v10", "h02v11", "h03v06", "h03v07", "h03v09", "h03v10", "h03v11",
@@ -106,8 +92,34 @@ def mp_burn_year(tile_id_list, run_date = None):
"h32v11", "h32v12", "h33v07", "h33v08", "h33v09", "h33v10", "h33v11", "h34v07", "h34v08",
"h34v09", "h34v10", "h35v08", "h35v09", "h35v10"]


# Step 1: download hdf files for relevant year(s) from sftp site.
# This only needs to be done for the most recent year of data.

'''
Downloading the hdf files from the sftp burned area site is done outside the script in the sftp shell on the command line.
This will download all the 2020 hdfs to the spot machine. It will take a few minutes before the first
hdf is downloaded but then it should go quickly.
Change 2020 to other year for future years of downloads.
https://modis-fire.umd.edu/files/MODIS_C6_BA_User_Guide_1.3.pdf, page 24, section 4.1.3
sftp [email protected]
[For password] burnt
cd data/MODIS/C6/MCD64A1/HDF
ls [to check that it's the folder with all the tile folders]
get h??v??/MCD64A1.A2020*
bye //exits the sftp shell
'''

# Uploads the latest year of raw burn area hdfs to s3.
# All hdfs go in this folder
cmd = ['aws', 's3', 'cp', '{0}/burn_date/'.format(cn.docker_app), cn.burn_year_hdf_raw_dir, '--recursive', '--exclude', '*', '--include', '*hdf']
uu.log_subprocess_output_full(cmd)


# Step 2:
# Makes burned area rasters for each year for each MODIS horizontal-vertical tile
# Makes burned area rasters for each year for each MODIS horizontal-vertical tile.
# This only needs to be done for the most recent year of data (set in stack_ba_hv).
uu.print_log("Stacking hdf into MODIS burned area tifs by year and MODIS hv tile...")

count = multiprocessing.cpu_count()
@@ -116,17 +128,20 @@ def mp_burn_year(tile_id_list, run_date = None):
pool.close()
pool.join()

# For single processor use
for hv_tile in global_grid_hv:
stack_ba_hv.stack_ba_hv(hv_tile)
# # For single processor use
# for hv_tile in global_grid_hv:
# stack_ba_hv.stack_ba_hv(hv_tile)


# Step 3:
# Creates a 10x10 degree wgs 84 tile of .00025 res burned year.
# Downloads all MODIS hv tiles from s3,
# makes a mosaic for each year, and warps to Hansen extent.
# Range is inclusive at lower end and exclusive at upper end (e.g., 2001, 2020 goes from 2001 to 2019)
for year in range(2019, 2020):
# Range is inclusive at lower end and exclusive at upper end (e.g., 2001, 2021 goes from 2001 to 2020).
# This only needs to be done for the most recent year of data.
# NOTE: The first time I ran this for the 2020 TCL update, I got an error about uploading the log to s3
# after most of the tiles were processed. I didn't know why it happened, so I reran the step and it went fine.
for year in range(2020, 2021):

uu.print_log("Processing", year)

@@ -188,13 +203,15 @@ def mp_burn_year(tile_id_list, run_date = None):

# Step 4:
# Creates a single Hansen tile covering all years that represents where burning coincided with tree cover loss
# or preceded TCL by one year.
# This needs to be done on all years each time burned area is updated.

# Downloads the loss tiles
uu.s3_folder_download(cn.loss_dir, '.', 'std', cn.pattern_loss)

uu.print_log("Extracting burn year data that coincides with tree cover loss...")

# Downloads the 10x10 deg burn year tiles (1 for each year in which there was burned areaa), stack and evaluate
# Downloads the 10x10 deg burn year tiles (1 for each year in which there was burned area), stack and evaluate
# to return burn year values on hansen loss pixels within 1 year of loss date
if cn.count == 96:
processes = 5
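
For reference, a bare-bones sketch of the step 3 mosaic-and-warp described in the comments above (one year, one 10x10 degree tile). The file names, extent, and creation options are illustrative, not the exact gdal calls in the repo:

    import glob
    import subprocess

    year = 2020
    modis_tifs = glob.glob('ba_*_{}.tif'.format(year))   # hypothetical per-hv-tile tif names

    # Mosaic all MODIS hv tifs for the year into a virtual raster
    subprocess.check_call(['gdalbuildvrt', 'burn_{}.vrt'.format(year)] + modis_tifs)

    # Warp one Hansen-style 10x10 degree, 0.00025 degree resolution tile out of the mosaic
    subprocess.check_call([
        'gdalwarp', '-t_srs', 'EPSG:4326',
        '-te', '110', '-10', '120', '0',                  # 00N_110E extent, as an example
        '-tr', '0.00025', '0.00025',
        '-co', 'COMPRESS=LZW',
        'burn_{}.vrt'.format(year), 'ba_{}_00N_110E.tif'.format(year)])
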
3 changes: 1 addition & 2 deletions burn_date/stack_ba_hv.py
@@ -11,7 +11,7 @@

def stack_ba_hv(hv_tile):

for year in range(2019, 2020): # End year is not included in burn year product
for year in range(2020, 2021): # End year is not included in burn year product

# Download hdf files from s3 into folders by h and v
output_dir = utilities.makedir('{0}/{1}/raw/'.format(hv_tile, year))
@@ -23,7 +23,6 @@ def stack_ba_hv(hv_tile):
if len(hdf_files) > 0:
array_list = []
for hdf in hdf_files:
uu.print_log("converting hdf to array")
array = utilities.hdf_to_array(hdf)
array_list.append(array)

2 changes: 1 addition & 1 deletion burn_date/utilities.py
@@ -130,7 +130,7 @@ def makedir(dir):


def download_df(year, hv_tile, output_dir):
include = '*A{0}*{1}*'.format(year, hv_tile)
include = 'MCD64A1.A{0}*{1}*'.format(year, hv_tile)
cmd = ['aws', 's3', 'cp', cn.burn_year_hdf_raw_dir, output_dir, '--recursive', '--exclude',
"*", '--include', include]

23 changes: 11 additions & 12 deletions carbon_pools/create_carbon_pools.py
@@ -70,10 +70,10 @@ def create_AGC(tile_id, sensit_type, carbon_pool_extent):
loss_year = '{}_{}.tif'.format(tile_id, cn.pattern_Mekong_loss_processed)
else:
uu.print_log(" Hansen loss tile found for {}".format(tile_id))
loss_year = '{0}_{1}.tif'.format(cn.pattern_loss, tile_id)
loss_year = '{0}.tif'.format(tile_id)

# This input should exist
removal_forest_type_src = rasterio.open(removal_forest_type)
# This input is required to exist
loss_year_src = rasterio.open(loss_year)

# Opens the input tiles if they exist
try:
@@ -107,17 +107,17 @@ def create_AGC(tile_id, sensit_type, carbon_pool_extent):
uu.print_log(" No gain tile found for", tile_id)

try:
loss_year_src = rasterio.open(loss_year)
uu.print_log(" Loss tile found for", tile_id)
removal_forest_type_src = rasterio.open(removal_forest_type)
uu.print_log(" Removal type tile found for", tile_id)
except:
uu.print_log(" No loss tile found for", tile_id)
uu.print_log(" No removal type tile found for", tile_id)


# Grabs the windows of a tile to iterate over the entire tif without running out of memory
windows = removal_forest_type_src.block_windows(1)
windows = loss_year_src.block_windows(1)

# Grabs metadata for one of the input tiles, like its location/projection/cellsize
kwargs = removal_forest_type_src.meta
kwargs = loss_year_src.meta

# Updates kwargs for the output dataset.
# Need to update data type to float 32 so that it can handle fractional carbon
@@ -129,7 +129,6 @@ def create_AGC(tile_id, sensit_type, carbon_pool_extent):
dtype='float32'
)


# The output files: aboveground carbon density in 2000 and in the year of loss. Creates names and rasters to write to.
if '2000' in carbon_pool_extent:
output_pattern_list = [cn.pattern_AGC_2000]
@@ -167,7 +166,7 @@ def create_AGC(tile_id, sensit_type, carbon_pool_extent):
for idx, window in windows:

# Reads the input tiles' windows. For windows from tiles that may not exist, an array of all 0s is created.
removal_forest_type_window = removal_forest_type_src.read(1, window=window)
loss_year_window = loss_year_src.read(1, window=window)
try:
annual_gain_AGC_window = annual_gain_AGC_src.read(1, window=window)
except:
@@ -177,9 +176,9 @@ def create_AGC(tile_id, sensit_type, carbon_pool_extent):
except:
cumul_gain_AGCO2_window = np.zeros((window.height, window.width), dtype='float32')
try:
loss_year_window = loss_year_src.read(1, window=window)
removal_forest_type_window = removal_forest_type_src.read(1, window=window)
except:
loss_year_window = np.zeros((window.height, window.width), dtype='uint8')
removal_forest_type_window = np.zeros((window.height, window.width), dtype='uint8')
try:
gain_window = gain_src.read(1, window=window)
except:
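
The general pattern behind the create_AGC changes above, as a standalone sketch: the input that must exist (now the loss-year tile) drives the window iteration and supplies the metadata, while inputs that may be absent fall back to an all-zero window. The paths, dtypes, and the multiplication at the end are illustrative, not the repo's actual calculation.

    import numpy as np
    import rasterio

    def create_output(required_path, optional_path, output_path):
        # The tile that must exist drives the iteration and supplies the metadata
        required_src = rasterio.open(required_path)
        windows = required_src.block_windows(1)
        kwargs = required_src.meta
        kwargs.update(driver='GTiff', count=1, compress='lzw', nodata=0, dtype='float32')

        # The optional tile is opened if present; otherwise its windows are all zeros
        try:
            optional_src = rasterio.open(optional_path)
        except Exception:
            optional_src = None

        with rasterio.open(output_path, 'w', **kwargs) as dst:
            for idx, window in windows:
                required_window = required_src.read(1, window=window)
                if optional_src is not None:
                    optional_window = optional_src.read(1, window=window)
                else:
                    optional_window = np.zeros((window.height, window.width), dtype='float32')
                dst.write_band(1, (required_window * optional_window).astype('float32'), window=window)
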
(Changes to the remaining files are not shown.)
