Hyperspectral 2018 Level_1 data generation #591
Moving this to a separate comment because it addresses the specific question:
Yes! The NCO command for subsetting the file to the first 939 wavelengths uses the `ncks` operator:

`ncks -O -d wavelength,0,938 cst_cnv_trg.nc foo.nc`

This creates a new file called `foo.nc` that contains only 939 wavelengths. To explain each part: `ncks` is the NCO subsetting operator, `-O` overwrites the output file if it already exists, and `-d wavelength,0,938` extracts the hyperslab of indices 0 through 938 (inclusive) along the `wavelength` dimension.

To check that the output file has the correct dimensions, dump the metadata header and print just the dimension definitions:

`ncdump -h foo.nc | grep -A 5 dimensions`

Then dump out the actual values of `wavelength` along the wavelength dimension:

`ncdump -v wavelength foo.nc`

Note that the `wavelength` variable is both a dimension with integer indices 0:938 (one dimension of the hyperspectral data cube) and a variable holding the wavelength values, in nm, divided into 939 intervals over the range [389, 1000].
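Putting those steps together, a minimal shell sketch using the file names above (the expected `wavelength = 939 ;` line is an assumption based on standard `ncdump` CDL output):

```bash
# Subset the calibration file to the first 939 wavelengths (indices 0..938).
ncks -O -d wavelength,0,938 cst_cnv_trg.nc foo.nc

# Check the dimension definitions; the output should include a line like
#   wavelength = 939 ;
ncdump -h foo.nc | grep -A 5 dimensions

# Print the actual wavelength values (in nm) stored along that dimension.
ncdump -v wavelength foo.nc
```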
Added the following code:

...however, it didn't seem to resolve the error:

Will need someone's help with this.
SWIR 2017 error:
SWIR 2018 error:
New PR here that is still being finalized, the result of our first two hackathons: https://github.com/terraref/extractors-hyperspectral/pull/50/files. It should be able to support the new VNIR camera and both the new and old SWIR cameras.
We need Level_1 outputs from 2018, generated with the existing code, in order to recalibrate the new algorithm.
It seems the cst_cnv_trg.nc file has trouble with the new band count. The new algorithm doesn't use this file, but we need to run the existing pipeline on the 2018 data and then recalibrate based on that output.
Is it possible to avoid relying on this file, or to generate an updated version with a different number of bands?
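In the meantime, a minimal shell sketch of the workaround discussed above, i.e. trimming cst_cnv_trg.nc to match the band count of a 2018 cube with NCO. The file names `raw_2018_cube.nc` and `cst_cnv_trg_2018.nc` are placeholders for illustration, and the 939-band count is carried over from the earlier comment:

```bash
# Compare the wavelength dimension in the calibration file and in a 2018 cube
# (raw_2018_cube.nc is a placeholder name for an actual 2018 input file).
ncdump -h cst_cnv_trg.nc   | grep -A 5 dimensions
ncdump -h raw_2018_cube.nc | grep -A 5 dimensions

# If the 2018 cube has fewer bands (e.g. 939), trim the calibration file to match.
# -O overwrites the output; -d wavelength,0,938 keeps indices 0..938 inclusive.
ncks -O -d wavelength,0,938 cst_cnv_trg.nc cst_cnv_trg_2018.nc
```

This only helps if the 2018 bands are a subset of the wavelengths already in cst_cnv_trg.nc; if the new camera reports wavelengths outside that range, the file would need to be regenerated rather than subset.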