
Model top pressure < 0 in cpld_bmark_p8 test write component output #2484

Open · bug (Something isn't working)

LarissaReames-NOAA opened this issue Oct 29, 2024 · 12 comments

@LarissaReames-NOAA (Collaborator) commented Oct 29, 2024

Description

Model top pressure is less than 0 in regions near the North Pole (cubed sphere grid tile 3) in write component output from the cpld_bmark_p8 regression test when using either grid_type="gaussian_grid" or "cubed_sphere_grid". You can check this by summing dpres over the depth of the model data and seeing that the sum exceeds surface pressure, i.e., the implied model top pressure pressfc - sum(dpres) is negative. When quilting is turned off, data on the native FV3 grid is normal, with pressfc = sum(dpres) + 1 Pa. This causes issues if one tries to create ICs for another run using chgres_cube; ICs created from restart files from the same forecast at the same time are normal. Here's an example of what that unusable output looks like for an array (they all display this pattern) in out.atm.tile3.nc on the top model level:

[screenshot: top model level of a field in out.atm.tile3.nc, showing the anomalous pattern near the North Pole]

Hacking in a fix in chgres_cube to force model top pressure > 0 solves this issue and the output looks normal.

I have looked at the baseline files with python (by comparing sum(dpres) to surface pressure at various points in this region) and see this same pattern even in the oldest baseline data we have on Hera. I have not looked at baseline data on other platforms.

To Reproduce:

  1. Run cpld_bmark_p8_intel, or look at the baseline data for this RT, with the write component on and a grid type of either "gaussian_grid" or "cubed_sphere_grid".
  2. Compare surface pressure to sum(dpres) at any point inside the offending region (for example, point 216,216 on tile 3 or point 37,917 on the Gaussian grid).
  3. Note that sum(dpres) > pressfc (see the Python sketch below).
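
A minimal sketch of the check in step 3, assuming xarray is available and that the write-component file uses the usual UFS dyn-file variable and dimension names (dpres, pressfc, pfull, grid_yt, grid_xt); the file name and point indexing here are illustrative:

```python
import xarray as xr

# Hypothetical file name; use any dynf00*.tile3.nc from the run directory.
ds = xr.open_dataset("dynf006.tile3.nc")

# A point inside the offending region (adjust for 0- vs 1-based indexing).
pt = dict(time=0, grid_yt=215, grid_xt=215)

dpsum = ds["dpres"].isel(**pt).sum("pfull").item()
psfc = ds["pressfc"].isel(**pt).item()

print(f"sum(dpres) = {dpsum:.3f} Pa, pressfc = {psfc:.3f} Pa")
print(f"implied model top pressure = {psfc - dpsum:.3f} Pa")  # negative in the bad region
```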

cc @GeorgeGayno-NOAA to stay in the loop

@LarissaReames-NOAA added the bug label Oct 29, 2024
@LarissaReames-NOAA (Collaborator, author):

Tagging @DusanJovic-NOAA to get his eyes on this.

@DeniseWorthen (Collaborator):

@LarissaReames-NOAA I assume this also happens on Tile 6?

@yangfanglin (Collaborator) commented Oct 29, 2024

Please note that for the FV3 non-hydrostatic core, the relation pressfc = sum(dpres) + 1 Pa is not always satisfied, and this is expected. This behavior was discussed and communicated with GFDL in the past. When running CHGRES, layer pressures should be recomputed as P(k,i,j) = a(k) + b(k)*pressfc(i,j), where a(k) and b(k) are predetermined constants.
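
A minimal sketch of that recomputation, assuming ak and bk are the hybrid-coordinate interface coefficients (length npz+1, ak in Pa, bk dimensionless) read from the input file; names and shapes here are illustrative:

```python
import numpy as np

def hybrid_pressure(ak, bk, pressfc):
    """Interface pressure p(k,i,j) = ak(k) + bk(k) * pressfc(i,j);
    layer-mean pressure approximated as the average of adjacent interfaces."""
    p_iface = ak[:, None, None] + bk[:, None, None] * pressfc[None, :, :]
    p_layer = 0.5 * (p_iface[:-1] + p_iface[1:])
    return p_iface, p_layer
```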

@LarissaReames-NOAA (Collaborator, author):

> @LarissaReames-NOAA I assume this also happens on Tile 6?

You'd think so but not that I can tell.

@junwang-noaa (Collaborator):

Regarding the result difference (cubed sphere vs. Gaussian grid): the model uses bilinear interpolation to compute dpres from the native grid to the Gaussian grid, which might cause a different sum(dpres).

@LarissaReames-NOAA (Collaborator, author):

> Regarding the result difference (cubed sphere vs. Gaussian grid): the model uses bilinear interpolation to compute dpres from the native grid to the Gaussian grid, which might cause a different sum(dpres).

Also to be clear, I found this issue in both Gaussian and cubed sphere write component data. So it's not directly tied to one grid type or another.

@DusanJovic-NOAA (Collaborator):

Can you use actual pressure, either 'hydrostatic pressure' (pfhy) for hydrostatic configuration or 'non-hydrostatic pressure' (pfnh) for non-hydrostatic configuration, instead of using 'dpres' and recomputing pressure from it?
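
If those fields are written, the 3D pressure can be read directly rather than rebuilt from dpres; a sketch, assuming the output variable names match the short names quoted above:

```python
import xarray as xr

ds = xr.open_dataset("dynf006.tile3.nc")   # hypothetical file name, as above
name = "pfnh" if "pfnh" in ds else "pfhy"  # non-hydrostatic vs. hydrostatic run
p_layer = ds[name]  # 3D layer pressure; no reconstruction from dpres needed
```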

@junwang-noaa (Collaborator):

> Also to be clear, I found this issue in both Gaussian and cubed sphere write component data. So it's not directly tied to one grid type or another.

This is something new. The cubed sphere grid output files from the write component should be very close to the history files written by fms. A small difference could be that the dycore writes out real8 while the write grid component outputs real4. Another difference is that the surface pressure is converted to a "pseudo pressure", (ps/stndrd_atmos_ps)**(rdgas*stndrd_atmos_lapse/grav), and converted back to surface pressure on the write grid component, pseudo_ps**(grav/(rdgas*stndrd_atmos_lapse))*stndrd_atmos_ps. The "pseudo pressure" is used for the bilinear interpolation to the Gaussian grid; for the native grid it is purely a conversion.
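
A minimal sketch of that round trip; the constant names mirror the comment (rdgas, grav, stndrd_atmos_ps, stndrd_atmos_lapse), but the exact values and symbols in the write component source may differ:

```python
rdgas, grav = 287.05, 9.80665   # J/(kg K), m/s^2 (assumed values)
stndrd_atmos_ps = 101325.0      # Pa (assumed)
stndrd_atmos_lapse = 0.0065     # K/m (assumed)

def to_pseudo(ps):
    return (ps / stndrd_atmos_ps) ** (rdgas * stndrd_atmos_lapse / grav)

def from_pseudo(pseudo_ps):
    return pseudo_ps ** (grav / (rdgas * stndrd_atmos_lapse)) * stndrd_atmos_ps

# Analytically the round trip is exact; in real4 the two exponentiations can
# lose enough precision to shift ps by O(1 Pa), one candidate for the small
# pressfc differences discussed below.
print(from_pseudo(to_pseudo(103647.414)))
```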

Can you provide your run directories with cubed sphere history files from write component and from fms? We can check what has been changed.

@LarissaReames-NOAA (Collaborator, author) commented Oct 30, 2024

> > Also to be clear, I found this issue in both Gaussian and cubed sphere write component data. So it's not directly tied to one grid type or another.
>
> This is something new. The cubed sphere grid output files from the write component should be very close to the history files written by fms. A small difference could be that the dycore writes out real8 while the write grid component outputs real4. Another difference is that the surface pressure is converted to a "pseudo pressure", (ps/stndrd_atmos_ps)**(rdgas*stndrd_atmos_lapse/grav), and converted back to surface pressure on the write grid component, pseudo_ps**(grav/(rdgas*stndrd_atmos_lapse))*stndrd_atmos_ps. The "pseudo pressure" is used for the bilinear interpolation to the Gaussian grid; for the native grid it is purely a conversion.
>
> Can you provide your run directories with cubed sphere history files from write component and from fms? We can check what has been changed.

I've updated some of the description above to make this clearer. Here are the two directories and the relevant files/points to check:

cubed_sphere_grid (write component)
directory: /scratch1/NCEPDEV/stmp2/Larissa.Reames/FV3_RT/rt_4001664/cpld_bmark_p8_intel/
file to examine: any of dynf00*.tile3.nc
example point with negative model top pressure: (218,218)
pressfc at this point: 103645.914 Pa
dp text file: dp.txt
dp sum at point (218,218): 103646.414 Pa

FMS native output
directory: /scratch1/NCEPDEV/stmp2/Larissa.Reames/FV3_RT/rt_32680/cpld_bmark_p8_intel/
file to examine: fv3_history.tile3.nc
same example point, for comparison: (218,218)
pressfc at this point: 103647.414 Pa
dp text file: dp2.txt
dp sum at point (218,218): 103646.414 Pa

For both forecasts I'm using the default regression test settings aside from the write component grid:

Forecast date: 00Z 1 April 2013
Forecast length: 6 hours
Output frequency: 3 hours

@GeorgeGayno-NOAA (Contributor):

Here is how chgres_cube computes the 3D pressure for the following input files:

Let me know if these methods could be improved.

@yangfanglin (Collaborator):

I checked my notes from back in 2017 and found the following statements.

By SJ Lin: "The surface pressure is the sum of delp(1:npz) plus ptop. Therefore it is total mass -- hydrostatic pressure, with all condensates counted. Because of the moisture changes within each Lagrangian layer after the physics, the pressures as computed by ak and bk are not exact (a good approximation, though)."

And by Linjiong Zhou: "The most accurate method to calculate pressure at the model levels is to use delp and sum it from the model top. The same method applies to height at the model levels, that is, sum delz from the surface."

Therefore, what Jun pointed out is correct. We should not use ak and bk to recalculate pres(k).
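
A minimal sketch of the quoted method, assuming ptop and a delp array with the vertical dimension first; the height calculation is analogous, accumulating delz upward from the surface instead:

```python
import numpy as np

def interface_pressure(ptop, delp):
    """p(0) = ptop at the model top; p(k) = ptop + sum(delp(0:k)) going down.
    The bottom interface should recover pressfc."""
    zeros = np.zeros((1,) + delp.shape[1:])
    return ptop + np.concatenate([zeros, np.cumsum(delp, axis=0)], axis=0)
```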

@junwang-noaa (Collaborator) commented Oct 30, 2024

The inconsistency of ps between the native-grid output from the write grid component and from fms might be a precision error caused by the round-trip conversion of ps through the hypsometric (pseudo-pressure) relation.
