
Definition of output for 1950-control and 1950-historical simulations #2

tsemmler05 opened this issue Sep 7, 2022 · 15 comments

@tsemmler05 (Owner)

In the 1950-spinup simulation we have high-frequency 2D and 3D OpenIFS output (various 6-hourly parameters) and low-frequency 2D and 3D FESOM2 output (monthly; for u, v, and w only yearly). This is fine for the spin-up simulation, but the FESOM2 output needs to be refined for the 1950-control and 1950-historical simulations. At the moment the output performance on aleph for FESOM2 is poor: daily 2D and 3D FESOM2 output adds around 50% to the computing time. We need to try to fix this. If that is not possible, we can alternatively stick with the low-frequency FESOM2 output and rerun periods of interest at a later stage.

FESOM2 output is written at every second level in the 1950-spinup simulation. However, this might not be a good idea for the 1950-control and 1950-historical simulations, because the calculation of transports in particular may be compromised. An alternative suggestion is to keep the 3D monthly output for the entire depth of the ocean but write 3D daily output only for the uppermost 500 m (or another depth to be determined).

Regarding the OpenIFS output, there is the option to write compressed output through XIOS to avoid filling the disks too quickly.
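For reference, a minimal sketch of what such a compressed XIOS file definition could look like. The `compression_level` attribute is the one documented in the XIOS user guide; the file id, name, output frequency, and field are made up here for illustration and are not from our actual configuration:

```xml
<!-- Hedged sketch of an XIOS file_definition with lossless (deflate)
     compression enabled; ids, name, and output_freq are illustrative. -->
<file_definition type="one_file">
  <file id="oifs_6h" name="oifs_6h_output" output_freq="6h"
        compression_level="2">
    <!-- fields to write would be listed here, e.g. -->
    <!-- <field field_ref="t2m" /> -->
  </file>
</file_definition>
```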

@cfranzke72

Is there a disadvantage to compressed OpenIFS output? Would we lose some information?

@tsemmler05 (Owner, Author)

No; as long as the compressed OpenIFS output also works without (or with only a little) delay in the higher-resolution set-up (TCO319), there is no disadvantage and no information loss.

@cfranzke72

Okay, then we probably should do this.

@christian-stepanek

Yes, I think one should watch the model performance carefully when compressing high-resolution data. At very low resolution the compression somehow fits into the waiting time for other things to finish; let's hope this is also the case at very high resolution. Yes, the compression is lossless, and for some of the files that I had at hand (low resolution, again) I did not find a difference in processing time between compressed and uncompressed versions. Again, this could be checked separately for very high-resolution output, to avoid making the data processing for analysis unnecessarily difficult.

@tsemmler05 (Owner, Author)

This is the output list that is available from the 1950-spinup run. Changes from the originally desired output list are marked in orange (I had to cut the FESOM list down because of the output performance problems, and slightly extended the OpenIFS list).

awi.outputs_variables_iccp_spinup.xlsx

@tsemmler05 (Owner, Author)

This is the output list that I suggest for the 1950-control and 1950-historical simulations. Changes compared to the 1950-spinup run are marked in red. Changes only exist for FESOM2; for OpenIFS I left everything the same. Please note that the plan is to output every FESOM2 level, not only every second level as in the 1950-spinup run. Please look at the output list carefully and comment.

awi.outputs_variables_iccp_controlhist.xlsx

@tsemmler05 (Owner, Author)

Qiang suggests the following monthly output:

1. The energy diagnostics need to be switched on in the namelist (as monthly means).
2. It would also be better to save the tracer fluxes (a = T and S), i.e. a*(u,v,w), together with the a, u, v, w that are exactly the ones used in the flux calculations (monthly means).
3. It would also be useful to save variables for studying eddy wind/ice killing: \tau*(u,v), and even \tau_io*(u,v) and \tau_ao*(u,v) (monthly means).

@tsemmler05 (Owner, Author)

Axel suggests:

Vertical viscosity is not needed.
Monthly output of the meridional stream function. @koldunovn: is it easy to activate this in FESOM2?

@koldunovn

@tsemmler05 We need w (and only w :)) to compute the MOC/AMOC and so on. We don't compute the MOC in the model, but we can save w at monthly frequency.

@tsemmler05 (Owner, Author)

@koldunovn: how can I specify u in the uppermost 500 m? Would I just use u_0_500 as a parameter in namelist.io? And how can I specify the products that Qiang suggests in namelist.io?

@tsemmler05 (Owner, Author)

Tests regarding compressed OpenIFS output: one year with compression level 6 takes 7 hours 30 minutes; one year without compression takes 3 hours 35 minutes. That is too much of a delay. The reduction in disk usage is from 868 GB to 472 GB. I suggest doing the compression as a postprocessing step instead.
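For reference, the overhead and savings implied by these figures (plain shell arithmetic on the numbers quoted above; nothing model-specific):

```shell
# Timings and sizes are the ones quoted in the comment above.
with_min=$((7 * 60 + 30))     # 450 min, with inline zip_6 compression
without_min=$((3 * 60 + 35))  # 215 min, without compression
overhead=$(( (with_min - without_min) * 100 / without_min ))
saved=$(( (868 - 472) * 100 / 868 ))
echo "runtime overhead: ${overhead}%"   # prints: runtime overhead: 109%
echo "disk space saved: ${saved}%"      # prints: disk space saved: 45%
```

So inline compression roughly doubles the runtime while saving a bit under half the disk space, which supports moving it out of the model run.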

@christian-stepanek

Wow, it is astonishing that it slows down the simulation so much, but you do have very high resolution. How do you plan to do the compression during postprocessing? A simple `cdo -f nc4c -z zip_6 copy uncompressed.nc compressed.nc` should work, but this can really take some time. The good thing is that once one has tested how long the compression takes, one can submit many compression tasks in parallel in order to keep up with the actual data production.
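One possible shape for that postprocessing step, as a sketch only: the cdo invocation is the one from the command above, while the function name, output naming, and DRYRUN switch are illustrative. On a batch system, each file's compression could instead be submitted as its own job so compression keeps pace with data production.

```shell
#!/bin/sh
# Hedged sketch: compress each NetCDF file in a directory with cdo,
# one invocation per file, off the model's critical path.
# compress_dir DIR LEVEL: rewrite DIR/*.nc as nc4 with deflate LEVEL.
compress_dir() {
  dir=$1
  level=${2:-1}   # level 1 usually captures most of the size reduction
  for f in "$dir"/*.nc; do
    [ -e "$f" ] || continue           # directory may hold no .nc files
    out="${f%.nc}.zip${level}.nc"
    if [ "${DRYRUN:-0}" = 1 ]; then
      echo "cdo -f nc4c -z zip_${level} copy $f $out"
    else
      # could be backgrounded with '&' (plus a final 'wait'), or each
      # invocation submitted as its own batch job
      cdo -f nc4c -z "zip_${level}" copy "$f" "$out"
    fi
  done
}

# dry-run demonstration on a scratch directory
d=$(mktemp -d)
: > "$d/ocean_daily.nc"
DRYRUN=1 compress_dir "$d" 6
rm -r "$d"
```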

@koldunovn

> @koldunovn: how can I specify u in the uppermost 500 m, just use u_0_500 as a parameter in namelist.io? And how can I specify the products that Qiang suggests in namelist.io?

For this you have to define a new variable in io_meandata.F90; there is no such variable in the standard output.

@koldunovn

> Wow, that it slows down the simulation so much is astonishing - but you have very high resolution. How do you plan to do the compressing during postprocessing? A simple `cdo -f nc4c -z zip_6 copy uncompressed.nc compressed.nc` should work, but this can really take some time. Good thing is that if one tests how long the compressing takes one can ideally submit a lot of compression tasks in parallel in order to keep up with the actual data production.

One should check which compression level to take. Usually most of the compression is already achieved at level 1, and going to a higher level slows things down considerably without much additional benefit. At least this is true for FESOM data :)

@tsemmler05 (Owner, Author), Sep 12, 2022

> @koldunovn: how can I specify u in the uppermost 500 m, just use u_0_500 as a parameter in namelist.io? And how can I specify the products that Qiang suggests in namelist.io?
>
> For this you have to define a new variable in io_meandata.F90; there is no such variable in the standard output.

Good, and I see that products such as T*u, S*u, and S*v are already defined if ldiag_energy is activated.
