diff --git a/CCPPtechnical/source/AddingNewSchemes.rst b/CCPPtechnical/source/AddingNewSchemes.rst index 35d317c..2c282d9 100644 --- a/CCPPtechnical/source/AddingNewSchemes.rst +++ b/CCPPtechnical/source/AddingNewSchemes.rst @@ -1,5 +1,5 @@ .. _AddNewSchemes: - + **************************************** Connecting a scheme to CCPP **************************************** @@ -8,19 +8,36 @@ This chapter contains a brief description on how to add a :term:`scheme` to the .. note:: The instructions in this chapter assume the user is implementing this scheme for use with the CCPP Single-Column model (:term:`SCM`); not only is the SCM more lightweight than a full 3D NWP model for development purposes, but using the SCM as a :term:`host model` is a requirement for all new CCPP schemes for testing purposes. For implementation in another host model, especially for adding new variables, some modifications to that host model's metadata may be required; see :numref:`Chapter %s ` for details +============================== +Criteria for inclusion in CCPP +============================== + +CCPP governance, including interests from NOAA, NCAR, and developers of existing schemes, have decided on the following criteria for including new schemes in the CCPP physics repository. +Because there is some subjectivity in these items, and requirements may change over time, we encourage developers of prospective CCPP schemes to reach out via `Github discussions `_ at an early stage. + +* The scheme must be sufficiently different from schemes already in the CCPP Physics repository. +* The scheme should be either + + * desired by an organization participating in the funding of CCPP or + * the scheme’s development and/or testing is a funded project of a CCPP-sponsor organization. + +* The scheme must be compiled/run with at least one CCPP-compliant host model, and pass that host model's regression tests. +* The scheme must be documented, ideally with references to published scientific results. +* The scheme must have developer support, or at least a point-of-contact for reviewing code changes. + ============================== Preparing a scheme for CCPP ============================== There are a few steps that can be taken to prepare a scheme for addition to CCPP prior to starting the process of implementing it in the CCPP Framework: -1. Remove/refactor any incompatible features described in :numref:`Section %s `. This includes updating Fortran code to at least Fortran 90 standards, removing STOP and GOTO statements, removing common blocks, and refactoring any other disallowed features. +1. Remove/refactor any incompatible features described in :numref:`Section %s `. This includes updating Fortran code to at least Fortran 90 standards, removing ``STOP`` and ``GOTO`` statements, removing common blocks, and refactoring any other disallowed features. 2. Make an inventory of all variables that are inputs and/or outputs to the scheme. Check the file ``ccpp-framework/doc/DevelopersGuide/CCPP_VARIABLES_SCM.pdf`` to see if each variable has already been implemented in the single column model. If there are variables that are not available, see :numref:`Section %s `. ============================= Implementing a scheme in CCPP ============================= -There are, broadly speaking, two approaches for connecting an existing physics scheme to the CCPP Framework: +There are, broadly speaking, two approaches for connecting an existing physics scheme to the CCPP Framework: 1. 
Refactor the existing scheme to CCPP format standards, using ``pre_`` and ``post_`` :term:`interstitial schemes ` to interface to and from the existing scheme if necessary. 2. Create a driver scheme as an interface from the existing scheme's Fortran module to the CCPP Framework. @@ -33,11 +50,11 @@ Method 1 is the preferred method of adapting a scheme to CCPP. This involves mak While method 1 is preferred, there are cases where method 1 may not be possible: for example, in schemes that are shared with other, non-CCPP hosts, and so require specialized, model-specific drivers, and might be beholden to different coding standards required by another model. In cases such as this, method 2 may be employed. -Method 2 involves fewer changes to the original scheme's Fortran module: A CCPP-compliant driver module (see :numref:`Chapter %s `) handles defining the inputs to and outputs from the scheme module in terms of state variables, constants, and tendencies provided by the model as defined in the scheme's .meta file. The calculation of variables that are not available directly from the model, and conversion of scheme output back into the variables expected by CCPP, should be handled by interstitial schemes (``schemename_pre`` and ``schemename_post``). While this method puts most CCPP-required features in the driver and interstitial subroutines, the original scheme must still be updated to remove STOP statements, common blocks, or any other disallowed features as listed in :numref:`Section %s `. +Method 2 involves fewer changes to the original scheme's Fortran module: A CCPP-compliant driver module (see :numref:`Chapter %s `) handles defining the inputs to and outputs from the scheme module in terms of state variables, constants, and tendencies provided by the model as defined in the scheme's .meta file. The calculation of variables that are not available directly from the model, and conversion of scheme output back into the variables expected by CCPP, should be handled by interstitial schemes (``schemename_pre`` and ``schemename_post``). While this method puts most CCPP-required features in the driver and interstitial subroutines, the original scheme must still be updated to remove STOP statements, common blocks, or any other disallowed features as listed in :numref:`Section %s `. For both methods, optional interstitial schemes can be used for code that can not be handled within the scheme itself. For example, if different code needs to be run for coupling with other schemes or in different orders (e.g. because of dependencies on other schemes and/or the order the scheme is run in the :term:`SDF`), or if variables needed by the scheme must be derived from variables provided by the host. See :numref:`Chapter %s ` for more details on primary and interstitial schemes. - .. note:: Depending on the complexity of the scheme and how it works together with other schemes, multiple interstitial schemes may be necessary. + .. note:: Depending on the complexity of the scheme and how it works together with other schemes, multiple interstitial schemes may be necessary. ------------------------------ Adding new variables to CCPP @@ -61,29 +78,29 @@ For variables that can be set via namelist, the ``GFS_control_type`` Derived Dat If information from the previous timestep is needed, it is important to identify if the host model provides this information, or if it needs to be stored as a special variable. 
For example, in the Model for Prediction Across Scales (MPAS), variables containing the values of several quantities in the preceding timesteps are available. When that is not the case, as in the :term:`UFS Atmosphere`, interstitial schemes are needed to access these quantities. - .. note:: As an example, the reader is referred to the `GF convective scheme `_, which makes use of interstitials to obtain the previous timestep information. + .. note:: As an example, the reader is referred to the `Grell-Freitas convective scheme `_, which makes use of interstitials to obtain the previous timestep information. Consider allocating the new variable only when needed (i.e. when the new scheme is used and/or when a certain control flag is set). If this is a viable option, following the existing examples in ``CCPP_typedefs.F90`` and ``GFS_typedefs.meta`` for allocating the variable and setting the ``active`` attribute in the metadata correctly. ---------------------------------- Incorporating a scheme into CCPP ---------------------------------- -The new scheme and any interstitials will need to be added to the CCPP prebuild configuration file. Add the new scheme to the Python dictionary in ``ccpp-scm/ccpp/config/ccpp_prebuild_config.py`` using the same path as the existing schemes: +The new scheme and any interstitials will need to be added to the CCPP prebuild configuration file. Add the new scheme to the Python dictionary in `ccpp-scm/ccpp/config/ccpp_prebuild_config.py `__ using the same path as the existing schemes: -.. code-block:: +.. code-block:: SCHEME_FILES = [ ... '../some_relative_path/existing_scheme.F90', '../some_relative_path/new_scheme.F90', ...] - .. note:: Different host models will have different prebuild config files. For example, the :term:`UFS Atmosphere's ` config file is located at ``ufs-weather-model/FV3/ccpp/config/ccpp_prebuild_config.py`` + .. note:: Different host models will have different prebuild config files. For example, the :term:`UFS Atmosphere's ` config file is located at `ufs-weather-model/FV3/ccpp/config/ccpp_prebuild_config.py `__ -The source code and ``.meta`` files for the new scheme should be placed in the same location as existing schemes in the CCPP: in the ccpp-physics repository under the ``physics/`` directory. +The source code and ``.meta`` files for the new scheme should be placed in the same directory. Individual schemes are contained in their own subdirectory within the ccpp-physics repository under the ``physics/`` directory, optionally under a directory describing the type of physics scheme. For example, the Grell-Freitas convective scheme is located in the ccpp-physics repository at `physics/CONV/Grell_Freitas `__ -To add this new scheme to a suite definition file (:term:`SDF`) for running within a :term:`host model`, follow the examples found in ``ccpp-scm/ccpp/suites``. For more information about suites and SDFs, see :numref:`Chapter %s `. +To add this new scheme to a suite definition file (:term:`SDF`) for running within a :term:`host model`, follow the examples found in `ccpp-scm/ccpp/suites `__. For more information about suites and SDFs, see :numref:`Chapter %s `. - .. note:: For the :term:`UFS Atmosphere`, suites can be found in the ``ufs-weather-model/FV3/ccpp/suites`` directory + .. 
note:: For the :term:`UFS Atmosphere`, suites can be found in the `ufs-weather-model/FV3/ccpp/suites `__ directory No further modifications of the build system are required, since the :term:`CCPP Framework` will auto-generate the necessary makefiles that allow the host model to compile the scheme. @@ -113,7 +130,7 @@ Some tips for debugging problems: * Make sure to use an uppercase suffix ``.F90`` to enable C preprocessing. * A scheme called GFS_debug (GFS_debug.F90) may be added to the SDF where needed to print state variables and interstitial variables. If needed, edit the scheme beforehand to add new variables that need to be printed. * Check the ``ccpp_prebuild.py`` script for success/failure and associated messages; run the prebuild script with the --debug and --verbose flags. See :numref:`Chapter %s ` for more details -* Compile code in DEBUG mode (see section 2.3 of the `SCM User's Guide `_, run through debugger if necessary (gdb, Allinea DDT, totalview, …). +* Compile code in DEBUG mode (see section 4.3 of the `SCM User's Guide `_, run through debugger if necessary (gdb, Linaro/Arm DDT, totalview, …). * Use memory check utilities such as ``valgrind``. * Double-check the metadata file associated with your scheme to make sure that all information, including standard names and units, correspond to the correct local variables. diff --git a/CCPPtechnical/source/AutoGenPhysCaps.rst b/CCPPtechnical/source/AutoGenPhysCaps.rst index d7d8b18..ae70572 100644 --- a/CCPPtechnical/source/AutoGenPhysCaps.rst +++ b/CCPPtechnical/source/AutoGenPhysCaps.rst @@ -4,23 +4,23 @@ Suite and Group *Caps* **************************************** -The connection between the :term:`host model` and the physics :term:`schemes` through the :term:`CCPP Framework` +The connection between the :term:`host model` and the physics :term:`schemes` through the :term:`CCPP Framework` is realized with :term:`caps` on both sides as illustrated in :numref:`Figure %s `. The CCPP *prebuild* script discussed in :numref:`Chapter %s ` generates the :term:`caps ` that connect the physics schemes to the CCPP Framework. -This chapter describes the :term:`suite` and :term:`group caps`, +This chapter describes the :term:`suite` and :term:`group caps`, while the host model *caps* are described in :numref:`Chapter %s `. -These *caps* autogenerated by ``ccpp_prebuild.py`` reside in the directory +These *caps* autogenerated by ``ccpp_prebuild.py`` reside in the directory defined by the ``CAPS_DIR`` variable (see example in :ref:`Listing 8.1 `). Overview ======== -When CCPP is built, the CCPP Framework and physics are statically linked to the executable. This allows the best -performance and efficient memory use. This build requires metadata provided +When CCPP is built, the CCPP Framework and physics are statically linked to the executable. This allows the best +performance and efficient memory use. This build requires metadata provided by the host model and variables requested from the physics scheme. Only the variables required for the specified suites are kept, requiring one or more :term:`SDF`\ s (see left side of :numref:`Figure %s `) -as arguments to the ``ccpp_prebuild.py`` script. +as arguments to the ``ccpp_prebuild.py`` script. The CCPP *prebuild* step performs the tasks below. * Check requested vs provided variables by ``standard_name``. @@ -83,9 +83,9 @@ The *prebuild* step will produce the following files for any host model. 
Note th ccpp_static_api.F90 ``ccpp_static_api.F90`` is an interface, which contains subroutines ``ccpp_physics_init``, -``ccpp_physics_timestep_init``, ``ccpp_physics_run``, ``ccpp_physics_timestep_finalize``, and ``ccpp_physics_finalize``. -Each subroutine uses a ``suite_name`` and an optional argument, ``group_name``, to call the groups -of a specified suite (e.g. ``fast_physics``, ``physics``, ``time_vary``, ``radiation``, ``stochastic``, etc.), +``ccpp_physics_timestep_init``, ``ccpp_physics_run``, ``ccpp_physics_timestep_finalize``, and ``ccpp_physics_finalize``. +Each subroutine uses a ``suite_name`` and an optional argument, ``group_name``, to call the groups +of a specified suite (e.g. ``fast_physics``, ``physics``, ``time_vary``, ``radiation``, ``stochastic``, etc.), or to call the entire suite. For example, ``ccpp_static_api.F90`` would contain module ``ccpp_static_api`` with subroutines ``ccpp_physics_{init, timestep_init, run, timestep_finalize, finalize}``. Interested users should run ``ccpp_prebuild.py`` as appropriate for their model and inspect these auto-generated files. diff --git a/CCPPtechnical/source/CCPPDebug.rst b/CCPPtechnical/source/CCPPDebug.rst index 9cb0770..a97a4e8 100644 --- a/CCPPtechnical/source/CCPPDebug.rst +++ b/CCPPtechnical/source/CCPPDebug.rst @@ -8,7 +8,7 @@ Debugging with CCPP Introduction ================================ -In order to debug code efficiently with :term:`CCPP`, it is important to remember the conceptual differences between traditional, physics-driver based approaches and the ones with CCPP. +In order to debug code efficiently with :term:`CCPP`, it is important to remember the conceptual differences between traditional, physics-driver based approaches and the ones with CCPP. Traditional, physics-driver based approaches rely on hand-written physics drivers that connect the different physical :term:`parameterizations ` together and often contain a large amount of "glue code" required between the parameterizations. As such, the physics drivers usually have access to all variables that are used by the physical parameterizations, while individual parameterizations only have access to the variables that are passed in. Debugging either happens on the level of the physics driver or inside physical parameterizations. In both cases, print statements are inserted in one or more places (e.g. in the driver before/after parameterizations to debug). In the CCPP, there are no hand-written physics drivers. Instead, the physical parameterizations are glued together by an :term:`SDF` that lists the :term:`primary physical parameterizations ` and so-called :term:`interstitial parameterizations ` or interstitial schemes (containing the glue code, broken up into logical units) in the order of execution. @@ -34,12 +34,12 @@ Two categories of debugging with CCPP ============================================ CCPP-compliant debugging schemes for the UFS ============================================ -For the UFS models, dedicated debugging schemes have been created by the CCPP developers. These schemes can be found in ``FV3/ccpp/physics/physics/GFS_debug.F90``. Developers can use the schemes as-is or customize and add to them as needed. Also, several customization options are documented at the top of the file. These mainly deal with the amount/type of data/information output from arrays, and users can switch between them by turning on or off the corresponding preprocessor directives inside ``GFS_debug.F90``, followed by recompiling. 
+For the UFS models, dedicated debugging schemes have been created by the CCPP developers. These schemes can be found in `physics/Interstitials/UFS_SCM_NEPTUNE/GFS_debug.F90 `__. Developers can use the schemes as-is or customize and add to them as needed. Also, several customization options are documented at the top of the file. These mainly deal with the amount/type of data/information output from arrays, and users can switch between them by turning on or off the corresponding preprocessor directives inside ``GFS_debug.F90``, followed by recompiling. ---------------------------------------------------------------- Descriptions of the CCPP-compliant debugging schemes for the UFS ---------------------------------------------------------------- -* ``GFS_diagtoscreen`` +* ``GFS_diagtoscreen`` This scheme loops over all blocks for all GFS types that are persistent from one time step to the next (except ``GFS_control``) and prints data for almost all constituents. The call signature and rough outline for this scheme is: .. code-block:: console @@ -83,9 +83,9 @@ Descriptions of the CCPP-compliant debugging schemes for the UFS * ``GFS_interstitialtoscreen`` This scheme is identical to ``GFS_diagtoscreen``, except that it prints data for all constituents of the ``GFS_interstitial`` derived data type only. As for ``GFS_diagtoscreen``, the amount of information printed to screen can be customized using preprocessor statements, and all output to ``stdout/stderr`` from this routine is prefixed with **'XXX: '** so that it can be easily removed from the log files using "grep -ve 'XXX: ' ..." if needed. - - - + + + * ``GFS_abort`` This scheme can be used to terminate a model run at some point in the call to the physics. It can be customized to meet the developer's requirements. @@ -115,7 +115,7 @@ Descriptions of the CCPP-compliant debugging schemes for the UFS * ``GFS_checkland`` This routine is an example of a user-provided debugging scheme that is useful for solving issues with the fractional grid with the Rapid Update Cycle Land Surface Model (RUC LSM). All output to ``stdout/stderr`` from this routine is prefixed with **'YYY: '** (instead of ‘XXX:’), which can be easily removed from the log files using "grep -ve 'YYY: ' ..." if needed. - + .. code-block:: console subroutine GFS_checkland_run (me, master, blkno, im, kdt, iter, flag_iter, flag_guess, & @@ -141,7 +141,7 @@ Descriptions of the CCPP-compliant debugging schemes for the UFS write(0,'(a,2i5,1x,e16.7)')'YYY: i, blk, oceanfrac(i)  :', i, blkno, oceanfrac(i) write(0,'(a,2i5,1x,e16.7)')'YYY: i, blk, landfrac(i)   :', i, blkno, landfrac(i) write(0,'(a,2i5,1x,e16.7)')'YYY: i, blk, lakefrac(i)   :', i, blkno, lakefrac(i) - write(0,'(a,2i5,1x,e16.7)')'YYY: i, blk, slmsk(i)      :', i, blkno, slmsk(i) + write(0,'(a,2i5,1x,e16.7)')'YYY: i, blk, slmsk(i)      :', i, blkno, slmsk(i) write(0,'(a,2i5,1x,i5)')   'YYY: i, blk, islmsk(i)     :', i, blkno, islmsk(i) !end if end do diff --git a/CCPPtechnical/source/CCPPPreBuild.rst b/CCPPtechnical/source/CCPPPreBuild.rst index 5290b64..efaca4b 100644 --- a/CCPPtechnical/source/CCPPPreBuild.rst +++ b/CCPPtechnical/source/CCPPPreBuild.rst @@ -17,7 +17,7 @@ The CCPP *prebuild* script automates several tasks based on the information coll on the host model side and from the individual physics schemes (``.meta`` files; see :numref:`Figure %s `): * Compiles a list of variables provided by the host model. - + * Compiles a list of variables required to run all schemes in the CCPP Physics pool. 
* Matches these variables by their ``standard_name``, checks for missing variables and mismatches of their @@ -48,7 +48,7 @@ Script Configuration To connect the CCPP with a host model ``XYZ``, a Python-based configuration file for this model must be created in the host model’s repository. The easiest way is to copy an existing configuration file for the SCM in sub-directory ``ccpp/config`` of the ccpp-scm repository. The configuration in ``ccpp_prebuild_config.py`` depends largely on (a) the directory structure of the host model itself, (b) where the ``ccpp-framework`` and the ``ccpp-physics`` directories are located relative to the directory structure of the host model, and (c) from which directory the ``ccpp_prebuild.py`` script is executed before/during the build process (this is referred to as basedir in ``ccpp_prebuild_config_XYZ.py``). -:ref:`Listing 8.1 ` contains an example for the CCPP-SCM prebuild config. Here, both ``ccpp-framework`` and ``ccpp-physics`` are located in directories ``ccpp/framework`` and ``ccpp/physics`` of the top-level directory of the host model, and ``ccpp_prebuild.py`` is executed from the same top-level directory. +:ref:`Listing 8.1 ` contains a simplified example for the CCPP-SCM prebuild config. Here, both ``ccpp-framework`` and ``ccpp-physics`` are located in directories ``ccpp/framework`` and ``ccpp/physics`` of the top-level directory of the host model, and ``ccpp_prebuild.py`` is executed from the same top-level directory. .. _ccpp_prebuild_example: @@ -79,8 +79,9 @@ To connect the CCPP with a host model ``XYZ``, a Python-based configuration file # Add all physics scheme files relative to basedir SCHEME_FILES = { - ’ccpp/physics/physics/GFS_DCNV_generic.f90’ , - 'ccpp/physics/physics/sfc_sice.f’, + 'ccpp/physics/physics/Interstitials/UFS_SCM_NEPTUNE/GFS_DCNV_generic_pre.F90' , + 'ccpp/physics/physics/Interstitials/UFS_SCM_NEPTUNE/GFS_DCNV_generic_post.F90' , + 'ccpp/physics/physics/SFC_Models/SeaIce/CICE/sfc_sice.f’, } # Default build dir, relative to current working directory, @@ -157,7 +158,7 @@ The :term:`SDF`\(s) to compile into the executable can be specified using the `` ./ccpp/framework/scripts/ccpp_prebuild.py \ --config=./ccpp/config/ccpp_prebuild_config.py \ - --suites=FV3_GFS_v16,RRFS_v1beta + --suites=FV3_GFS_v16,FV3_GFS_v17_p8_ugwpv1 .. note:: @@ -174,13 +175,13 @@ To remove all files created by ``ccpp_prebuild.py``, for example as part of a ho .. code-block:: console ./ccpp/framework/scripts/ccpp_prebuild.py --config=./ccpp/config/ccpp_prebuild_config.py \ - --suites=FV3_GFS_v16,RRFS_v1beta --clean + --suites=FV3_GFS_v16,FV3_GFS_v17_p8_ugwpv1 --clean ============================= Troubleshooting ============================= -If invoking the ``ccpp_prebuild.py`` script fails, some message other than the success message will be written to the terminal output. Specifically, the terminal output will include informational logging messages generated from the script and any error messages written to the Python logging utility. Some common errors (minus the typical logging output and traceback output) and solutions are described below, with non-bold font used to denote aspects of the message that will differ depending on the problem encountered. This is not an exhaustive list of possible errors, however. 
For example, in this version of the code, there is no cross-checking that the metadata information provided corresponds to the actual Fortran code, so even though ``ccpp_prebuild.py`` may complete successfully, there may be related compilation errors later in the build process. For further help with an undescribed error, you can make a post in the appropriate GitHub discussions forum for *CCPP Physics* (https://github.com/NCAR/ccpp-physics/discussions) or *CCPP Framework* (https://github.com/NCAR/ccpp-framework/discussions). +If invoking the ``ccpp_prebuild.py`` script fails, some message other than the success message will be written to the terminal output. Specifically, the terminal output will include informational logging messages generated from the script and any error messages written to the Python logging utility. Some common errors (minus the typical logging output and traceback output) and solutions are described below, with non-bold font used to denote aspects of the message that will differ depending on the problem encountered. This is not an exhaustive list of possible errors, however. For example, in this version of the code, there is no cross-checking that the metadata information provided corresponds to the actual Fortran code, so even though ``ccpp_prebuild.py`` may complete successfully, there may be related compilation errors later in the build process. For further help with an undescribed error, you can make a post in the appropriate GitHub discussions forum for *CCPP Physics* (https://github.com/NCAR/ccpp-physics/discussions) or *CCPP Framework* (https://github.com/NCAR/ccpp-framework/discussions). #. ``ERROR: Configuration file`` erroneous/path/to/config/file ``not found`` @@ -315,7 +316,7 @@ CCPP Physics Variable Tracker New in version 6.0, CCPP includes a tool that allows users to track a given variable's journey through a specified physics suite. This tool, ``ccpp-framework/scripts/ccpp_track_variables.py``, -given a :term:`suite definition file` and the :term:`standard name` of a variable, +given a :term:`suite definition file` and the :term:`standard name` of a variable, will output the list of subroutines that use this variable -- in the order that they are called -- as well as the variable's Fortran *intent* (``in``, ``out``, or ``inout``) within that subroutine. This can allow the user to more easily @@ -340,38 +341,46 @@ how to use the script: --debug enable debugging output For this initial implementation, this script must be executed from within a :term:`host model`, and must be -called from the same directory that the ``ccpp_prebuild.py`` script is called from. This first +called from the same directory that the ``ccpp_prebuild.py`` script is called from. This first example is called using the :term:`UFS Atmosphere` as a host model, from the directory ``ufs-weather-model/FV3/ccpp``: .. 
code-block:: console framework/scripts/ccpp_track_variables.py -c=config/ccpp_prebuild_config.py \ - -s=suites/suite_FV3_RRFS_v1beta.xml -v air_temperature_of_new_state -m ./physics/physics/ - For suite suites/suite_FV3_RRFS_v1beta.xml, the following schemes (in order for each group) use the variable air_temperature_of_new_state: + -s=suites/suite_FV3_HRRR_gf.xml -v air_temperature_of_new_state -m './physics/physics/**/' + For suite suites/suite_FV3_HRRR_gf.xml, the following schemes (in order for each group) use the variable air_temperature_of_new_state: In group physics GFS_suite_stateout_reset_run (intent out) dcyc2t3_run (intent in) + clm_lake_run (intent in) + clm_lake_run (intent in) + rrfs_smoke_wrapper_run (intent inout) GFS_suite_stateout_update_run (intent out) - ozphys_2015_run (intent in) get_phi_fv3_run (intent in) GFS_suite_interstitial_3_run (intent in) + GFS_DCNV_generic_pre_run (intent in) + cu_gf_driver_run (intent inout) + GFS_DCNV_generic_post_run (intent in) GFS_MP_generic_pre_run (intent in) mp_thompson_pre_run (intent in) mp_thompson_run (intent inout) mp_thompson_post_run (intent inout) GFS_MP_generic_post_run (intent in) + cu_gf_driver_post_run (intent in) maximum_hourly_diagnostics_run (intent in) In group stochastics GFS_stochastics_run (intent inout) + In the example above, we can see that the variable ``air_temperature_of_new_state`` is used in -the FV3_RRFS_v1beta suite by several microphysics-related schemes, as well as by the stochastics :term:`parameterization`. +the FV3_HRRR_gf suite by several microphysics-related schemes, as well as by a stochastics :term:`parameterization`. +We used the argument `'./physics/physics/**/'` for the metadata path because the CCPP physics metadata files are contained in multiple levels of subdirectories under ``./physics/physics``; the double-`*` includes all levels of subdirectories in the search. To learn more about a given subroutine, you can search the physics source code within the ``ccpp-physics`` repository, or you can consult the `CCPP Scientific Documentation -`_: typing the subroutine name into the search +`_: typing the subroutine name into the search bar should lead you to further information about the subroutine and how it ties into its associated physics scheme. -In addition, because of the naming conventions for subroutines in CCPP-compliant physics schemes, +In addition, because of the naming conventions for subroutines in CCPP-compliant physics schemes, we can typically see which scheme, as well as which :term:`phase` within that scheme, is associated with the listed subroutine, without having to consult any further documentation or source code. For example, the ``mp_thompson_run`` subroutine is part of the Thompson microphysics scheme, specifically the *run* phase of that scheme. @@ -381,8 +390,8 @@ This second example is called using the :term:`SCM` as a host model: .. 
code-block:: console ccpp/framework/scripts/ccpp_track_variables.py --config=ccpp/config/ccpp_prebuild_config.py \ - -s=ccpp/suites/suite_SCM_GFS_v17_p8.xml -v surface_friction_velocity_over_land -m ./ccpp/physics/physics/ - For suite ccpp/suites/suite_SCM_GFS_v17_p8.xml, the following schemes (in order for each group) use the variable surface_friction_velocity_over_land: + -s=ccpp/suites/suite_SCM_GFS_v17_p8_ugwpv1.xml -v surface_friction_velocity_over_land -m './ccpp/physics/physics/**/' + For suite ccpp/suites/suite_SCM_GFS_v17_p8_ugwpv1.xml, the following schemes (in order for each group) use the variable surface_friction_velocity_over_land: In group physics GFS_surface_composites_pre_run (intent inout) sfc_diff_run (intent inout) @@ -391,38 +400,41 @@ This second example is called using the :term:`SCM` as a host model: noahmpdrv_run (intent inout) GFS_surface_composites_post_run (intent in) -In the example above, we can see that the variable ``wind_speed_at_lowest_model_layer`` is used in a few subroutines, -two of which (``sfc_diff_run`` and ``noahmpdrv_run`` are listed twice). This is not an error! The +In the example above, we can see that the variable ``surface_friction_velocity_over_land`` is used in a few subroutines, +two of which (``sfc_diff_run`` and ``noahmpdrv_run``) are listed twice. This is not an error! The two repeated subroutines are part of a scheme called in a :term:`subcycle `, and so they are called twice in this cycle as designated in the SDF. The ``ccpp_track_variables.py`` script lists the subroutines in the exact order they are called (within each *group*), including subcycles. -Some standard names can be exceedingly long and hard to remember, and it is not always convenient to search the full list of standard names for the exact variable you want. Therefore, this script will also return matches for partial variable names. In this example, we will look for the variable "velocity", which is not a standard name of any variable, and see what it returns: +Some standard names can be exceedingly long and hard to remember, and it is not always convenient to search the full list of standard names for the exact variable you want. Therefore, this script will also return matches for partial variable names. In this example, we will look for the variable "velocity" (which is not a standard name of any variable), and see what it returns: .. 
code-block:: console - framework/scripts/ccpp_track_variables.py --config=config/ccpp_prebuild_config.py \ - -s=suites/suite_FV3_GFS_v16.xml -v velocity -m ./physics/physics/ - Variable velocity not found in any suites for sdf suites/suite_FV3_GFS_v16.xml + ccpp/framework/scripts/ccpp_track_variables.py --config=ccpp/config/ccpp_prebuild_config.py \ + -s=ccpp/suites/suite_SCM_GFS_v16_RRTMGP.xml -v velocity -m './ccpp/physics/physics/**/' + Variable velocity not found in any suites for sdf ccpp/suites/suite_SCM_GFS_v16_RRTMGP.xml - ERROR:ccpp_track_variables:Variable velocity not found in any suites for sdf suites/suite_FV3_GFS_v16.xml + ERROR:ccpp_track_variables:Variable velocity not found in any suites for sdf ccpp/suites/suite_SCM_GFS_v16_RRTMGP.xml Did find partial matches that may be of interest: In GFS_surface_composites_pre_run found variable(s) ['surface_friction_velocity', 'surface_friction_velocity_over_water', 'surface_friction_velocity_over_land', 'surface_friction_velocity_over_ice'] In sfc_diff_run found variable(s) ['surface_friction_velocity_over_water', 'surface_friction_velocity_over_land', 'surface_friction_velocity_over_ice'] In GFS_surface_composites_post_run found variable(s) ['surface_friction_velocity', 'surface_friction_velocity_over_water', 'surface_friction_velocity_over_land', 'surface_friction_velocity_over_ice'] + In sfc_diag_run found variable(s) ['surface_friction_velocity'] In cires_ugwp_run found variable(s) ['angular_velocity_of_earth'] In samfdeepcnv_run found variable(s) ['vertical_velocity_for_updraft', 'cellular_automata_vertical_velocity_perturbation_threshold_for_deep_convection'] + In maximum_hourly_diagnostics_run found variable(s) ['unsmoothed_nonhydrostatic_upward_air_velocity'] While the script did not find the variable specified, it did find several partial matches -- ``surface_friction_velocity``, ``surface_friction_velocity_over_water``, ``surface_friction_velocity_over_land``, etc. -- as well as the subroutines they were found in. You can then use this more specific information to refine your next query: .. code-block:: console - framework/scripts/ccpp_track_variables.py --config=config/ccpp_prebuild_config.py \ - -s=suites/suite_FV3_GFS_v16.xml -v surface_friction_velocity -m ./physics/physics/ - For suite suites/suite_FV3_GFS_v16.xml, the following schemes (in order for each group) use the variable surface_friction_velocity: + ccpp/framework/scripts/ccpp_track_variables.py --config=ccpp/config/ccpp_prebuild_config.py \ + -s=ccpp/suites/suite_SCM_GFS_v16_RRTMGP.xml -v surface_friction_velocity -m './ccpp/physics/physics/**/' + For suite ccpp/suites/suite_SCM_GFS_v16_RRTMGP.xml, the following schemes (in order for each group) use the variable surface_friction_velocity: In group physics GFS_surface_composites_pre_run (intent in) GFS_surface_composites_post_run (intent inout) + sfc_diag_run (intent in) diff --git a/CCPPtechnical/source/CodeManagement.rst b/CCPPtechnical/source/CodeManagement.rst index d926d18..fbc1a1e 100644 --- a/CCPPtechnical/source/CodeManagement.rst +++ b/CCPPtechnical/source/CodeManagement.rst @@ -16,7 +16,7 @@ The repository and code organization differs for :term:`CCPP Framework` and :ter CCPP Framework -------------------------------------- -The CCPP Framework code base can be found in the authoritative repository in the :term:`NCAR` GitHub organization (https://github.com/NCAR/ccpp-framework). This repository is public and can be viewed, downloaded, or cloned by users without needing a GitHub account. 
+The CCPP Framework code base can be found in the authoritative repository in the :term:`NCAR` GitHub organization (https://github.com/NCAR/ccpp-framework). This repository is public and can be viewed, downloaded, or cloned by users without needing a GitHub account. Developers seeking to contribute code to the CCPP should create a GitHub account and set up a personal fork in order to introduce changes to the official code base via a Pull Request (PR) on GitHub (see `Creating Forks`_). @@ -96,7 +96,7 @@ Checking out the Code ----------------------------------- Instructions are provided here for the ccpp-physics repository assuming development intended for use in UFS Applications. The instructions for the ccpp-framework repository are analogous but should start from the main repository in the NCAR GitHub Organization (https://github.com/NCAR/ccpp-framework). -The process for checking out the CCPP is described in the following, assuming access via https (using a `personal access token `_) rather than ssh. If you are using an `ssh key `_ instead, you should replace instances of ``https://github.com/`` with ``git@github.com:`` in repository URLs. +The process for checking out the CCPP is described in the following, assuming access via https (using a `personal access token `_) rather than ssh. If you are using an `ssh key `_ instead, you should replace instances of ``https://github.com/`` with ``git@github.com:`` in repository URLs. Start by checking out the UFS Application Fork: @@ -120,20 +120,20 @@ From here you can view the available branches in the ccpp-physics repository wit remotes/upstream/HEAD -> upstream/ufs/dev remotes/upstream/ufs/dev -As you can see, you are placed on the ``ufs/dev`` branch by default; this is the most recent version of the development code in the ccpp-physics repository. In the ccpp-framework repository, the default branch is named ``main``. All new development should start from the default branch, but if you would like to view code from another branch this is simple with the ``git checkout`` command. +As you can see, you are placed on the ``ufs/dev`` branch by default; this is the most recent version of the development code in the ccpp-physics repository. In the ccpp-framework repository, the default branch is named ``main``. All new development should start from the default branch, but if you would like to view code from another branch this is simple with the ``git checkout`` command. .. code-block:: console :emphasize-lines: 3-4 - git checkout release/public-v6 + git checkout release/public-v7 - branch 'release/public-v6' set up to track 'upstream/release/public-v6'. - Switched to a new branch 'release/public-v6' + branch 'release/public-v7' set up to track 'upstream/release/public-v7'. + Switched to a new branch 'release/public-v7' .. note:: Never used git or GitHub before? Confused by what all this means or why we do it? Check out `this presentation from the UFS SRW Training workshop `_ for a "from basic principles" explanation! -After this command, git has checked out a local copy of the remote branch ``upstream/release/public-v6`` named ``release/public-v6``. To return to the ufs/dev branch, simply use ``git checkout ufs/dev``. +After this command, git has checked out a local copy of the remote branch ``upstream/release/public-v7`` named ``release/public-v7``. To return to the ufs/dev branch, simply use ``git checkout ufs/dev``. 
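+Before starting new development from ``ufs/dev``, it can help to make sure your local copy of that branch is current with the authoritative repository. The following is a minimal sketch, assuming the ``upstream`` remote is configured as shown in the earlier ``git branch`` output; your exact workflow may differ:
+
+.. code-block:: console
+
+   git checkout ufs/dev
+   git fetch upstream
+   git merge upstream/ufs/dev
+
+(``git pull upstream ufs/dev`` combines the last two steps.)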
If you wish to make changes that you will eventually contribute back to the public code base, you should always create a new "feature" branch that will track those particular changes. @@ -144,9 +144,9 @@ If you wish to make changes that you will eventually contribute back to the publ .. note:: - By checking out the remote ``upstream/ufs/dev`` branch directly, you will be left in a so-called '`detached HEAD `_' state. This will prompt git to show you a scary-looking warning message, but it can be ignored so long as you follow it by the second command above to create a new branch. + By checking out the remote ``upstream/ufs/dev`` branch directly, you will be left in a so-called '`detached HEAD `_' state. This will prompt git to show you a scary-looking warning message, but it can be ignored so long as you follow it by the second command above to create a new branch. -You can now make changes to the code, and commit those changes locally using ``git commit`` in order to track +You can now make changes to the code, and commit those changes locally using ``git commit`` in order to track diff --git a/CCPPtechnical/source/CompliantPhysicsParams.rst b/CCPPtechnical/source/CompliantPhysicsParams.rst index b84e724..57e1bd2 100644 --- a/CCPPtechnical/source/CompliantPhysicsParams.rst +++ b/CCPPtechnical/source/CompliantPhysicsParams.rst @@ -30,12 +30,12 @@ The implementation of a driver is reasonable under the following circumstances: * To preserve schemes that are also distributed outside of the CCPP. For example, the Thompson microphysics scheme is distributed both with the Weather Research and Forecasting (WRF) model and with CCPP. Having a driver with CCPP directives allows the Thompson scheme to remain - intact so that it can be synchronized between the WRF model and the CCPP distributions. You - can view this driver module in ``ccpp-physics/physics/mp_thompson.F90``. + intact so that it can be synchronized between the WRF model and the CCPP distributions. You + can view this driver module in `ccpp-physics/physics/MP/Thompson/mp_thompson.F90 `__. * To perform array transformations, such as flipping the vertical direction or rearranging the index order: for example, in the subroutine ``gfdl_cloud_microphys_run`` - in ``ccpp-physics/physics/gfdl_cloud_microphys.F90``. + in `ccpp-physics/physics/gfdl_cloud_microphys.F90 `__. Schemes in the CCPP are classified into two categories: :term:`primary schemes ` and :term:`interstitial schemes `. A *primary* scheme is one that updates the state variables and tracers or that @@ -64,7 +64,7 @@ CCPP-compliant physics parameterizations are broken down into one or more of the * The *init* phase, which performs actions needed to set up the scheme before the model integration begins. Examples of actions needed in this phase include the reading/computation of lookup tables, setting of constants (as described in :numref:`Section %s `), etc. -* The *timestep_init* phase, which performs actions needed at the start of each physics timestep. +* The *timestep_init* phase, which performs actions needed at the start of each physics timestep. Examples of actions needed in this phase include updating of time-based settings (e.g. solar angle), reading lookup table values, etc. * The *run* phase, which is the main body of the scheme. Here is where the physics is integrated @@ -86,7 +86,7 @@ A CCPP-compliant scheme is written in the form of Fortran modules. 
Each scheme m following subroutines (*entry points*): *_init*, *_timestep_init*, *_run*, *_timestep_finalize*, and *_finalize*. Each subroutine corresponds to one of the five *phases* of the :term:`CCPP Framework` as described above. The module name and the subroutine names must be consistent with the -scheme name; for example, the scheme "schemename" can have the entry points *schemename_init*, +scheme name; for example, the scheme "schemename" can have the entry points *schemename_init*, *schemename_run*, etc. The *_run* subroutine contains the code to execute the scheme. If subroutines *_timestep_init* or *_timestep_finalize* are present, they will be executed at the beginning and at the end of the :term:`host model` physics timestep, @@ -99,9 +99,9 @@ by using a module variable ``is_initialized`` that keeps track whether a scheme initialized or not. -:ref:`Listing 2.1 ` contains a template for a CCPP-compliant scheme, which +:ref:`Listing 2.1 ` contains a template for a CCPP-compliant scheme, which includes the *_run* subroutine for an example *scheme_template* scheme. Each ``.F`` or ``.F90`` -file that contains one or more entry point for a CCPP scheme must be accompanied by a .meta file in the +file that contains one or more entry point for a CCPP scheme must be accompanied by a .meta file in the same directory as described in :numref:`Section %s ` .. _scheme_template: @@ -111,7 +111,7 @@ same directory as described in :numref:`Section %s ` *Listing 2.1: Fortran template for a CCPP-compliant scheme showing the _run subroutine. The structure for the other phases (*\ _timestep_init, _init, _finalize, *and* _timestep_finalize\ *) is identical.* -The three lines in the example template beginning ``!> \section`` are required. They begin with `!` and so will be treated as comments by the Fortran compiler, but are interpreted by Doxygen +The three lines in the example template beginning ``!> \section`` are required. They begin with `!` and so will be treated as comments by the Fortran compiler, but are interpreted by Doxygen as part of the process to create scientific documentation. Those lines specifically insert an external file containing metadata information (in this case, ``scheme_template_run.html``) in the documentation. See more on this topic in :numref:`Section %s `. @@ -123,7 +123,7 @@ how to use physical constants. Note that :term:`standard names `, variable names, module names, scheme names and subroutine names are all case insensitive. -Interstitial modules (*schemename_pre* and *schemename_post*) can be included if any part of the +Interstitial modules (*schemename_pre* and *schemename_post*) can be included if any part of the physics scheme must be executed sequentially before (*_pre*) or after (*_post*) the scheme, but can not be included in the scheme itself (e.g., for including host-specific code). @@ -165,7 +165,7 @@ The ``[ccpp-table-properties]`` section is required in every metadata file and h The information in this section table allows the CCPP to compile only the schemes and dependencies needed by the selected CCPP suite(s). -An example for type and variable definitions from the file ``ccpp-physics/physics/radlw_param.meta`` is shown in +An example for type and variable definitions from the file `ccpp-physics/physics/Radiation/RRTMG/radlw_param.meta `__ is shown in :ref:`Listing 2.2 `. .. 
note:: @@ -179,37 +179,37 @@ An example for type and variable definitions from the file ``ccpp-physics/physic name = topflw_type type = ddt dependencies = - + [ccpp-arg-table] name = topflw_type type = ddt - + ######################################################################## [ccpp-table-properties] name = sfcflw_type type = ddt dependencies = - + [ccpp-arg-table] name = sfcflw_type type = ddt - + ######################################################################## [ccpp-table-properties] name = proflw_type type = ddt dependencies = - + [ccpp-arg-table] name = proflw_type type = ddt - + ######################################################################## [ccpp-table-properties] name = module_radlw_parameters type = module dependencies = - + [ccpp-arg-table] name = module_radlw_parameters type = module @@ -320,10 +320,14 @@ For each CCPP compliant scheme, the ``ccpp-arg-table`` for a scheme, module or d type = kind = intent = + optional = + +.. warning:: + The ``pointer`` attribute is deprecated and no longer allowed in CCPP * The ``intent`` argument is only valid in ``scheme`` metadata tables, as it is not applicable to the other ``types``. -* The following attributes are optional: ``long_name``, ``kind``. +* The following attributes are optional: ``long_name``, ``kind``, ``optional``. * Lines can be combined using ``|`` as a separator, e.g., @@ -371,7 +375,7 @@ It is important to understand the difference between these metadata dimension na Since physics developers cannot know whether a host model is passing all columns to the physics during the time integration or just a subset of it, the following rules apply to all schemes: -* Variables that depend on the horizontal decomposition must use +* Variables that depend on the horizontal decomposition must use * ``horizontal_dimension`` in the metadata tables for the following phases: *init*, *timestep_init*, *timestep_finalize*, *finalize*. @@ -389,13 +393,13 @@ names within their individual codes, but these variables must be assigned to a * the scheme's metadata table as described in :numref:`Section %s `. Standard names are listed and defined in a GitHub repository (https://github.com/ESCOMP/CCPPStandardNames), -along with rules for adding new standard names as needed. While an effort is made to comply with -existing *standard name* definitions of the Climate and Forecast (CF) conventions (http://cfconventions.org), +along with rules for adding new standard names as needed. While an effort is made to comply with +existing *standard name* definitions of the Climate and Forecast (CF) conventions (http://cfconventions.org), additional names are used in the CCPP to cover the wide range of use cases the CCPP intends to include. Each hash of the CCPP Physics repository contains information in the top-level ``README.md`` file indicating which version of the CCPPStandardNames repository corresponds to that version of CCPP code. -An up-to-date list of available standard names for a given host model can be found by running the CCPP *prebuild* script (described in :numref:`Chapter %s `), which will generate a LaTeX source file that can be compiled to produce a PDF file with all variables +An up-to-date list of available standard names for a given host model can be found by running the CCPP *prebuild* script (described in :numref:`Chapter %s `), which will generate a LaTeX source file that can be compiled to produce a PDF file with all variables defined by the host model and requested by the physics schemes. .. 
_IOVariableRules: @@ -486,6 +490,71 @@ Input/Output Variable (argument) Rules real(kind=kind_phys), dimension(is:,ks:), intent(inout) :: foo +* Optional arguments are allowed, and can be designated using the Fortran keyword ``optional`` in the variable declaration, and specifying ``optional = True`` in the variable attributes in the ``.meta`` file. + + +**Example from** `ccpp-physics/physics/SFC_Layer/MYNN/mynnsfc_wrapper.meta `__ **:** + + .. code-block:: fortran + + [ps] + standard_name = surface_air_pressure + long_name = surface pressure + units = Pa + dimensions = (horizontal_loop_extent) + type = real + kind = kind_phys + intent = in + [PBLH] + standard_name = atmosphere_boundary_layer_thickness + long_name = PBL thickness + units = m + dimensions = (horizontal_loop_extent) + type = real + kind = kind_phys + intent = in + [slmsk] + standard_name = area_type + long_name = landmask: sea/land/ice=0/1/2 + units = flag + dimensions = (horizontal_loop_extent) + type = real + kind = kind_phys + intent = in + + ... + ... + + [qsfc_lnd_ruc] + standard_name = water_vapor_mixing_ratio_at_surface_over_land + long_name = water vapor mixing ratio at surface over land + units = kg kg-1 + dimensions = (horizontal_loop_extent) + type = real + kind = kind_phys + intent = in + optional = True + [qsfc_ice_ruc] + standard_name = water_vapor_mixing_ratio_at_surface_over_ice + long_name = water vapor mixing ratio at surface over ice + units = kg kg-1 + dimensions = (horizontal_loop_extent) + type = real + kind = kind_phys + intent = in + optional = True + +**Example from** `ccpp-physics/physics/SFC_Layer/MYNN/mynnsfc_wrapper.F90 `__ **:** + + .. code-block:: fortran + + real(kind_phys), dimension(:), intent(in) :: & + & dx, pblh, slmsk, ps + real(kind_phys), dimension(:), intent(in),optional :: & + & qsfc_lnd_ruc, qsfc_ice_ruc + +.. note:: Optional arguments **must** appear last (after all mandatory arguments) in the argument list + .. _CodingRules: Coding Rules @@ -624,11 +693,11 @@ should be initially set to some invalid value. The above example also demonstrat finalize phase of the scheme, the ``is_initialized`` flag can be set back to false and the constants can be set back to an invalid value. -In summary, there are two ways to pass constants to a physics scheme. The first is to directly pass constants via the subroutine interface and continue passing them down to all subroutines as needed. The second is to have a user-specified scheme constants module within the scheme and to sync it once with the physical constants from the host model at initialization time. The approach to use is somewhat up to the developer. +In summary, there are two ways to pass constants to a physics scheme. The first is to directly pass constants via the subroutine interface and continue passing them down to all subroutines as needed. The second is to have a user-specified scheme constants module within the scheme and to sync it once with the physical constants from the host model at initialization time. The approach to use is somewhat up to the developer. .. note:: - Use of the *physcons* module (``ccpp-physics/physics/physcons.F90``) is **not recommended**, since it is specific to FV3 and will be removed in the future. + Use of the *physcons* module (``ccpp-physics/physics/hooks/physcons.F90``) is **not recommended**, since it is specific to FV3 and will be removed in the future. .. 
_ParallelProgramming: @@ -640,12 +709,17 @@ communication is done outside the physics, in which case the loops and arrays al take into account the sizes of the threaded tasks through their input indices and array dimensions. -The following rules should be observed when including OpenMP or MPI communication in a physics scheme: +`As of CCPP version 7 `_, MPI is required as a prerequisite for building the CCPP framework in a host model. +While MPI directives are not required within physics schemes, developers are encouraged to make use of them where they may result in a significant speedup. +However, the following rules should be observed when including OpenMP or MPI communication in a physics scheme: + +* MPI directives for Fortran code must be compatible with the Fortran 2008 standard, and must use the ``mpi_f08`` module + rather than the legacy ``use mpi`` or ``INCLUDE 'mpif.h'`` * CCPP standards require that in every phase but the *run* phase, blocked data structures must be combined so that their entire contents are available to a given MPI task (i.e. the data structures can not be further subdivided, or "chunked", within those phases). The *run* phase may be called by multiple threads in parallel, so data structures - may be divided into blocks for that phase. + may be divided into blocks for that phase. * Shared-memory (OpenMP) parallelization inside a scheme is allowed with the restriction that the number of OpenMP threads to use is obtained from the host model as an ``intent(in)`` @@ -664,7 +738,7 @@ The following rules should be observed when including OpenMP or MPI communicatio properties by one or more MPI processes, and its subsequent broadcast to all processes. * The implementation of reading and writing of data must be scalable to perform - efficiently from a few to thousands of tasks. + efficiently from one to thousands of tasks. * Calls to MPI and OpenMP functions, and the import of the MPI and OpenMP libraries, must be guarded by C preprocessor directives as illustrated in the following listing. @@ -674,7 +748,7 @@ The following rules should be observed when including OpenMP or MPI communicatio .. code-block:: fortran #ifdef MPI - use mpi + use mpi_f08 #endif #ifdef OPENMP use omp_lib @@ -693,9 +767,9 @@ The following rules should be observed when including OpenMP or MPI communicatio * If the error flag is set within a parallelized section of code, ensure that error flag is broadcast to all tasks/processes. Memory allocation -^^^^^^^^^^^^^^^^^ +----------------- -* Schemes should not use dynamic memory allocation on the heap. +* **Schemes should not use dynamic memory allocation on the heap.** * Schemes should not contain data that may clash when multiple non-interacting instances of the scheme are being used in one executable. This is because some host models may run multiple CCPP instances from the same executable. diff --git a/CCPPtechnical/source/ConfigBuildOptions.rst b/CCPPtechnical/source/ConfigBuildOptions.rst index db3ae21..a9e0717 100644 --- a/CCPPtechnical/source/ConfigBuildOptions.rst +++ b/CCPPtechnical/source/ConfigBuildOptions.rst @@ -1,5 +1,5 @@ .. _ConfigBuildOptions: - + ***************************************** CCPP Configuration and Build Options ***************************************** @@ -8,10 +8,10 @@ While the :term:`CCPP Framework` code, consisting of a single Fortran source fil The :term:`SCM` and the :term:`UFS Atmosphere` are supported for use with the CCPP. 
In the case of the UFS Atmosphere as the host model, build configuration options can be specified as cmake options to the ``build.sh`` script for manual compilation or through a regression test (RT) configuration file. Detailed instructions for building the UFS Atmosphere and the SCM are discussed in the `UFS Weather Model User Guide `_ and the `SCM User Guide `_. -For both SCM and UFS the ``ccpp_prebuild.py`` script is run automatically as a step in the build system, +For both SCM and UFS the ``ccpp_prebuild.py`` script is run automatically as a step in the build system, although it can be run manually for debugging purposes. -The path to a host-model specific configuration file is the only required argument to ``ccpp_prebuild.py``. +The path to a host-model specific configuration file is the only required argument to ``ccpp_prebuild.py``. Such files are included with the ccpp-scm and ufs-weather-model repositories, and must be included with the code of any host model to use the CCPP. :numref:`Figure %s ` depicts the main functions of the ``ccpp_prebuild.py`` script for the build. Using information included in the configuration file diff --git a/CCPPtechnical/source/ConstructingSuite.rst b/CCPPtechnical/source/ConstructingSuite.rst index 1aedeb1..1fce939 100644 --- a/CCPPtechnical/source/ConstructingSuite.rst +++ b/CCPPtechnical/source/ConstructingSuite.rst @@ -142,16 +142,16 @@ Consider the case where a model requires that some subset of physics be called o ------------------------------- -GFS v16beta Suite +GFS v16 Suite ------------------------------- -Here is the SDF for the physics suite equivalent to the GFS v16beta in the Single-Column Model (:term:`SCM`), which employs various groups and subcycling: +Here is the SDF for the physics suite equivalent to the GFS v16 in the Single-Column Model (:term:`SCM`), which employs various groups and subcycling: .. code-block:: xml - + GFS_time_vary_pre @@ -164,10 +164,10 @@ Here is the SDF for the physics suite equivalent to the GFS v16beta in the Singl GFS_suite_interstitial_rad_reset GFS_rrtmg_pre - rrtmg_sw_pre + GFS_radiation_surface + rad_sw_pre rrtmg_sw rrtmg_sw_post - rrtmg_lw_pre rrtmg_lw rrtmg_lw_post GFS_rrtmg_post @@ -209,9 +209,7 @@ Here is the SDF for the physics suite equivalent to the GFS v16beta in the Singl cires_ugwp cires_ugwp_post GFS_GWD_generic_post - rayleigh_damp GFS_suite_stateout_update - ozphys_2015 h2ophys get_phi_fv3 GFS_suite_interstitial_3 @@ -227,12 +225,12 @@ Here is the SDF for the physics suite equivalent to the GFS v16beta in the Singl gfdl_cloud_microphys GFS_MP_generic_post maximum_hourly_diagnostics - phys_tend + GFS_physics_post -The suite name is ``SCM_GFS_v16beta``. Three groups (``time_vary``, ``radiation``, and ``physics``) are used, because the physics needs to be called in different parts of the host model. The detailed explanation of each primary physics scheme can be found in scientific documentation. A short explanation of each scheme is below. +The suite name is ``SCM_GFS_v16``. Three groups (``time_vary``, ``radiation``, and ``physics``) are used, because the physics needs to be called in different parts of the host model. The detailed explanation of each primary physics scheme can be found in scientific documentation. A short explanation of each scheme is below. 
* ``GFS_time_vary_pre``: GFS physics suite time setup * ``GFS_rrtmg_setup``: Rapid Radiative Transfer Model for Global Circulation Models (RRTMG) setup diff --git a/CCPPtechnical/source/Glossary.rst b/CCPPtechnical/source/Glossary.rst index d85c85c..f3b7ff1 100644 --- a/CCPPtechnical/source/Glossary.rst +++ b/CCPPtechnical/source/Glossary.rst @@ -4,28 +4,33 @@ Glossary .. glossary:: CCPP - The topic of this technical guide, the Common Community Physics Package (CCPP) is a model-agnostic, + The topic of this technical guide, the Common Community Physics Package (CCPP) is a model-agnostic, well-vetted collection of codes containing atmospheric physical parameterizations and suites for use in NWP along with a framework that connects the physics to a :term:`host model` *CCPP Framework* The infrastructure that connects physics :term:`schemes ` with a :term:`host model`; also refers to a - `software repository of the same name `_ + `software repository of the same name `__ *CCPP Physics* The pool of CCPP-compliant physics schemes; also refers to a `software repository of the same name - `_ + `__ Entry point An entry point is a subroutine for one :term:`phase` of CCPP physics that is called by the :term:`host model`. Entry points are described in more detail in :numref:`Section %s ` Fast physics Physical parameterizations that require tighter coupling with the dynamical core than “slow” - physics (due to the approximated processes within the parameterization acting on a shorter + physics (due to the approximated processes within the parameterization acting on a shorter timescale) and that benefit from a smaller time step. The distinction is useful for greater accuracy, numerical stability, or both. In the UFS Atmosphere, a saturation adjustment is used in some suites and is called directly from the dynamical core for tighter coupling + GFS + The `Global Forecast System (GFS) `_ + is a global weather forecast model run by NOAA's National Centers for Environmental Prediction (NCEP), predicting + dozens of atmospheric and earth system variables out to 16 days in advance. + Group A set of physics :term:`schemes ` within a suite definition file (SDF) that are called together without intervening computations from the :term:`host application `. Groups @@ -35,9 +40,8 @@ Glossary Autogenerated interface between a :term:`group` of physics schemes and the :term:`host model`. Caps are described in more detail in :numref:`Chapter %c `. - Host model - A host model (or host application) is an atmospheric model or application that allocates memory, + A host model (or host application) is an atmospheric model or application that allocates memory, provides metadata for the variables passed into and out of the physics, and controls time-stepping @@ -52,18 +56,18 @@ Glossary more detail in :numref:`Chapter %c `. NCAR - The National Center for Atmospheric Research - a US federally funded research and development - center (FFRDC) managed by the University Corporation for Atmospheric Research (UCAR) and + The National Center for Atmospheric Research - a US federally funded research and development + center (FFRDC) managed by the University Corporation for Atmospheric Research (UCAR) and funded by the National Science Foundation (NSF). NEMS - The NOAA Environmental Modeling System - a software infrastructure that supports - NCEP/EMC’s forecast products. The coupling software is based on ESMF and the + The NOAA Environmental Modeling System - a software infrastructure that supports + NCEP/EMC’s forecast products. 
The coupling software is based on ESMF and the `NUOPC layer `_. Parameterization The representation, in a dynamic model, of physical effects in terms of admittedly - oversimplified parameters, rather than realistically requiring such effects to be + oversimplified parameters, rather than realistically requiring such effects to be consequences of the dynamics of the system (from the `AMS Glossary `_) Phase @@ -80,13 +84,13 @@ Glossary Primary scheme A parameterization, such as PBL, microphysics, convection, and radiation, that fits the - traditionally-accepted definition, as opposed to an interstitial scheme + traditionally-accepted definition, as opposed to an interstitial scheme Scheme A CCPP-compliant parameterization (:term:`primary scheme`) or auxiliary code (:term:`interstitial scheme`) SDF - Suite Definition File (SDF) is an external file containing information about the + Suite Definition File (SDF) is an external file containing information about the construction of a physics :term:`suite`. It describes the :term:`schemes ` that are called, in which order they are called, whether they are subcycled, and whether they are assembled into groups to be called together @@ -97,20 +101,20 @@ Glossary SCM The :term:`CCPP` Single-Column Model (SCM) is a simple 1D :term:`host model` designed to be used with the CCPP Physics and Framework as a lightweight alternative to full 3D dynamical models for testing - and development of physics :term:`schemes ` and :term:`suites `. See the `SCM User Guide `_ + and development of physics :term:`schemes ` and :term:`suites `. See the `SCM User Guide `_ for more information. Slow physics Physical parameterizations that can tolerate looser coupling with the dynamical core than “fast” physics (due to the approximated processes within the parameterization acting on a longer timescale) and that often use a longer time step. Such parameterizations - are typically grouped and calculated together (through a combination of process- and + are typically grouped and calculated together (through a combination of process- and time-splitting) in a section of an atmospheric model that is distinct from the dynamical core in the code organization Standard name Variable names based on CF conventions (http://cfconventions.org) that are uniquely - identified by the *CCPP-compliant* :term:`schemes ` and provided by a :term:`host model`. See + identified by the *CCPP-compliant* :term:`schemes ` and provided by a :term:`host model`. See :numref:`Section %s ` for more details. Subcycling @@ -122,7 +126,7 @@ Glossary well together UFS - A Unified Forecast System (UFS) is a community-based, coupled comprehensive Earth + The Unified Forecast System (UFS) is a community-based, coupled comprehensive Earth system modeling system. The UFS numerical applications span local to global domains and predictive time scales from sub-hourly analyses to seasonal predictions. 
It is designed to support the Weather Enterprise and to be the source system for NOAA's @@ -133,5 +137,5 @@ Glossary core and the physics UFS Weather Model - The combined global/regional medium- to short-range weather-prediction model used in the :term:`UFS` + The combined global/regional medium- to short-range weather-prediction model used in the :term:`UFS` to create forecasts diff --git a/CCPPtechnical/source/HostSideCoding.rst b/CCPPtechnical/source/HostSideCoding.rst index 5424595..072b7c3 100644 --- a/CCPPtechnical/source/HostSideCoding.rst +++ b/CCPPtechnical/source/HostSideCoding.rst @@ -14,7 +14,7 @@ All variables required to communicate between the host model and the physics, as At present, only two types of variable definitions are supported by the CCPP Framework: -* Standard Fortran variables (character, integer, logical, real) defined in a module or in the main program. For character variables, a fixed length is required. All others can have a kind attribute of a kind type defined by the host model. +* Standard Fortran variables (character, integer, logical, real) defined in a module or in the main program. For character variables, a fixed length is required. All others can have a kind attribute of a kind type defined by the host model. Pointers are not allowed as passable CCPP variables (though they may still be used internally by individual schemes). * Derived data types (DDTs) defined in a module or the main program. While the use of DDTs as arguments to physics schemes in general is discouraged (see :numref:`Section %s `), it is perfectly acceptable for the host model to define the variables requested by physics schemes as components of DDTs and pass these components to CCPP by using the correct local_name (e.g., ``myddt%thecomponentIwant``; see :numref:`Section %s `.) .. _VariableTablesHostModel: @@ -543,7 +543,7 @@ A more complicated example is when multiple ``cdata`` structures are in use, nam end do end do -*Listing 6.6: A morre complex suite initialization step that consists of allocating and initializing multiple ``cdata`` structures.* +*Listing 6.6: A more complex suite initialization step that consists of allocating and initializing multiple ``cdata`` structures.* Depending on the implementation of CCPP in the host model, the suite name for the suite to be executed must be set in this step as well (omitted in Listing :ref:`Listing 6.6 `). diff --git a/CCPPtechnical/source/Introduction.rst b/CCPPtechnical/source/Introduction.rst index ca940e9..6973089 100644 --- a/CCPPtechnical/source/Introduction.rst +++ b/CCPPtechnical/source/Introduction.rst @@ -12,7 +12,7 @@ This document contains documentation for the Common Community Physics Package (: * Host-side coding * CCPP code management and governance * Parameterization-specific output -* Debugging strategies +* Debugging strategies The following table describes the type changes and symbols used in this guide. @@ -23,18 +23,18 @@ The following table describes the type changes and symbols used in this guide. - Meaning - Examples * - ``AaBbCc123`` - - + - * The names of commands, files, and directories * On-screen terminal output - - - * Edit your ``.bashrc`` file - * Use ``ls -a`` to list all files. + - + * Edit your ``.bashrc`` file + * Use ``ls -a`` to list all files. * ``host$ You have mail!`` * - *AaBbCc123* - - + - * The names of CCPP-specific terms, subroutines, etc. * Captions for figures, tables, etc. 
- - + - * Each scheme must include at least one of the following subroutines: ``{schemename}_timestep_init``, ``{schemename}_init``, ``{schemename}_run``, ``{schemename}_finalize``, and ``{schemename}_timestep_finalize``. * *Listing 2.1: Fortran template for a CCPP-compliant scheme showing the* _run *subroutine.* * - **AaBbCc123** @@ -50,12 +50,14 @@ will be presented in this style: Some CCPP-specific terms will be highlighted using *italics*, and words requiring particular emphasis will be highlighted in **bold** text. -In some places there are helpful asides or warnings that the user should pay attention to; these +In some places there are helpful asides or warnings that the user should pay attention to; these will be presented in the following style: .. note:: - This is an important point that should **not** be ignored! + This is a helpful aside that may or may not be relevant to your specific use case. +.. warning:: + This is an important point that should **not** be ignored! In several places in the technical documentation, we need to refer to locations of files or directories in the source code. Since the directory structure depends on the :term:`host model`, in particular the directories where the ``ccpp-framework`` and ``ccpp-physics`` source code is checked out, and the directory from which the ``ccpp_prebuild.py`` code generator is called, we use the following convention: diff --git a/CCPPtechnical/source/Overview.rst b/CCPPtechnical/source/Overview.rst index b3f57e1..520810c 100644 --- a/CCPPtechnical/source/Overview.rst +++ b/CCPPtechnical/source/Overview.rst @@ -38,11 +38,11 @@ for Atmospheric Research (:term:`NCAR`), the Navy, National Oceanic and Atmosphe (NOAA) Research Laboratories, NOAA National Weather Service, and other groups. Physics interoperability, or the ability to run a given physics :term:`suite` in various :term:`host models `, has been a goal of this multi-agency group for several years. An initial mechanism to -run the physics of NOAA’s Global Forecast System (GFS) model in other host models, +run the physics of NOAA’s :term:`Global Forecast System (GFS) ` model in other host models, the Interoperable Physics Driver (IPD), was developed by the NOAA Environmental Modeling Center (EMC) and later augmented by the NOAA Geophysical Fluid Dynamics Laboratory (GFDL). -The CCPP expanded on that work by meeting `additional requirements put forth by NOAA `_, +The CCPP expanded on that work by meeting `additional requirements put forth by NOAA `_, and brought new functionalities to the physics-dynamics interface. Those include the ability to choose the order of :term:`parameterizations `, to :term:`subcycle ` individual parameterizations by running them more frequently than other parameterizations, @@ -78,11 +78,14 @@ is exchanged among parameterizations. During runtime, the CCPP Framework is resp communicating the necessary variables between the host model and the parameterizations. The CCPP Physics contains the parameterizations and suites that are used operationally in -the UFS Atmosphere, as well as parameterizations that are under development for possible +the UFS Atmosphere, through the `Hurricane Analysis and Forecast System (HAFS) `_, +the `NOAA-EPA National Air Quality Forecast Capability (NAQFC) `_, +and pre-operational prototypes of GFS version 17, planned for implementation in 2025. Additionally, the CCPP +contains dozens of suites and parameterizations that are used for both research and development for possible transition to operations in the future. 
The CCPP aims to support the broad community while benefiting from the community. In such a CCPP ecosystem (:numref:`Figure %s `), the CCPP can be used not only by the operational -centers to produce operational forecasts, but also by the research community to conduct +centers to produce official forecasts, but also by the research community to conduct investigation and development. Innovations created and effectively tested by the research community can be funneled back to the operational centers for further improvement of the operational forecasts. @@ -102,77 +105,83 @@ This documentation is housed in repository https://github.com/NCAR/ccpp-doc. The CCPP is governed by the groups that contribute to its development. The CCPP Physics code management is collaboratively determined by NOAA, NCAR, and the Navy Research Laboratory (NRL), and the DTC works with EMC and its sponsors to determine :term:`schemes ` -and suites to be included and supported. The governance of the CCPP Framework is jointly -undertaken by NOAA and NCAR (see more information at https://github.com/NCAR/ccpp-framework/wiki -and https://dtcenter.org/community-code/common-community-physics-package-ccpp). +and suites to be included and supported; criteria for the inclusion of new schemes can be found in :numref:`Chapter %c `. +The CCPP Framework is governed by a separate group, also a collaboration among NOAA, NCAR, and NRL. +For more information about code management and governance, see https://github.com/NCAR/ccpp-framework/wiki +and https://dtcenter.org/community-code/common-community-physics-package-ccpp. The table below lists all parameterizations supported in CCPP public releases and the -`CCPP Scientific Documentation `_ +`CCPP Scientific Documentation `_ describes the parameterizations in detail. The parameterizations -are grouped in suites, which can be classified primarily as *operational* or *developmental*. -*Operational* suites are those used by operational, real-time weather prediction models. For this release, the only operational suite is GFS_v16, which is used for `version 16 `_ of the GFS model. -*Developmental* suites are those that are officially supported for this CCPP release with one or more host models, but are not currently used in any operational models. These may include schemes needed exclusively for research, or "release candidate" schemes proposed for use with future operational models. +are grouped in :term:`suites `, which determine the order and number of times each scheme is run within the host. .. _scheme_suite_table: -..
table:: *Suites supported in the CCPP* - - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | | :bi:`Operational`| :gbi:`Developmental` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | Physics Suite | GFS_v16 | :g:`GFS_v17_p8` | :g:`RAP` |:g:`RRFS_v1beta`| :g:`WoFS` | :g:`HRRR` | - +=====================+==================+==================+================+================+================+================+ - | **Supported hosts** | **SCM/SRW** | :gb:`SCM` | :gb:`SCM` |:gb:`SCM/SRW` | :gb:`SCM/SRW` | :gb:`SCM/SRW` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | Microphysics | GFDL | :g:`Thompson` | :g:`Thompson` | :g:`Thompson` | :g:`NSSL` | :g:`Thompson` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | PBL | TKE EDMF | :g:`TKE EDMF` | :g:`MYNN-EDMF` | :g:`MYNN-EDMF` | :g:`MYNN-EDMF` | :g:`MYNN-EDMF` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | Deep convection | saSAS | :g:`saSAS + CA` | :g:`GF` | :gi:`N/A` | :gi:`N/A` | :gi:`N/A` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | Shallow convection | saMF | :g:`saMF` | :g:`GF` | :gi:`N/A` | :gi:`N/A` | :gi:`N/A` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | Radiation | RRTMG | :g:`RRTMG` | :g:`RRTMG` | :g:`RRTMG` | :g:`RRTMG` | :g:`RRTMG` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | Surface layer | GFS | :g:`GFS` | :g:`MYNN-SFL` | :g:`MYNN-SFL` | :g:`MYNN-SFL` | :g:`MYNN-SFL` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | Gravity Wave Drag | CIRES-uGWP | :g:`Unified-uGWP`| :g:`GSL drag` | :g:`CIRES-uGWP`| :g:`CIRES-uGWP`| :g:`GSL drag` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | Land surface | Noah | :g:`Noah-MP` | :g:`RUC` | :g:`Noah-MP` | :g:`Noah-MP` | :g:`RUC` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | Ozone | NRL 2015 | :g:`NRL 2015` | :g:`NRL 2015` | :g:`NRL 2015` | :g:`NRL 2015` | :g:`NRL 2015` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | Strat H\ :sub:`2`\ O| NRL 2015 | :g:`NRL 2015` | :g:`NRL 2015` | :g:`NRL 2015` | :g:`NRL 2015` | :g:`NRL 2015` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - | Ocean | NSST | :g:`NSST` | :g:`NSST` | :g:`NSST` | :g:`NSST` | :g:`NSST` | - +---------------------+------------------+------------------+----------------+----------------+----------------+----------------+ - -Only the suites that are currently supported in the CCPP are listed in the table. 
Currently all supported suites use the 2015 Navy Research Laboratory (NRL) `ozone `_ and `stratospheric water vapor `_ schemes, +.. table:: *Suites supported in the CCPP v7.0 release* + + +---------------------+------------+----------------+-----------------+-----------+-----------+ + | Physics Suite | GFS_v16 | GFS_v16_RRTMGP |GFS_v17_p8_ugwpv1| WoFS_v0 | HRRR_gf | + +=====================+============+================+=================+===========+===========+ + | Microphysics | GFDL | GFDL | Thompson | NSSL | Thompson | + +---------------------+------------+----------------+-----------------+-----------+-----------+ + | PBL | TKE EDMF | TKE EDMF | TKE EDMF | MYNN-EDMF | MYNN-EDMF | + +---------------------+------------+----------------+-----------------+-----------+-----------+ + | Deep convection | saSAS | saSAS | saSAS | :gi:`N/A` | GF | + +---------------------+------------+----------------+-----------------+-----------+-----------+ + | Shallow convection | saMF | saMF | saMF | :gi:`N/A` | :gi:`N/A` | + +---------------------+------------+----------------+-----------------+-----------+-----------+ + | Radiation | RRTMG | RRTMGP | RRTMG | RRTMG | RRTMG | + +---------------------+------------+----------------+-----------------+-----------+-----------+ + | Surface layer | GFS | GFS | GFS | MYNN-SFL | MYNN-SFL | + +---------------------+------------+----------------+-----------------+-----------+-----------+ + | Gravity Wave Drag | CIRES-uGWP | CIRES-uGWP | Unified-uGWP | CIRES-uGWP| Orographic| + +---------------------+------------+----------------+-----------------+-----------+-----------+ + | Land surface | Noah | Noah | Noah-MP | Noah-MP | RUC | + +---------------------+------------+----------------+-----------------+-----------+-----------+ + | Ozone | NRL 2015 | NRL 2015 | NRL 2015 | NRL 2015 | NRL 2015 | + +---------------------+------------+----------------+-----------------+-----------+-----------+ + | Strat H\ :sub:`2`\ O| NRL 2015 | NRL 2015 | NRL 2015 | NRL 2015 | NRL 2015 | + +---------------------+------------+----------------+-----------------+-----------+-----------+ + | Ocean | NSST | NSST | NSST | NSST | NSST | + +---------------------+------------+----------------+-----------------+-----------+-----------+ + +Only the suites that are currently supported by the CCPP team in the DTC are listed in the table. These suites were tested in the context of the Single-Column Model to verify that they work without error and produce realistic results for this release version; this is in addition to any testing that may occur in other dynamical cores by other groups. +Currently all supported suites use the 2015 Navy Research Laboratory (NRL) `ozone `_ and `stratospheric water vapor `_ schemes, and the `NSST `_ ocean scheme. -The operational GFS_v16 suite includes `GFDL microphysics `_, -the `Turbulent Kinetic Energy (TKE)-based Eddy Diffusivity Mass-Flux (EDMF) `_ planetary boundary layer (PBL) scheme, -`scale-aware (sa) Simplified Arakawa-Schubert (SAS) `_ deep convection, -`scale-aware mass-flux (saMF) `_ shallow convection, -`Rapid Radiation Transfer Model for General Circulation Models (RRTMG) `_ radiation, -`GFS surface layer `_ scheme, -the `Cooperative Institute for Research in the Environmental Sciences (CIRES) unified gravity wave drag (uGWD) `_ scheme, -and the `Noah Land Surface Model (LSM) `_. +The GFS_v16 suite is meant to emulate the physics used by `version 16 `_ of the GFS model. 
It includes `GFDL microphysics `_, +the `Turbulent Kinetic Energy (TKE)-based Eddy Diffusivity Mass-Flux (EDMF) `_ planetary boundary layer (PBL) scheme, +`scale-aware (sa) Simplified Arakawa-Schubert (SAS) `_ deep convection, +`scale-aware mass-flux (saMF) `_ shallow convection, +`Rapid Radiation Transfer Model for General Circulation Models (RRTMG) `_ radiation, +`GFS surface layer `_ scheme, +the `Cooperative Institute for Research in the Environmental Sciences (CIRES) unified gravity wave drag (uGWD) `_ scheme, +and the `Noah Land Surface Model (LSM) `_. -The five developmental suites are either analogues for current operational physics schemes, or candidates for future operational implementations. +GFS_v16_RRTMGP is identical to the GFS_v16 suite, but with the `RRTMGP `_ radiation schemes rather than RRTMG. -* The GFS_v17_p8 suite is the current (as of June 2022) proposed suite for the next operational GFS implementation (version 17), and features several differences from the GFS_v16 suite, using `Thompson `_ microphysics, `saSAS plus Cellular Automata (CA) `_ deep convection, `Unified uGWP `_ gravity wave drag, and `Noah Multiparameterization (Noah-MP) `_ land surface parameterization. +The GFS_v17_p8_ugwpv1 suite is the latest (as of July 2024) proposed suite for the next operational GFS implementation (version 17), and features several differences from the GFS_v16 suite. +GFS_v17_p8_ugwpv1 utilizes `Thompson `_ Aerosol-Aware microphysics, `Unified uGWP `_ gravity wave drag, and `Noah Multiparameterization (Noah-MP) `_ land surface parameterization. -* The RAP scheme is similar to the operational Rapid Refresh (RAP) model physics package, and features Thompson microphysics, `Mellor-Yamada-Nakanishi-Niino (MYNN) EDMF `_ PBL, `Grell-Freitas (GF) `_ deep convection and shallow convection schemes, RRTMG radiation, `MYNN surface layer (SFL) `_ scheme, `Global Systems Laboratory (GSL) `_ gravity wave drag scheme, and the `Rapid Update Cycle (RUC) Land Surface Model `_. +The WoFS_v0 suite has been used by the `Warn-on-Forecast System (WoFS) `_ project at the National Severe Storms Laboratory (NSSL) for real-time and potential future operational high-resolution modeling products. +This suite features `NSSL 2-moment `_ microphysics, `Mellor-Yamada-Nakanishi-Niino (MYNN) eddy diffusivity-mass flux (EDMF) `_ PBL, RRTMG radiation, `MYNN surface layer (SFL) `_ scheme, CIRES uGWD, and Noah-MP land surface (it does not feature convective parameterization). -* The RRFS_v1beta suite is being used for development of the future `Rapid Refresh Forecast System (RRFS) `_, which is scheduled for implementation in late 2023. This scheme features Thompson microphysics, MYNN EDMF PBL, RRTMG radiation, MYNN SFL, CIRES uGWD, and Noah-MP land surface (it does not feature convective parameterization). +Finally, the HRRR_gf suite was developed for use with prototypes of the `Rapid Refresh Forecast System (RRFS) `_, and is similar to the physics package used in the operational High-Resolution Rapid Refresh (HRRR) model, except that it adds a deep convective parameterization. +This suite features Thompson Aerosol-Aware microphysics, MYNN-EDMF PBL physics, `Grell-Freitas deep convection `_, RRTMG radiation, MYNN SFL, the `Orographic Drag Scheme `_ for gravity waves, and the `Rapid Update Cycle (RUC) Land Surface Model `_.
-* The `Warn-on-Forecast System (WoFS) `_ suite is being used by the WoFS project at the National Severe Storms Laboratory (NSSL) for real-time and potential future operational high-resolution modeling products. The WoFS suite is identical to the RRFS_v1beta suite, except using `NSSL 2-moment `_ microphysics. +In addition to the supported suites listed above, several other suites are used in various supported UFS applications, such as the UFS SRW Application and HAFS. While the CCPP team does not actively support these suites, support may be available from the respective applications in which they are used. -* Finally, the HRRR scheme is similar to the operational High-Resolution Rapid Refresh (HRRR) model physics package, and is identical to the RAP scheme except it does not have convective parameterization due to its intended use at higher convective-permitting resolutions. +.. [#] As of this writing, the CCPP is used and regularly tested with three host models: the CCPP + SCM, the atmospheric component of NOAA’s Unified Forecast System (UFS) (hereafter the UFS Atmosphere) that utilizes + the Finite-Volume Cubed Sphere (FV3) dynamical core, and the Navy Environmental Prediction sysTem Using a Nonhydrostatic Engine (NEPTUNE). + The CCPP can be utilized with both + global and limited-area configurations of the UFS Atmosphere, and is integrated with the limited-area UFS Short-Range Weather (SRW) Application and the Hurricane Analysis and Forecast System (HAFS). + Work is also underway to connect and validate the use of the CCPP Framework with NCAR models, particularly the Community Atmosphere Model - System for Integrated Modeling of the Atmosphere (CAM-SIMA). -Those interested in the history of previous CCPP releases should know that the -first public release of the CCPP took place in April 2018 and included all the +Previous CCPP releases +======================= + +The first public release of the CCPP took place in April 2018 and included all the parameterizations of the operational GFS v14, along with the ability to connect to the SCM. The second public release of the CCPP took place in August 2018 and additionally included the physics suite tested for the implementation of GFS v15. The third public release of
[#] As of this writing, the CCPP has been validated with two host models: the CCPP - SCM and the atmospheric component of - NOAA’s Unified Forecast System (UFS) (hereafter the UFS Atmosphere) that utilizes - the Finite-Volume Cubed Sphere (FV3) dynamical core. The CCPP can be utilized both with the - global and limited-area configurations of the UFS Atmosphere. CCPP v6.0.0 is the latest - release compatible with the UFS limited-area UFS SRW Application. The CCPP - has also been run experimentally with a Navy model. Work is under - way to connect and validate the use of the CCPP Framework with NCAR models. + +The CCPP v6.0 release, issued in June 2022, was a major upgrade in conjunction with the `UFS SRW v2.0 release `_. + +The CCPP v7.0 release, issued in August 2024, was a major upgrade to physics and the Single-Column Model, particularly with the inclusion of the new `Case Generator `_ capability for running the SCM from UFS output. Additional Resources ======================== diff --git a/CCPPtechnical/source/ParamSpecificOutput.rst b/CCPPtechnical/source/ParamSpecificOutput.rst index b8782dd..0b11a6b 100644 --- a/CCPPtechnical/source/ParamSpecificOutput.rst +++ b/CCPPtechnical/source/ParamSpecificOutput.rst @@ -24,17 +24,15 @@ implementation are that memory is only allocated for the necessary positions of diagnostics are output on physics model levels. An extension to enable output on radiation levels may be considered in future implementations. -These capabilities have been tested and are expected to work with the following suites: - -* SCM: GFS_v16, GFS_v17_p8, RAP, RRFS_v1beta, WoFS, HRRR -* ufs-weather-model (regional): GFS_v16, RRFS_v1beta, WoFS, HRRR +These capabilities of the SCM have been tested and are expected to work with the suites +``SCM_GFS_v16``, ``SCM_GFS_v16_RRTMGP``, ``SCM_GFS_v17_p8_ugwpv1``, ``SCM_WoFS_v0``, and ``SCM_HRRR_gf``. ========== Tendencies ========== This section describes the tendencies available, how to set the model to prepare them and how to output -them. It also contains a list of frequently-asked questions in :numref:`Section %s `. +them. It also contains a list of frequently-asked questions in :numref:`Section %s `. Available Tendencies -------------------- @@ -56,7 +54,7 @@ photochemistry. The total tendency produced by the ozone photochemistry scheme ( subdivided by subprocesses: production and loss (combined as a single subprocess), quantity of ozone present in the column above a grid cell, influences from temperature, and influences from mixing ratio. For more information about the NRL 2015 ozone photochemistry scheme, consult the `CCPP Scientific Documentation -`_. +`_. There are numerous tendencies in CCPP, and you need to know which ones exist for your configuration to enable them. The model will output a list of available tendencies for your configuration if you run with @@ -69,7 +67,7 @@ Enabling Tendencies For performance reasons, the preparation of tendencies for output is off by default in the UFS and can be turned on via a set of namelist options. Since the SCM is not operational and has a relatively -tiny memory footprint, these tendencies are turned on by default in the SCM. +tiny memory footprint, these tendencies are turned on by default in the SCM. There are three namelist variables associated with this capability: ``ldiag3d``, ``qdiag3d``, and ``dtend_select``. These are set in the ``&gfs_physics_nml`` portion of the namelist file ``input.nml``. @@ -87,7 +85,7 @@ value used in the namelist is irrelevant. 
While the tendencies output by the SCM are instantaneous, the tendencies output by the UFS are averaged over the number of hours specified by the user in variable ``fhzero`` in the ``&gfs_physics_nml`` portion of the -namelist file ``input.nml``. Variable ``fhzero`` must be an integer (it cannot be zero). +namelist file ``input.nml``. Variable ``fhzero`` must be an integer (it cannot be zero). This example namelist selects all tendencies from microphysics processes, and all tendencies of temperature. The naming convention for ``dtend_select`` is explained in the next section. @@ -314,7 +312,7 @@ non-physics tendencies are in the ``gfs_dyn`` module. This is reflected in the : Note that some :term:`host models `, such as the UFS, have a limit of how many fields can be output in a run. When outputting all tendencies, this limit may have to be increased. In the UFS, this limit is determined -by variable ``max_output_fields`` in namelist section ``&diag_manager_nml`` in file ``input.nml``. +by variable ``max_output_fields`` in namelist section ``&diag_manager_nml`` in file ``input.nml``. Further documentation of the ``diag_table`` file can be found in the `UFS Weather Model User’s Guide `_. @@ -337,13 +335,13 @@ What is the meaning of error message ``max_output_fields`` was exceeded? ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ If the limit to the number of output fields is exceeded, the job may fail with the following message: - + .. code-block:: console FATAL from PE 24: diag_util_mod::init_output_field: max_output_fields = 300 exceeded. Increase via diag_manager_nml - + In this case, increase ``max_output_fields`` in ``input.nml``: - + .. code-block:: console &diag_manager_nml @@ -360,11 +358,11 @@ Why did I get a runtime logic error when outputting tendencies? --------------------------------------------------------------- Setting ``ldiag3d=F`` and ``qdiag3d=T`` will result in an error message: - + .. code-block:: console Logic error in GFS_typedefs.F90: qdiag3d requires ldiag3d - + If you want to output tracer tendencies, you must set both ``ldiag3d`` and ``qdiag3d`` to T. Then use ``diag_select`` to enable only the tendencies you want. Make sure your ``diag_table`` matches your choice of tendencies specified through ``diag_select``. @@ -374,7 +372,7 @@ Why are my tendencies zero, even though the model says they are supported for my For total physics or total photochemistry tendencies, see the next question. The tendencies will be zero if they are never calculated. Check that you enabled the tendencies with -appropriate settings of ``ldiag3d``, ``qdiag3d``, and ``diag_select``. +appropriate settings of ``ldiag3d``, ``qdiag3d``, and ``diag_select``. Another possibility is that the tendencies in question really are zero. The list of "available" tendencies is set at the model level, where the exact details of schemes and suites are not known. This can lead to @@ -429,7 +427,7 @@ The UFS and SCM already contain code to declare and initialize the arrays: * arrays are populated in ``GFS_diagnostics.F90`` (UFS) or ``scm_output.F90`` (SCM) The remainder of this section describes changes the developer needs to make in the -physics code and in the host model control files to enable the capability. An +physics code and in the host model control files to enable the capability. An example (:numref:`Section %s `) and FAQ (:numref:`Section %s `) are also provided. 
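In outline, the physics-side change is to add the auxiliary arrays and their counts to the scheme's argument list (with matching entries in the scheme's ``.meta`` file) and to populate them inside the scheme. A minimal, hypothetical sketch is shown below; the scheme name, dimensions, and values are purely illustrative, and the Grell-Freitas example later in this section shows the complete set of changes, including metadata, namelist, and ``diag_table`` modifications.

.. code-block:: fortran

   subroutine examplescheme_run(im, levs, naux2d, naux3d, aux2d, aux3d, errmsg, errflg)
      use machine, only: kind_phys
      implicit none
      integer,              intent(in)    :: im, levs        ! horizontal and vertical extents
      integer,              intent(in)    :: naux2d, naux3d  ! number of 2D and 3D auxiliary arrays
      real(kind=kind_phys), intent(inout) :: aux2d(:,:)      ! (im, naux2d)
      real(kind=kind_phys), intent(inout) :: aux3d(:,:,:)    ! (im, levs, naux3d)
      character(len=*),     intent(out)   :: errmsg
      integer,              intent(out)   :: errflg

      errmsg = ''
      errflg = 0

      ! Populate the auxiliary arrays with the quantities to be output
      ! (bogus constants here; in practice these would be the diagnostics of interest)
      aux2d(1:im,1)        = 1.0_kind_phys
      aux3d(1:im,1:levs,1) = 2.0_kind_phys
   end subroutine examplescheme_run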
@@ -480,7 +478,7 @@ For the SCM, these arrays are implicitly 1D and 2D, respectively, where the “y and the “x” dimension represents the number of independent columns (typically also 1). For continuity with the UFS Atmosphere, the naming convention 2D and 3D are retained, however. With this understanding, the namelist files can be modified as in the UFS: - + * Namelist file ``input.nml`` * Specify how many 2D and 3D arrays will be output using variables ``naux2d`` and ``naux3d`` in section ``&gfs_physics_nml``, respectively. The maximum allowed number of arrays to @@ -498,67 +496,66 @@ Recompiling and Examples The developer must recompile the code after making the source code changes to the CCPP scheme(s) and associated metadata files. Changes in the namelist and diag table can be made after compilation. At compile and runtime, the developer must pick suites that use the scheme from which output is desired. - + An example for how to output auxiliary arrays is provided in the rest of this section. The lines that start with “+” represent lines that were added by the developer to output the diagnostic arrays. In this example, the developer modified the Grell-Freitas (GF) cumulus scheme to output two 2D arrays and one 3D array. The 2D arrays are ``aux_2d (:,1)`` and ``aux_2d(:,2)``; the 3D array is ``aux_3d(:,:,1)``. The 2D array ``aux2d(:,1)`` will be output with an averaging in time in the UFS, while the ``aux2d(:,2)`` -and ``aux3d`` arrays will not be averaged. +and ``aux3d`` arrays will not be averaged. In this example, the arrays are populated with bogus information just to demonstrate the capability. -In reality, a developer would populate the array with the actual quantity for which output is desirable. +In reality, a developer would populate the array with the actual quantity for which output is desirable. .. code-block:: console - diff --git a/physics/cu_gf_driver.F90 b/physics/cu_gf_driver.F90 - index 927b452..aed7348 100644 - --- a/physics/cu_gf_driver.F90 - +++ b/physics/cu_gf_driver.F90 - @@ -76,7 +76,8 @@ contains - flag_for_scnv_generic_tend,flag_for_dcnv_generic_tend, & - du3dt_SCNV,dv3dt_SCNV,dt3dt_SCNV,dq3dt_SCNV, & - du3dt_DCNV,dv3dt_DCNV,dt3dt_DCNV,dq3dt_DCNV, & - - ldiag3d,qdiag3d,qci_conv,errmsg,errflg) - + ldiag3d,qdiag3d,qci_conv,errmsg,errflg, & - + naux2d,naux3d,aux2d,aux3d) + diff --git a/physics/CONV/Grell_Freitas/cu_gf_driver.F90 b/physics/CONV/Grell_Freitas/cu_gf_driver.F90 + index df5a196b..a4fb7c1a 100644 + --- a/physics/CONV/Grell_Freitas/cu_gf_driver.F90 + +++ b/physics/CONV/Grell_Freitas/cu_gf_driver.F90 + @@ -68,7 +68,7 @@ contains + dfi_radar_max_intervals,ldiag3d,qci_conv,do_cap_suppress, & + maxupmf,maxMF,do_mynnedmf,ichoice_in,ichoicem_in,ichoice_s_in, & + spp_cu_deep,spp_wts_cu_deep,nchem,chem3d,fscav,wetdpc_deep, & + - do_smoke_transport,kdt,errmsg,errflg) + + do_smoke_transport,kdt,errmsg,errflg,naux2d,naux3d,aux2d,aux3d) !------------------------------------------------------------- implicit none integer, parameter :: maxiens=1 - @@ -137,6 +138,11 @@ contains - integer, intent(in ) :: imfshalcnv + @@ -167,6 +167,11 @@ contains character(len=*), intent(out) :: errmsg integer, intent(out) :: errflg - + + + integer, intent(in) :: naux2d,naux3d + real(kind_phys), intent(inout) :: aux2d(:,:) + real(kind_phys), intent(inout) :: aux3d(:,:,:) + - ! define locally for now. - integer, dimension(im),intent(inout) :: cactiv + + + ! local variables integer, dimension(im) :: k22_shallow,kbcon_shallow,ktop_shallow - @@ -199,6 +205,11 @@ contains - ! 
initialize ccpp error handling variables + real(kind=kind_phys), dimension (im) :: rand_mom,rand_vmas + @@ -261,6 +266,10 @@ contains errmsg = '' errflg = 0 - + + + aux2d(:,1) = aux2d(:,1) + 1 + aux2d(:,2) = aux2d(:,2) + 2 + aux3d(:,:,1) = aux3d(:,:,1) + 3 + - ! - ! Scale specific humidity to dry mixing ratio - ! + ichoice = ichoice_in + ichoicem = ichoicem_in + ichoice_s = ichoice_s_in The ``cu_gf_driver.meta`` file was modified accordingly: .. code-block:: console - diff --git a/physics/cu_gf_driver.meta b/physics/cu_gf_driver.meta - index 99e6ca6..a738721 100644 - --- a/physics/cu_gf_driver.meta - +++ b/physics/cu_gf_driver.meta - @@ -476,3 +476,29 @@ + diff --git a/physics/CONV/Grell_Freitas/cu_gf_driver.meta b/physics/CONV/Grell_Freitas/cu_gf_driver.meta + index f76d0c30..1053325d 100644 + --- a/physics/CONV/Grell_Freitas/cu_gf_driver.meta + +++ b/physics/CONV/Grell_Freitas/cu_gf_driver.meta + @@ -687,3 +687,29 @@ + dimensions = () type = integer intent = out +[naux2d] @@ -586,10 +583,9 @@ The ``cu_gf_driver.meta`` file was modified accordingly: + units = none + dimensions = (horizontal_loop_extent,vertical_layer_dimension,number_of_3d_auxiliary_arrays) + type = real - + kind = kind_phys The following lines were added to the ``&gfs_physics_nml`` section of the namelist file ``input.nml``: - + .. code-block:: console naux2d = 2 @@ -597,15 +593,15 @@ The following lines were added to the ``&gfs_physics_nml`` section of the nameli aux2d_time_avg = .true., .false. Recall that for the SCM, ``aux2d_time_avg`` should not be set to true in the namelist. - + Lastly, the following lines were added to the ``diag_table`` for UFS: - + .. code-block:: console # Auxiliary output "gfs_phys", "aux2d_01", "aux2d_01", "fv3_history2d", "all", .false., "none", 2 "gfs_phys", "aux2d_02", "aux2d_02", "fv3_history2d", "all", .false., "none", 2 - "gfs_phys", "aux3d_01", "aux3d_01", "fv3_history", "all", .false., "none", + "gfs_phys", "aux3d_01", "aux3d_01", "fv3_history", "all", .false., "none", .. _AuxArrayFAQ: @@ -618,4 +614,4 @@ How do I enable the output of diagnostic arrays from multiple parameterizations Suppose you want to output two 2D arrays from schemeA and two 2D arrays from schemeB. You should set the namelist to ``naux2d=4`` and ``naux3d=0``. In the code for schemeA, you should populate ``aux2d(:,1)`` and ``aux2d(:,2)``, while in the code for scheme B you should populate ``aux2d(:,3)`` -and ``aux2d(:,4)``. +and ``aux2d(:,4)``. diff --git a/CCPPtechnical/source/ScientificDocRules.inc b/CCPPtechnical/source/ScientificDocRules.inc index 2a140e5..8a312d1 100644 --- a/CCPPtechnical/source/ScientificDocRules.inc +++ b/CCPPtechnical/source/ScientificDocRules.inc @@ -23,7 +23,7 @@ so that doxygen will parse them correctly, where to put various comments within the code, how to include information from the ``.meta`` files, and how to configure and run doxygen to generate HTML output. For an example of the HTML rendering of the CCPP Scientific Documentation, see -https://dtcenter.ucar.edu/GMTB/v6.0.0/sci_doc/index.html +https://dtcenter.ucar.edu/GMTB/v7.0.0/sci_doc/index.html Part of this documentation, namely metadata about subroutine arguments, has functional significance as part of the CCPP infrastructure. The metadata must be in a particular format to be parsed by Python scripts that “automatically” generate @@ -107,7 +107,7 @@ a project or suite. You can refer to any source code entity from within a page. 
The DTC maintains a main page, created by the Doxygen command ``\mainpage``, which contains an overall description and background of the CCPP. -Physics developers do not have to edit the file with the mainpage (``mainpage.txt``), which is +Physics developers do not have to edit the file with the mainpage (``mainpage.txt``), which is formatted like this: .. code-block:: console @@ -120,20 +120,17 @@ formatted like this: All other pages listed under the main page are created using the Doxygen tag ``\page`` described in the next section. In any Doxygen page, you can refer to any entity of source code by using Doxygen tag ``\ref`` -or ``@ref``. Example from ``suite_FV3_GFS_v16.txt``: +or ``@ref``. Example from ``GFS_v16_suite.txt``: .. code-block:: console /** \page GFS_v16_page GFS_v16 Suite - + \section gfsv16_suite_overview Overview - + Version 16 of the Global Forecast System (GFS) was implemented operationally by the NOAA - National Centers for Environmental Prediction (NCEP) in 2021. This suite is available for - use with the UFS SRW App and with the CCPP SCM. - - The GFS_v16 suite uses the parameterizations in the following order: + National Centers for Environmental Prediction (NCEP) in 2021. The GFS_v16 suite uses the parameterizations in the following order: - \ref GFS_RRTMG - \ref GFS_SFCLYR - \ref GFS_NSST @@ -147,15 +144,15 @@ or ``@ref``. Example from ``suite_FV3_GFS_v16.txt``: - \ref GFS_SAMFdeep - \ref GFS_SAMFshal - \ref GFDL_cloud - + \section sdf_gfsv16b Suite Definition File \include suite_FV3_GFS_v16.xml ... */ -The HTML result of this Doxygen code `can be viewed here `_. -You can see that the ``-`` symbols at the start of a line generate a list with bullets, and the -``\ref`` commands generate links to the appropriately labeled pages. The ``\section`` comands +The HTML result of this Doxygen code `can be viewed here `_. +You can see that the ``-`` symbols at the start of a line generate a list with bullets, and the +``\ref`` commands generate links to the appropriately labeled pages. The ``\section`` commands indicate section breaks, and the ``\include`` commands will include the contents of another file. Other valid Doxygen commands for style, markup, and other functionality can be found in the `Doxygen documentation `_. @@ -168,7 +165,7 @@ overview of the parameterization. These pages are not tied to the Fortran code directly; instead, they are created with a separate text file that starts with the command ``\page``. For CCPP, the stand-alone Doxygen pages, including the main page and the scheme pages, are contained in the *ccpp-physics* repository, under the -``ccpp-physics/physics/docs/pdftxt/`` directory. Each page (aside from the main page) has a +``ccpp-physics/physics/docs/pdftxt/`` directory. Each page (aside from the main page) has a *label* (e.g., “GFS_SAMFdeep” in the following example) and a user-visible title (“GFS Scale-Aware Simplified Arakawa-Schubert (sa-SAS) Deep Convection Scheme” in the following example). It is noted that labels must be unique @@ -222,14 +219,14 @@ The physics scheme page will often describe the following: The argument table for CCPP entry point subroutine ``{scheme}_run`` will be in this section. It is created by inserting a reference link (``\ref``) to the corresponding Doxygen label in the Fortran code for the scheme.
In the :ref:`above example `, the ``\ref arg_table_samfdeepcnv_run`` - tag references the section of Doxygen-annotated source code in ``ccpp-physics/physics/samfdeepcnv.f`` - that contains the scheme's argument table as an included html document, as described in the + tag references the section of Doxygen-annotated source code in `ccpp-physics/physics/CONV/SAMF/samfdeepcnv.f `__ + that contains the scheme's argument table as an included html document, as described in the :ref:`following section `. 3. A "General Algorithm" section The general description of the algorithm will be in this section. It is created by inserting - a reference link (``\ref``) pointing to the corresponding Doxygen-annotated source code for the scheme, + a reference link (``\ref``) pointing to the corresponding Doxygen-annotated source code for the scheme, as described in the :ref:`following section `. As can be seen in the above examples, symbols ``/\*\*`` and ``*/`` need to be the first and last entries of the page. @@ -251,7 +248,7 @@ is used to aggregate all code related to that scheme, even when it is in separat files. Since doxygen cannot know which files or subroutines belong to each physics scheme, each relevant subroutine must be tagged with the module name. This allows doxygen to understand your modularized design and generate the documentation accordingly. -`Here is a list of modules `_ +`Here is a list of modules `_ defined in CCPP. A module is defined using: @@ -392,7 +389,7 @@ Bibliography Doxygen can handle in-line paper citations and link to an automatically created bibliography page. The bibliographic data for any papers that are cited need to be put in BibTeX format and saved in a .bib file. The .bib file for CCPP is -included in the :term:`CCPP Physics` repository (``ccpp-physics/physics/docs/library.bib``), +included in the :term:`CCPP Physics` repository (``ccpp-physics/physics/docs/library.bib``), and the doxygen configuration option ``cite_bib_files`` points to the included file. Citations are invoked with the following tag: @@ -472,11 +469,11 @@ Fortran files using the doygen markup below. !! \htmlinclude SUBROUTINE_NAME.html !! -The tables should be created using a Python script distributed with the :term:`CCPP Framework`, +The tables should be created using a Python script distributed with the :term:`CCPP Framework`, ``ccpp-framework/scripts/metadata2html.py``. .. note:: - You will need to set the environment variable ``PYTHONPATH`` to include the directories + You will need to set the environment variable ``PYTHONPATH`` to include the directories ``ccpp/framework/scripts`` and ``ccpp/framework/scripts/parse_tools``. As an example for bash-like shells: .. code-block:: @@ -499,7 +496,7 @@ directory as the scheme Fortran files (``ccpp-physics/physics``). To generate the complete Scientific Documentation, the script ``./ccpp/framework/scripts/metadata2html.py`` must be run separately for each ``.meta`` file available in ``ccpp-physics/physics``. Alternatively, a batch mode exists -that converts all metadata files associated with schemes and variable definitions in the CCPP prebuild config; +that converts all metadata files associated with schemes and variable definitions in the CCPP prebuild config; again using the SCM as an example: .. code-block:: fortran @@ -519,7 +516,7 @@ Using Doxygen In order to generate the doxygen-based documentation, you will need to follow five steps: #. 
Have the executables ``doxygen`` (https://doxygen.nl/), ``graphviz`` (https://graphviz.org/), - and ``bibtex`` (http://www.bibtex.org/) installed on your machine and in your ``PATH``. + and ``bibtex`` (https://www.bibtex.org/) installed on your machine and in your ``PATH``. These utilities can be installed on MacOS via `Homebrew `_, or installed manually via the instructions on each utility's page linked above. diff --git a/CCPPtechnical/source/conf.py b/CCPPtechnical/source/conf.py index 016e060..41e0099 100644 --- a/CCPPtechnical/source/conf.py +++ b/CCPPtechnical/source/conf.py @@ -145,7 +145,7 @@ def setup(app): # Latex figure (float) alignment # # 'figure_align': 'htbp', - 'maketitle': r'\newcommand\sphinxbackoftitlepage{For referencing this document please use: \newline \break Bernardet, L., G. Firl, D. Heinzeller, L. Pan, M. Zhang, M. Kavulich, J. Schramm, and L. Carson, 2022. CCPP Technical Documentation Release v6.0.0. Available at https://ccpp-techdoc.readthedocs.io/\textunderscore/downloads/en/v6.0.0/pdf/.}\sphinxmaketitle' + 'maketitle': r'\newcommand\sphinxbackoftitlepage{For referencing this document please use: \newline \break Bernardet, L., G. Firl, D. Heinzeller, L. Pan, M. Zhang, M. Kavulich, J. Schramm, and L. Carson, 2024. CCPP Technical Documentation Release v7.0.0. Available at https://ccpp-techdoc.readthedocs.io/\textunderscore/downloads/en/v7.0.0/pdf/.}\sphinxmaketitle' } # Grouping the document tree into LaTeX files. List of tuples @@ -202,7 +202,7 @@ def setup(app): # -- Options for intersphinx extension --------------------------------------- # Example configuration for intersphinx: refer to the Python standard library. -intersphinx_mapping = {'https://docs.python.org/': None} +intersphinx_mapping = {'python': ('https://docs.python.org/3', None)} # -- Options for todo extension ---------------------------------------------- diff --git a/CCPPtechnical/source/references.bib b/CCPPtechnical/source/references.bib index b01e43a..66c2b29 100644 --- a/CCPPtechnical/source/references.bib +++ b/CCPPtechnical/source/references.bib @@ -1,7 +1,7 @@ @article{Bernardet2018, title={Community infrastructure for facilitating and accelerating improvement and testing of physical parameterizations}, author={L. Bernardet}, - journal={WRF-GRAPES Modeling Workshop, Oct 17, Boulder, CO}, + journal={WRF-GRAPES Modeling Workshop, Oct 17, 2018, Boulder, CO}, url = {http://www2.mmm.ucar.edu/wrf/grapes/bernardet_2018.pdf}, year={2018}, }