
Create E3SM coupling files within COMPASS#417

Merged
mark-petersen merged 3 commits into MPAS-Dev:ocean/develop from mark-petersen:ocean/coupling_E3SM
Feb 2, 2020

Conversation

@mark-petersen commented Dec 13, 2019

Adds a script that produces coupling files for E3SM. This PR adds it to the QU240 as a test, but it is set up in a general way, so that any global case can add it to the init process.

The goals for this system are:

  • speed: an automated method
  • simplicity: others can use and alter it
  • process documentation: the ability to record a standard sequence of steps
  • expandability: special cases, like ice shelves and data icebergs, are easy to handle.

I think this method of a single Python script within COMPASS satisfies all of these.

To add this to any case, add a link:

```
cd testing_and_setup/compass/ocean/global_ocean/RESOLUTION/init
ln -isf ../../config_files/config_e3sm_coupling.xml  .
```

Then set up an init case for that resolution, and a new directory e3sm_coupling appears with a run.py script.

Documentation of all the steps for coupling files is at
- [Confluence: Simplify E3SM Ocean File Generation](https://acme-climate.atlassian.net/wiki/spaces/OCNICE/pages/763854989/Simplify+E3SM+Ocean+File+Generation)
- [Confluence: Making mapping, runoff, domain files and adding grids to ACME](https://acme-climate.atlassian.net/wiki/spaces/OCNICE/pages/22052884/Making+mapping+runoff+domain+files+and+adding+grids+to+ACME)

@mark-petersen commented Dec 13, 2019

This works on grizzly as follows (more details on this Confluence page):

cd testing_and_setup/compass/
./list_testcases.py |grep QU240|grep init
  69: -o ocean -c global_ocean -r QU240 -t init
./setup_testcase.py -f general.config.ocean_turq --work_dir $WORKDIR -n 69

First you have to make your initial condition. I assume you already have MPAS-O compiled with gnu:

salloc -N 1 -t 2:0:0 --qos=interactive
cd $WORKDIR/ocean/global_ocean/QU240/init
module use /usr/projects/climate/SHARED_CLIMATE/modulefiles/all
module load gcc/5.3.0 openmpi/1.10.5 netcdf/4.4.1 parallel-netcdf/1.5.0 pio/1.7.2
module unload python
source /usr/projects/climate/SHARED_CLIMATE/anaconda_envs/load_compass.sh

That is all standard. Now make the coupling files. This requires the e3sm-unified package.

cd e3sm_coupling
module unload python
source /usr/projects/climate/SHARED_CLIMATE/anaconda_envs/load_latest_e3sm_unified.sh
./run.py

The output is as follows:

****** initial_condition_ocean ******
ncks -x -v xtime -O oQU240.nc oQU240_no_xtime.nc
SUCCESS

****** graph_partition_ocean ******
Creating graph files between  1  and  73
gpmetis mpas-o.graph.info.191213 2
gpmetis mpas-o.graph.info.191213 4
gpmetis mpas-o.graph.info.191213 8
gpmetis mpas-o.graph.info.191213 16
gpmetis mpas-o.graph.info.191213 32
gpmetis mpas-o.graph.info.191213 64
gpmetis mpas-o.graph.info.191213 12
gpmetis mpas-o.graph.info.191213 24
gpmetis mpas-o.graph.info.191213 36
gpmetis mpas-o.graph.info.191213 48
gpmetis mpas-o.graph.info.191213 60
gpmetis mpas-o.graph.info.191213 72
SUCCESS
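The task counts in the gpmetis calls above look like the powers of two plus the multiples of 12 that fall within the stated range. A guess at the generating logic, as a sketch (the actual rule lives in the coupling script, and may differ):

```python
def partition_counts(limit):
    """Candidate MPI task counts up to `limit`: powers of 2 first,
    then multiples of 12 not already covered (a guess at the rule
    behind the gpmetis calls in the log above)."""
    powers = []
    n = 2
    while n <= limit:
        powers.append(n)
        n *= 2
    twelves = [m for m in range(12, limit + 1, 12) if m not in powers]
    return powers + twelves

print(partition_counts(73))
# [2, 4, 8, 16, 32, 64, 12, 24, 36, 48, 60, 72]
```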

****** initial_condition_seaice ******
ncks -x -v bottomDepth,refBottomDepth,restingThickness,temperature,salinity,temperatureSurfaceValue,salinitySurfaceValue,surfaceVelocityZonal,surfaceVelocityMeridional,SSHGradientZonal,SSHGradientMeridional,vertNonLocalFluxTemp,normalVelocity,layerThickness,normalBarotropicVelocity,vertCoordMovementWeights,boundaryLayerDepth,seaIcePressure,atmosphericPressure,filteredSSHGradientZonal,filteredSSHGradientMeridional -O oQU240.nc seaice.oQU240.nc
SUCCESS

****** scrip ******
create_SCRIP_file_from_MPAS_mesh.py -m oQU240.nc -s ocean.oQU240.scrip.191213.nc
SUCCESS

****** transects_and_regions ******
MpasMaskCreator.x oQU240.nc masks_SingleRegionAtlanticWTransportTransects.oQU240.nc -f SingleRegionAtlanticWTransportTransects.geojson
SUCCESS

****** mapping_Gcase ******
mpirun -n 1 ESMF_RegridWeightGen --method conserve --source ocean.oQU240.scrip.191213.nc --destination T62_040121.nc --weight map_oQU240_TO_T62_040121_aave.191213.nc --ignore_unmapped
mpirun -n 1 ESMF_RegridWeightGen --method conserve --source T62_040121.nc --destination ocean.oQU240.scrip.191213.nc --weight map_T62_040121_TO_oQU240_aave.191213.nc --ignore_unmapped
mpirun -n 1 ESMF_RegridWeightGen --method bilinear --source ocean.oQU240.scrip.191213.nc --destination T62_040121.nc --weight map_oQU240_TO_T62_040121_blin.191213.nc --ignore_unmapped
mpirun -n 1 ESMF_RegridWeightGen --method bilinear --source T62_040121.nc --destination ocean.oQU240.scrip.191213.nc --weight map_T62_040121_TO_oQU240_blin.191213.nc --ignore_unmapped
mpirun -n 1 ESMF_RegridWeightGen --method patch --source ocean.oQU240.scrip.191213.nc --destination T62_040121.nc --weight map_oQU240_TO_T62_040121_patc.191213.nc --ignore_unmapped
mpirun -n 1 ESMF_RegridWeightGen --method patch --source T62_040121.nc --destination ocean.oQU240.scrip.191213.nc --weight map_T62_040121_TO_oQU240_patc.191213.nc --ignore_unmapped
SUCCESS

****** domain ******
./gen_domain -m map_oQU240_TO_T62_040121_aave.191213.nc -o oQU240 -l T62
SUCCESS

****** SUCCESS for all enabled steps ******

Each step is conducted in its own directory, with command_history and log.out files. Each step is just a sequence of Unix commands that are both echoed to the screen and recorded in the command_history file.
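The echo-and-record pattern described above could be sketched like this (function and variable names here are illustrative, not the script's actual API):

```python
import subprocess

def run_step(step_name, commands):
    """Run shell commands for one step, echoing each command,
    appending it to command_history, and capturing stdout/stderr
    in log.out (an illustrative sketch of the pattern)."""
    print('****** %s ******' % step_name)
    with open('command_history', 'a') as hist, open('log.out', 'a') as log:
        for cmd in commands:
            print(cmd)
            hist.write(cmd + '\n')
            subprocess.check_call(cmd, shell=True, stdout=log, stderr=log)
    print('SUCCESS')
```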

@mark-petersen

One of the beautiful parts of this system is that it creates a directory structure identical to the E3SM inputdata directory:

tree assembled_files_for_upload

assembled_files_for_upload/
└── inputdata
    ├── cpl
    │   └── cpl6
    │       ├── map_oQU240_TO_T62_040121_aave.191213.nc -> ../../../../mapping_Gcase/map_oQU240_TO_T62_040121_aave.191213.nc
    │       ├── map_oQU240_TO_T62_040121_blin.191213.nc -> ../../../../mapping_Gcase/map_oQU240_TO_T62_040121_blin.191213.nc
    │       ├── map_oQU240_TO_T62_040121_patc.191213.nc -> ../../../../mapping_Gcase/map_oQU240_TO_T62_040121_patc.191213.nc
    │       ├── map_T62_040121_TO_oQU240_aave.191213.nc -> ../../../../mapping_Gcase/map_T62_040121_TO_oQU240_aave.191213.nc
    │       ├── map_T62_040121_TO_oQU240_blin.191213.nc -> ../../../../mapping_Gcase/map_T62_040121_TO_oQU240_blin.191213.nc
    │       └── map_T62_040121_TO_oQU240_patc.191213.nc -> ../../../../mapping_Gcase/map_T62_040121_TO_oQU240_patc.191213.nc
    ├── ice
    │   └── mpas-cice
    │       └── oQU240
    │           └── seaice.oQU240.nc -> ../../../../../initial_condition_seaice/seaice.oQU240.nc
    ├── ocn
    │   └── mpas-o
    │       └── oQU240
    │           ├── masks_SingleRegionAtlanticWTransportTransects.oQU240.nc -> ../../../../../transects_and_regions/masks_SingleRegionAtlanticWTransportTransects.oQU240.nc
    │           ├── mpas-o.graph.info.191213 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213
    │           ├── mpas-o.graph.info.191213.part.12 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.12
    │           ├── mpas-o.graph.info.191213.part.16 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.16
    │           ├── mpas-o.graph.info.191213.part.2 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.2
    │           ├── mpas-o.graph.info.191213.part.24 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.24
    │           ├── mpas-o.graph.info.191213.part.32 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.32
    │           ├── mpas-o.graph.info.191213.part.36 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.36
    │           ├── mpas-o.graph.info.191213.part.4 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.4
    │           ├── mpas-o.graph.info.191213.part.48 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.48
    │           ├── mpas-o.graph.info.191213.part.60 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.60
    │           ├── mpas-o.graph.info.191213.part.64 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.64
    │           ├── mpas-o.graph.info.191213.part.72 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.72
    │           ├── mpas-o.graph.info.191213.part.8 -> ../../../../../graph_partition_ocean/mpas-o.graph.info.191213.part.8
    │           ├── ocean.oQU240.scrip.191213.nc -> ../../../../../scrip/ocean.oQU240.scrip.191213.nc
    │           └── oQU240_no_xtime.nc -> ../../../../../initial_condition_ocean/oQU240_no_xtime.nc
    └── share
        └── domains
            ├── command_history
            ├── domain.lnd.T62_oQU240.191213.nc -> ../../../../domain/domain.lnd.T62_oQU240.191213.nc
            ├── domain.ocn.oQU240.191213.nc -> ../../../../domain/domain.ocn.oQU240.191213.nc
            ├── domain.ocn.T62_oQU240.191213.nc -> ../../../../domain/domain.ocn.T62_oQU240.191213.nc
            ├── gen_domain -> /usr/projects/climate/mpeterse/repos/E3SM/compiled_cime_tools/cime/tools/mapping/gen_domain_files/src/gen_domain
            ├── log.out
            └── map_oQU240_TO_T62_040121_aave.191213.nc -> ../mapping_Gcase/map_oQU240_TO_T62_040121_aave.191213.nc

This is populated with links only, and all file names are correct and ready for the inputdata repo. To grab them all, you can

cd assembled_files_for_upload
tar cvhf inputdata.tar inputdata

and copy it over to another machine. The -h flag makes tar dereference the symlinks, so the archive contains the actual files rather than the links.

Comment on lines 18 to 27
Collaborator

I'm really concerned that these hard-coded paths defeat the purpose of having COMPASS be a self-contained, freely available framework with as few preparatory steps as possible. These things can only be done on LANL IC by someone in the climate group.

I'll let this go through as a temporary solution as long as it's accompanied by an issue tracking the need to generalize this ASAP.

Contributor Author

I completely agree. It is on my to-do list to move everything out to geometric_features, e3sm_unified, or the E3SM inputdata repo. But at least we now have a list of what is left outside the repos.

Contributor Author

@xylar could we add gen_domain to e3sm_unified? I'm not sure if that means add it to MPAS-Tools, or somewhere else. It is this code:
https://github.com/E3SM-Project/E3SM/tree/master/cime/tools/mapping/gen_domain_files

@xylar commented Dec 13, 2019

I'll try to give this a more thorough review when I have time. Just mentioning the concern that comes to mind at a quick glance.

Contributor Author

Another to-do: to be general, COMPASS needs to read its own resolution name. There may be a resolution variable available within COMPASS (I have not seen it). Otherwise, create_E3SM_coupling_files.py could try to parse its path to automatically extract the resolution string (here QU240).

Collaborator

There may be a resolution variable available within COMPASS (I have not seen it).

An argument gets added to the config file here:
https://github.com/MPAS-Dev/MPAS-Model/blob/master/testing_and_setup/compass/setup_testcase.py#L1640

        config.set('script_input_arguments', 'resolution', args.resolution)

However, we are still using the Python 2 version of the config parser:
https://github.com/MPAS-Dev/MPAS-Model/blob/master/testing_and_setup/compass/setup_testcase.py#L25
made available under Python 3 via the six package:

from six.moves import configparser

That config parser doesn't let you refer to config options in sections other than the one you're in. If we switch to the python 3 config parser exclusively (dropping python 2 support), we could use "extended interpolation":
https://docs.python.org/3/library/configparser.html#configparser.ExtendedInterpolation
and do something like:

[main]
mesh_name = o${script_input_arguments:resolution}
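With extended interpolation enabled, the ${section:option} reference above resolves across sections. A minimal self-contained demonstration (the section and option names follow the example above):

```python
from configparser import ConfigParser, ExtendedInterpolation

# ${section:option} is only resolved when ExtendedInterpolation is enabled;
# the default BasicInterpolation cannot reach other sections.
parser = ConfigParser(interpolation=ExtendedInterpolation())
parser.read_string("""
[script_input_arguments]
resolution = QU240

[main]
mesh_name = o${script_input_arguments:resolution}
""")

print(parser.get('main', 'mesh_name'))  # oQU240
```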

Collaborator

I think this would be a pretty trivial change as long as we're willing to drop python 2 support in COMPASS.

Collaborator

I take this back. This is a separate config file, parsed by a separate script, so there isn't a way to get the resolution from the config object in setup_testcase.py. It is probably necessary to parse it out of the path.
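Parsing the resolution out of the work-directory path could look something like this (a sketch assuming the ocean/global_ocean/RESOLUTION/init layout from the setup commands above; mesh_name_from_path is a hypothetical helper, not part of COMPASS):

```python
import os

def mesh_name_from_path(path):
    """Guess the resolution string (e.g. QU240) from a COMPASS work path,
    assuming the layout .../global_ocean/<RESOLUTION>/init/...
    (illustrative sketch)."""
    parts = os.path.normpath(path).split(os.sep)
    idx = parts.index('global_ocean')
    return parts[idx + 1]

print(mesh_name_from_path(
    '/work/ocean/global_ocean/QU240/init/e3sm_coupling'))  # QU240
```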

@mark-petersen

@kristin-hoch and @darincomeau, let's use this PR to create the coupling files for the Southern Ocean mesh. I would like to add the special steps required for ice shelves and data icebergs.

@xylar left a comment

Some comments to get you started. The biggest is that ESMF should ideally be run on many MPI tasks, but I don't know if this is possible because IC doesn't support the mpich implementation that comes from conda-forge and is linked into conda-forge ESMF (and many other packages).

@xylar left a comment

With the changes you've made, this looks good to me. Let me know if you'd like me to run some specific tests.

@mark-petersen

Note: I just spoke with @darincomeau. I should add a flag under main for CORE vs. JRA forcing, which modifies the mapping files. Also, when using --ice_shelf_cavities, the runoff maps need an additional step using copy_cell_indices_from_noLI_to_withLI.py so that runoff does not go under ice shelves.

This is a combination of 17 commits:
- Added e3sm_coupling. Works on grizzly.
- Remove some flags from config file
- small changes
- Alterations from Xylar PR review
- Add arg parser. Did not work
- Autodetect mesh_name from path. Works.
- Change function_list to functions
- add readme
- Add ice_shelf_cavities flag for scrip and mapping files
- Add runoff mapping
- Add salinity restoring. Works.
- PEP8 compliance
- Add nomaskStr for land ice
- Add date_string config. May also autodetect
- Added flag for ice_shelf_cavities
- Add folds
- Remove changes to template_forward.xml
@mark-petersen

@darincomeau, I added directories for JRA mapping files, and the ability to run @matthewhoffman's alteration of the runoff mapping so that runoff does not go below ice shelves. I'm merging this so it doesn't get too old, but I'd like to go over that and make adjustments in a few weeks.

mark-petersen added a commit that referenced this pull request Feb 2, 2020
Adds a script that produces coupling files for E3SM. This PR adds it to the QU240 as a test, but it is set up in a general way, so that any global case can add it to the init process.

The goals for this system are:
- speed: An automated method
- simplicity: so others can use and alter it
- process documentation: ability to record a standard sequence of steps
- expandability: so special cases, like ice shelves and data icebergs, are easy to handle.

I think this method of a single python script within COMPASS satisfies these.

To add this to any case, add a link
```
cd testing_and_setup/compass/ocean/global_ocean/RESOLUTION/init
ln -isf ../../config_files/config_e3sm_coupling.xml  .
```
Then set up an init case for that resolution, and there is a new directory e3sm_coupling with a run.py script.

Documentation of all the steps for coupling files is at
- [Confluence: Simplify E3SM Ocean File Generation](https://acme-climate.atlassian.net/wiki/spaces/OCNICE/pages/763854989/Simplify+E3SM+Ocean+File+Generation)
- [Confluence: Making mapping, runoff, domain files and adding grids to ACME](https://acme-climate.atlassian.net/wiki/spaces/OCNICE/pages/22052884/Making+mapping+runoff+domain+files+and+adding+grids+to+ACME)
@mark-petersen mark-petersen merged commit ba401b4 into MPAS-Dev:ocean/develop Feb 2, 2020
@mark-petersen mark-petersen deleted the ocean/coupling_E3SM branch February 2, 2020 23:12
ashwathsv pushed a commit to ashwathsv/MPAS-Model that referenced this pull request Jul 21, 2020
Adds a script that produces coupling files for E3SM. This PR adds it to the QU240 as a test, but it is set up in a general way, so that any global case can add it to the init process.

The goals for this system are:
- speed: An automated method
- simplicity: so others can use and alter it
- process documentation: ability to record a standard sequence of steps
- expandability: so special cases, like ice shelves and data icebergs, are easy to handle.

I think this method of a single python script within COMPASS satisfies these.

To add this to any case, add a link
```
cd testing_and_setup/compass/ocean/global_ocean/RESOLUTION/init
ln -isf ../../config_files/config_e3sm_coupling.xml  .
```
Then set up an init case for that resolution, and there is a new directory e3sm_coupling with a run.py script.

Documentation of all the steps for coupling files is at
- [Confluence: Simplify E3SM Ocean File Generation](https://acme-climate.atlassian.net/wiki/spaces/OCNICE/pages/763854989/Simplify+E3SM+Ocean+File+Generation)
- [Confluence: Making mapping, runoff, domain files and adding grids to ACME](https://acme-climate.atlassian.net/wiki/spaces/OCNICE/pages/22052884/Making+mapping+runoff+domain+files+and+adding+grids+to+ACME)