Ocean/coastal update (push from ocean/develop) #385
Conversation
In the GM bolus calculation, there is a specified diffusivity and phase speed (see Ferrari et al. 2010). Currently MPAS-O assumes these values are fixed in time and space. This PR allows separate flags to enable 2D varying (+time) phase speed (based on the first baroclinic mode phase speed) and a separate flag for 3D (+time) varying diffusivity.
Updated to v.0.1.2
There was an issue with testing using matplotlib that has (hopefully) been fixed
…bolus velocity * temperature to eddy stats
…develop Updated to v.0.1.3
…develop Added monthly mean zonal and meridional bolus velocities, as well as bolus velocity times temperature, to eddy stats for Labrador Sea bias/GM testing
Removes the vertical averaging of the GM kappa bolus diffusivity. Testing shows this functionality is not beneficial to simulations.
Change order of default namelist to be: main; tracer groups; analysis members; init mode
…velop * ocean/add_gm_to_compass_testing: Change order of default namelist and suggested COMPASS namelist; add GM flags to COMPASS global ocean test
…elop This PR fixes a threading issue that was introduced in E3SM PR#3057 (E3SM-Project/E3SM#3057), which brought in the initial implementation of 3D varying GM bolus. After that commit, some E3SM threading tests were observed to fail with different results. Tested with:
* SMS_P12x2.T62_oQU240.CMPASO-NYF.compy_intel - BFB results 10 times
* SMS_P12x2.ne4_oQU240.A_WCYCL1850.compy_intel.allactive-mach_mods - BFB results 10 times
This last test is one that was failing on sandiatoss3.
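For readers less familiar with E3SM testing: test names like those above are normally run through CIME's create_test driver. Below is a minimal sketch, not the exact commands used for this PR, assuming a standard E3SM checkout and machine configuration on compy; project and batch options are omitted.

```sh
# Hypothetical invocation of CIME's create_test from an E3SM checkout;
# exact paths and batch options depend on the machine setup.
cd E3SM/cime/scripts
./create_test \
    SMS_P12x2.T62_oQU240.CMPASO-NYF.compy_intel \
    SMS_P12x2.ne4_oQU240.A_WCYCL1850.compy_intel.allactive-mach_mods
```

Re-running the same submission repeatedly is presumably how the "BFB results 10 times" statements above were checked.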
Minor updates to MPAS framework. Includes:
- Log file corrections
- python 3 correction in COMPASS
- changes to atmospheric core
The layerThickness pointer was passed into a subroutine, but it is unassociated and never used in that subroutine. This causes an error with Intel 19 compilers. This PR simply removes the unused variable.
This PR is an accumulation of PRs into the ocean/coastal branch. We are bringing them in at once for efficiency: MPAS-Dev#285, MPAS-Dev#289, MPAS-Dev#284, MPAS-Dev#295, MPAS-Dev#310, MPAS-Dev#311, MPAS-Dev#312, MPAS-Dev#335, MPAS-Dev#354, MPAS-Dev#356, MPAS-Dev#358, MPAS-Dev#359, MPAS-Dev#365, MPAS-Dev#371
This fixes these:
1. The seg fault mentioned in MPAS-Dev#374 (comment). When we use `config_am_mocstreamfunction_normal_velocity_value = 'normalTransportVelocity'`, it tries to access from the 'state' var_struct rather than 'diagnostic' here: `496 call mpas_pool_get_array(statePool, normalVelocityArrayName, normalVelocity, timeLevel)`. I propose that we always use 'normalTransportVelocity' here for all simulations, and remove this flag.
2. The StrLen in the new MOC mask file is 1024 rather than 64. This causes all string arrays to be read in as blank, in particular `transectGroupNames` and `regionGroupNames`, so a match is never found, and `regionGroupNumber` and `transectGroupNumber` remain zero and can cause a seg fault. It appears StrLen=64 is hardwired within the MPAS framework, though I couldn't find it. The simple solution here is to just use StrLen=64. On the current mask file: `ncks -d StrLen,0,63 in.nc out.nc`
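As a sanity check (a sketch added here for clarity, not part of the original PR; in.nc and out.nc are placeholder file names), the string dimension can be inspected before and after the truncation with standard netCDF tools:

```sh
ncdump -h in.nc  | grep -i StrLen      # expect: StrLen = 1024
ncks -d StrLen,0,63 in.nc out.nc       # truncate the string dimension to 64 characters
ncdump -h out.nc | grep -i StrLen      # expect: StrLen = 64
```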
These three 3D variables are in the Registry, and one is obtained, but none are used. Git blame shows they were added by Todd and Doug in 2013, but then Qingshan Chen added his own naming scheme for GM.
…evelop Currently, rainFlux and evaporationFlux are in the thicknessBulkPKG package. If `config_use_bulk_thickness_flux = .false.`, the package is off and these arrays are not allocated, but they are still accessed within various loops, which causes a seg fault. In E3SM, surface fluxes are always on; the seg fault occurs in idealized cases where the thickness flux is off. For those cases it is preferable to simply compute with zeros, rather than split up the loops and add checks for the packages, which would reduce the performance of global runs.
This is a simple mistake that needs to be fixed; it causes a seg fault when using more than three regions.
How do you do a pull request of a force push? I thought a PR was always a merge.

@xylar, it was just a direct push, not a force-push. I'll edit the above description accordingly because the force-push wasn't needed.

Thanks!

I think @mark-petersen has in mind to do a hard reset so ocean/coastal points to the exact same hash as ocean/develop. I would think that would require a force push but I could be wrong.
I think we have that, unless I've missed something. Are you seeing something different @xylar?

└─▪ git fetch --all
...
Fetching pwolfram
...
   6fe9af3..7053845  ocean/coastal -> MPAS-Dev/ocean/coastal
   559c823..7053845  ocean/develop -> MPAS-Dev/ocean/develop
...
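As a side note (not from the thread itself), one way to confirm that both remote-tracking refs now resolve to the same commit, assuming the remote is named MPAS-Dev locally as in the fetch output above:

```sh
git fetch MPAS-Dev
git rev-parse MPAS-Dev/ocean/coastal MPAS-Dev/ocean/develop   # both should print the same hash (7053845...)
```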
Yes, this all looks right.

We also have a record of what we did, which is key. It is possible this has broken something, but at least we have a landing page and a record of when it went wrong (and potentially how, in terms of a commit range).

Okay, like I said, I just didn't know that was possible.
This is just a reset of ocean/coastal using ocean/develop following #384, as recommended by @mark-petersen. This is not strictly needed but is to document the push of `ocean/develop` onto `ocean/coastal`.
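For the record, here is a hedged sketch of the kind of reset and push described above. These are not the exact commands that were run; the sketch assumes write access to the MPAS-Dev remote, a local ocean/coastal branch, and that ocean/coastal was strictly behind ocean/develop, so no force push is needed.

```sh
# Hypothetical reconstruction of the branch update; remote and branch names
# follow the fetch output quoted earlier in the conversation.
git fetch MPAS-Dev
git checkout ocean/coastal
git reset --hard MPAS-Dev/ocean/develop   # point ocean/coastal at the same commit as ocean/develop
git push MPAS-Dev ocean/coastal           # fast-forwards if coastal had no extra commits
# If ocean/coastal had diverged, this push would be rejected and a force push
# (git push --force-with-lease) would be required, as discussed above.
```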