
Add ECCC driver for coupling with NEMO, update 'hadgem3' driver #605

Merged: 15 commits (Jun 10, 2021)

Conversation

phil-blain
Member

PR checklist

  • Short (1 sentence) summary of your PR:
    see title
  • Developer(s):
    P. Blain.
  • Suggest PR reviewers from list in the column to the right.
  • Please copy the PR test results link or provide a summary of testing completed below.
    Only driver changes. Was tested in a NEMO-CICE executable over the last 8 months.
  • How much do the PR code changes differ from the unmodified code?
    • bit for bit (no changes to cicecore outside drivers/)
    • different at roundoff level
    • more substantial
  • Does this PR create or have dependencies on Icepack or any other models?
    • Yes
    • No
  • Does this PR add any new test cases?
    • Yes
    • No
  • Is the documentation being updated? ("Documentation" includes information on the wiki or in the .rst files from doc/source/, which are used to create the online technical docs at https://readthedocs.org/projects/cice-consortium-cice/. A test build of the technical docs will be performed as part of the PR testing.)
    • Yes
    • No, does the documentation need to be updated at a later time?
      • Yes
      • No
  • Please provide any additional information or relevant details below:

Historically we've used the "hadgem3" driver when compiling NEMO and CICE together in a single executable. When I started updating our systems to CICE6, I also used this driver and had to fix it, since it had not always been kept up to date with the rest of the CICE6 code base. In the end, we decided it would be better and clearer to add our own driver, so I recently copied the hadgem3 driver (after fixing it) and made a small change that we had carried internally for some time (ba124c6).

@JFLemieux73 @dupontf

Add the required 'iblk' argument.
The subroutines 'prep_radiation', 'zsal_diags', 'bgc_diags' and 'hbrine_diags'
do not take a 'dt' argument anymore, so remove it.
This subroutine is inside an 'ICE_DA' CPP, which is not used in
any configuration. Remove it.
This '+' sign was copy-pasted there in error in 29b99b6 (CICE: Floe size
distribution (CICE-Consortium#382), 2019-12-07). Remove it.
Remove the call to 'check_finished_file' as well as the definition
of the subroutine, as the 'hadgem3' driver is not used on machine 'bering'
and it's unclear if machine 'bering' still exists.
This was forgotten back in 8b0ae03 (Refactor tracer initialization (CICE-Consortium#235), 2018-11-16)
The hadgem3 driver was not updated when 'save_init' was added in 83686a3
(Implement box model test from 2001 JCP paper (CICE-Consortium#151), 2018-10-22). As
this subroutine is necessary to ensure proper initialization of the
model, add it now.
Other drivers use 'ilo,ihi' and 'jlo,jhi' here. Do the same.
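
For readers unfamiliar with the block layout, here is a minimal standalone sketch of the loop-bound convention this refers to: 'ilo,ihi' and 'jlo,jhi' delimit the physical (non-halo) cells of a block. The array size and halo width below are invented for illustration and are not the actual CICE values.

```fortran
! Toy sketch of looping over a block's physical cells using ilo/ihi, jlo/jhi.
! Sizes and halo width are made up; this is not the CICE data structure.
program block_loop_sketch
   implicit none
   integer, parameter :: nghost = 1, nx = 10, ny = 8
   integer :: ilo, ihi, jlo, jhi, i, j
   real :: aice(nx+2*nghost, ny+2*nghost)

   ilo = nghost + 1;  ihi = nghost + nx
   jlo = nghost + 1;  jhi = nghost + ny

   aice = 0.0
   do j = jlo, jhi        ! physical cells only; halo rows/columns skipped
      do i = ilo, ihi
         aice(i,j) = 1.0
      end do
   end do
   print *, 'physical cells set:', count(aice > 0.5), '=', nx*ny
end program block_loop_sketch
```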
In 066070e (Fix minor issues in documentation, key_ CPPs, bfbcomp return
codes (CICE-Consortium#532), 2020-11-23), the 'ice_communicate' module was updated to
remove CPP macros relating to the OASIS coupler (key_oasis*) and to the
NEMO ocean model (key_iomput). These CPPs were used to make the correct
MPI communicator accessible to the 'init_communicate' subroutine.
However, that subroutine already accepts an optional MPI communicator as an
argument, and it was deemed cleaner to require the driver layer to
explicitly pass the communicator instead of making it accessible
through 'use' statements.

Update the 'hadgem3' driver, used for coupling with NEMO, to explicitly
pass the NEMO MPI communicator 'mpi_comm_opa' to 'init_communicate'.
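
For illustration, here is a minimal standalone sketch of the pattern described above: an initialization routine that takes an optional MPI communicator, and a driver layer that passes the host model's communicator explicitly. The module and program structure are invented for this sketch and are not the actual CICE or NEMO source.

```fortran
! Sketch only: an init routine with an optional communicator argument,
! called explicitly by the driver instead of relying on CPP macros.
module comm_sketch
   use mpi
   implicit none
   integer :: ice_comm   ! communicator used by the sea-ice model
contains
   subroutine init_communicate(mpi_comm)
      integer, intent(in), optional :: mpi_comm
      if (present(mpi_comm)) then
         ice_comm = mpi_comm          ! communicator supplied by the driver
      else
         ice_comm = MPI_COMM_WORLD    ! stand-alone default
      end if
   end subroutine init_communicate
end module comm_sketch

program driver_sketch
   use mpi
   use comm_sketch, only: init_communicate, ice_comm
   implicit none
   integer :: ierr, mpi_comm_opa, nprocs
   call MPI_Init(ierr)
   ! In a coupled NEMO-CICE executable, 'mpi_comm_opa' would be NEMO's
   ! communicator; here we duplicate MPI_COMM_WORLD to stand in for it.
   call MPI_Comm_dup(MPI_COMM_WORLD, mpi_comm_opa, ierr)
   call init_communicate(mpi_comm_opa)   ! explicit, as described above
   call MPI_Comm_size(ice_comm, nprocs, ierr)
   print *, 'ice model running on', nprocs, 'tasks'
   call MPI_Finalize(ierr)
end program driver_sketch
```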
Historically the 'hadgem3' driver has been used when compiling a single
NEMO-CICE executable at ECCC.

Going forward, all driver-level adjustments will be done in our own
driver, 'nemo_concepts', 'CONCEPTS' being the name of the
multi-departmental collaboration around using the NEMO ocean model.

Copy CICE_InitMod, CICE_RunMod and CICE_FinalMod from the 'hadgem3'
directory to a new 'nemo_concepts' directory under 'drivers/direct'.

The following commits will clean up this new driver and port over some
in-house adjustments.
This subroutine was only called on machine 'bering', which is not an
ECCC machine and probably does not exist anymore anyway. Remove it.
Since 'merge_fluxes' is called with 'aice_init', it is more consistent to
also call 'scale_fluxes' in 'coupling_prep' with 'aice_init'
instead of 'aice'.

Copy this in-house change to the new 'nemo_concepts' driver.
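
A toy standalone illustration of the consistency argument: fluxes merged per unit of grid-cell area using the start-of-step ice area ('aice_init') should also be scaled back to per-ice-area values with 'aice_init', not the end-of-step 'aice'. The numbers and the flux-scaling form below are assumptions for illustration, not the actual 'merge_fluxes'/'scale_fluxes' interfaces.

```fortran
! Toy sketch: scaling merged fluxes back by a different ice area than the
! one used to merge them changes the per-ice-area flux seen by the coupler.
program flux_scaling_sketch
   implicit none
   real :: aice_init, aice, fhocn_cell, fhocn_ice

   aice_init = 0.8        ! ice area fraction at the start of the step
   aice      = 0.7        ! ice area fraction after the step (ice has melted)

   ! "merge": a 50 W/m^2 per-ice-area flux accumulated over aice_init
   fhocn_cell = 50.0 * aice_init

   ! "scale": recover the per-ice-area flux for the coupler
   fhocn_ice = fhocn_cell / aice_init   ! consistent: recovers 50 W/m^2
   print *, 'scaled with aice_init:', fhocn_ice
   fhocn_ice = fhocn_cell / aice        ! inconsistent: ~57 W/m^2
   print *, 'scaled with aice     :', fhocn_ice
end program flux_scaling_sketch
```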
Contributor

@eclare108213 left a comment


Looks fine to me. A general question for whoever might be watching:
Is anyone else still using the hadgem3 driver from this repository?

@apcraig
Contributor

apcraig commented Jun 3, 2021

Just to clarify, ECCC is using the nemo_concepts driver now? And the hadgem3 modifications just fix a few issues that were identified when that driver was being tested prior to switching drivers?

Where did the name "nemo_concepts" come from? If everyone thinks this is a good name, we can go for it, but what about a name like "eccc" or just "nemo" instead? Or even "ecccnemo" or "nemoeccc". I assume that nemo is calling CICE in this configuration and that this is specifically implemented for the eccc model? Does the eccc model have a model name (i.e. ncar vs cesm)? Maybe it (or parts) could be used for another application, but I think we've found that each application model tends to end up having at least a little bit of customization. An ongoing question is how much effort to make to keep multiple applications synced up to a single driver implementation (say, all nuopc users). That may impact some of these driver names, but it's hard to predict how these are going to be used in the future, so it is hard to pick good driver names. I don't think the CICE Consortium needs to be too heavy-handed in that regard in any case.

My personal preference is a directory name without an underscore, to be consistent with the other names we have now, and a name that reflects the coupling more explicitly and simply, like the names suggested above and the other driver names. But again, I think the driver space is largely driven by the applications, and I'm not going to enforce any particular driver directory naming convention. Ultimately, whatever name is chosen here is likely to be long-lived, so I think it's just good to ask whether we're happy with the driver name whenever a new driver is introduced.

@phil-blain
Member Author

Just to clarify, ECCC is using the nemo_concepts driver now?

No. We are using the "hadgem3" driver for our operational systems (which run CICE 4.0), though not the one from this repository. I'm not exactly sure how we acquired it, since it's not present in the CICE 4.0 drivers directory
(https://github.com/CICE-Consortium/CICE-svn-trunk/tree/svn/tags/release-4.0/drivers). I guess it was passed to us by UKMet. Anyway, we have some custom modifications on top of it also.

And the hadgem3 modifications just fix a few issues that were identified when that driver was being tested prior to switching drivers?

When I started the project of updating to CICE6, I used the hadgem3 driver from this repository, and fixed it so it would build and run without crashing.

Where did the name "nemo_concepts" come from? If everyone thinks this is a good name, we can go for it, but what about a name like "eccc" or just "nemo" instead? Or even "ecccnemo" or "nemoeccc".

As I mention in the commit message of fa7ab6c, we felt that we did not want to limit the driver name to ECCC since it's also used by other departments within the government of Canada, under an initiative named "CONCEPTS".

I assume that nemo is calling CICE in this configuration and that this is specifically implemented for the eccc model?

Yes, NEMO and CICE are compiled in a single executable and NEMO calls CICE_Initialize, CICE_Run and CICE_Finalize. This is not specific to ECCC; it is in the NEMO model (see http://forge.ipsl.jussieu.fr/nemo/browser/NEMO/releases/release-3.6/NEMOGCM/NEMO/OPA_SRC/SBC/sbcice_cice.F90), although we also have custom modifications on top in this file.
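
Schematically, the call sequence looks like the sketch below. This is a standalone illustration with dummy stubs; the loop structure and empty argument lists are assumptions, and the real calls live in NEMO's sbcice_cice.F90 and the CICE driver modules.

```fortran
! Sketch only: a host ocean model driving the sea-ice model through the
! CICE driver entry points in a single executable.
program host_model_sketch
   implicit none
   integer :: istep, nsteps

   nsteps = 10                 ! stand-in for the ocean model's time loop

   call CICE_Initialize        ! set up grid, state, coupling fields

   do istep = 1, nsteps
      ! ... ocean physics, exchange of coupling fields ...
      call CICE_Run            ! advance the sea-ice model
   end do

   call CICE_Finalize          ! write restarts, clean up
contains
   ! Dummy stubs so this sketch compiles on its own.
   subroutine CICE_Initialize
      print *, 'CICE_Initialize called'
   end subroutine
   subroutine CICE_Run
      print *, 'CICE_Run called'
   end subroutine
   subroutine CICE_Finalize
      print *, 'CICE_Finalize called'
   end subroutine
end program host_model_sketch
```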


As for the name of the driver, I don't have a strong opinion. Maybe @JFLemieux73 and @dupontf want to add more thoughts...

@apcraig
Contributor

apcraig commented Jun 3, 2021

Thanks @phil-blain. So the current operational ECCC model is using a custom driver (not in the consortium) and is coupled to CICE4. CICE6 works in some set of Canadian projects with the hadgem3 driver. It also works with the nemo_concepts driver (which is very similar to the hadgem3 driver). And it's the nemo_concepts driver that a set of Canadian projects are likely to use moving forward. Makes sense.

Thanks for the "concepts" explanation. Again, I'm fine keeping it as "nemo_concepts" if that's what you want.

@phil-blain
Member Author

Also, I forgot to answer that question:

Does the eccc model have a model name (i.e. ncar vs cesm)?

So just to clarify: we have several (some would say too many!) forecasting systems that have lovely acronyms like GDPS (Global Deterministic Prediction System), RIOPS (Regional Ice-Ocean Prediction System), etc., but they all use the same model(s), i.e. NEMO for the ocean and CICE for the sea ice. At least that's the terminology that we are using.

@apcraig
Contributor

apcraig commented Jun 8, 2021

Anyone want to change nemo_concepts or any other comments? If I don't hear otherwise, I'll merge tomorrow.

@apcraig apcraig merged commit a63cc1c into CICE-Consortium:master Jun 10, 2021