Possible bugs in interpolation and nesting

Reporting of bugs in the COHERENS code. Please report in detail the problem/bug you found and provide the information necessary to reproduce it. This information will be useful to trace back the bug and fix it.

Moderator: pluyten

ropponen
Posts: 7
Joined: Thu Dec 04, 2014 11:45 am
Full name: Janne Ropponen

Possible bugs in interpolation and nesting

Post by ropponen »

Hi,

I'm having difficulties including gridded weather data in COHERENS V2.9 (and in V2.10.3). Both the model grid and the weather grid are uniform rectangular (nhtype=1), but of different sizes; the weather grid has a coarser resolution.

The problem is that the weather data is not interpolated correctly (or at all) in some calculation domains when using more than 10 domains (V2.9), or the model crashes in the initialisation phase or early in the run (V2.10.3). The interpolation seemed to work better in COHERENS V2.5.1. Here are some example pictures of air temperature interpolation in V2.5.1 and in V2.9: Interpolation examples. The same effect is seen in other variables too. The problems seem to be the same regardless of whether the weather grid is applied to lake or marine cases (Cartesian or spherical coordinates).

My colleagues have found some suspicious code lines in several files that may or may not be related to the issue, but even trying to fix these has not solved the problem yet. They are listed below. There seems to be some trouble in both interpolation and nesting. The nesting issues are probably unrelated to the interpolation problems, but I will list them here too.

I'd really appreciate some help with these issues.

In V2.9 the main issue seems to be the incorrect interpolation. In V2.10.3 the initialisation seems unable to recognise that the model grid is indeed entirely within the surface data grid.

Possible bugs:

V2.9: grid_interp.F90, rows 1389 and 1392:
outvals(i,j,:) = outflag ! Should this be flagout or 0.0? If flagout, the model crashes later as it tries to calculate with too large values.
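
To illustrate what we mean, here is a small standalone sketch (not the actual grid_interp.F90 code; the array shape and the flag magnitude are just assumptions for the example). If the model points outside the weather grid keep a large flag value, it leaks into the later physics; assigning 0.0 at least keeps the run alive, even though the value may still be physically wrong:

PROGRAM outflag_sketch
! Standalone sketch, not the actual grid_interp.F90 code: what value should
! model points outside the weather grid receive after interpolation?
IMPLICIT NONE
REAL, PARAMETER :: flagout = 999.9     ! assumed flag magnitude, for illustration only
REAL, DIMENSION(3,3) :: outvals
LOGICAL, DIMENSION(3,3) :: inside
INTEGER :: i, j

outvals = 12.0                         ! interpolated air temperature where data exist
inside = .TRUE.
inside(3,:) = .FALSE.                  ! pretend the last row falls outside the weather grid

DO j=1,3
   DO i=1,3
      IF (.NOT.inside(i,j)) THEN
         outvals(i,j) = flagout        ! with the flag value, later physics sees ~1000 degC air
!        outvals(i,j) = 0.0            ! with 0.0 the run survives, but the value is still wrong
      ENDIF
   ENDDO
ENDDO

DO i=1,3
   WRITE (*,'(3F8.1)') outvals(i,:)
ENDDO

END PROGRAM outflag_sketch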

V2.9: Nested_grids.F90, row 2761:
Looks like we need to add filepars = modfiles(io_nstgrd,iset,2) after the call to set_modfiles_atts, which changes the %novars attribute. Otherwise nesting crashes.
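
The pattern we suspect, as a standalone sketch (the type and routine below are only stand-ins, not the real modfiles/set_modfiles_atts structures): a local copy of the file descriptor is taken before the call that updates %novars, so the copy stays stale unless it is refreshed afterwards.

PROGRAM stale_copy_sketch
! Standalone sketch, not the actual Nested_grids.F90 code: a local copy of a
! file-descriptor structure goes stale when a later call updates the original.
IMPLICIT NONE

TYPE :: FilePars
   INTEGER :: novars
END TYPE FilePars

TYPE (FilePars) :: modfile     ! stands in for modfiles(io_nstgrd,iset,2)
TYPE (FilePars) :: filepars    ! local working copy

modfile%novars = 1
filepars = modfile             ! copy taken here ...
CALL set_atts(modfile)         ! ... but this call changes %novars in the original
!filepars = modfile            ! suggested fix: refresh the copy after the call

WRITE (*,*) 'original novars:', modfile%novars, '  stale copy:', filepars%novars

CONTAINS

SUBROUTINE set_atts(f)
! Stand-in for set_modfiles_atts: updates the number of variables in the file.
TYPE (FilePars), INTENT(INOUT) :: f
f%novars = 3
END SUBROUTINE set_atts

END PROGRAM stale_copy_sketch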

V2.10.3: grid_interp.F90, row 1658:
j.LT.lbounds(1) --> Should this be: j.LT.lbounds(2) ?
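
A standalone sketch of why we suspect an index mix-up here (lbounds and the values are assumptions, not the real arrays): if the local domain starts at different indices in the i and j directions, testing j against the i-bound gives the wrong answer.

PROGRAM lbounds_sketch
! Standalone sketch, not the actual grid_interp.F90 code: testing the j-index
! against the i-bound only works when both lower bounds happen to be equal.
IMPLICIT NONE
INTEGER, DIMENSION(2) :: lbounds = (/ 5, 20 /)   ! assumed lower bounds in i and j
INTEGER :: j

j = 10
WRITE (*,*) 'j.LT.lbounds(1):', j.LT.lbounds(1)  ! F -> point wrongly accepted
WRITE (*,*) 'j.LT.lbounds(2):', j.LT.lbounds(2)  ! T -> point correctly rejected

END PROGRAM lbounds_sketch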

V2.10.3: grid_interp.F90, row 2856:
CALL hrel_coords_rect_unif: the parameters delxdat, delydat might be from the wrong grid (model vs. weather grid)?
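
For illustration, a standalone sketch of a relative-coordinate calculation on a uniform rectangular grid (the names and values are assumptions, not the actual arguments of hrel_coords_rect_unif): the fractional position of a model point inside the data grid only comes out right when the spacing of the data (weather) grid is used.

PROGRAM relcoord_sketch
! Standalone sketch, not the actual hrel_coords_rect_unif code: relative
! coordinates of a model point within a uniform data grid.
IMPLICIT NONE
REAL, PARAMETER :: x0dat   = 0.0        ! weather-grid origin (assumed)
REAL, PARAMETER :: delxdat = 5000.0     ! weather-grid spacing (assumed)
REAL, PARAMETER :: delxmod = 500.0      ! model-grid spacing (assumed)
REAL :: x, rel_ok, rel_bad

x = 12500.0                             ! x-coordinate of a model grid point
rel_ok  = (x - x0dat)/delxdat           ! 2.5 -> 2.5 data cells from the weather-grid origin
rel_bad = (x - x0dat)/delxmod           ! 25.0 -> apparently far outside the weather grid

WRITE (*,*) 'with delxdat:', rel_ok, '  with delxmod:', rel_bad

END PROGRAM relcoord_sketch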

Also, nesting with multiple variables (novars>1) seems unstable and writing/reading multivariable NetCDF files doesn't seem to work properly.

Regards,
- Janne
pluyten
Posts: 6
Joined: Wed Aug 13, 2014 2:38 pm
Full name: Patrick Luyten

Re: Possible bugs in interpolation and nesting

Post by pluyten »

Dear Janne,

Firstly, I need to apologise for this late reply, but I can't always find the time to check the forum. Some of the bugs have already been corrected in the latest (draft) version of the code, which is not yet released. Others are new. Replies are given below.

Many thanks for checking the code!

Regards,

Patrick

Hi,

I'm having difficulties including gridded weather data in COHERENS V2.9 (and in V2.10.3). Both the model grid and the weather grid are uniform rectangular (nhtype=1), but of different sizes; the weather grid has a coarser resolution.

The problem is that the weather data is not interpolated correctly (or at all) in some calculation domains when using more than 10 domains (V2.9), or the model crashes in the initialisation phase or early in the run (V2.10.3). The interpolation seemed to work better in COHERENS V2.5.1. Here are some example pictures of air temperature interpolation in V2.5.1 and in V2.9: Interpolation examples. The same effect is seen in other variables too. The problems seem to be the same regardless of whether the weather grid is applied to lake or marine cases (Cartesian or spherical coordinates).

My colleagues have found some suspicious code lines in several files that may or may not be related to the issue, but even trying to fix these has not solved the problem yet. They are listed below. There seems to be some trouble in both interpolation and nesting. The nesting issues are probably unrelated to the interpolation problems, but I will list them here too.

I'd really appreciate some help with these issues.

In V2.9 the main issue seems to be the incorrect interpolation. In V2.10.3 the initialisation seems unable to recognise that the model grid is indeed entirely within the surface data grid.

Possible bugs:

V2.9: grid_interp.F90, rows 1389 and 1392:
outvals(i,j,:) = outflag ! Should this be flagout or 0.0? If flagout, the model crashes later as it tries to calculate with too large values.
**This has already been corrected in the upcoming release.

V2.9: Nested_grids.F90, row 2761:
Looks like we need to add filepars = modfiles(io_nstgrd,iset,2) after the call to set_modfiles_atts, which changes the %novars attribute. Otherwise nesting crashes.
**Is corrected in the upcoming release.

V2.10.3: grid_interp.F90, row 1658:
j.LT.lbounds(1) --> Should this be: j.LT.lbounds(2) ?
**You are right. Corrected in the upcoming release.

V2.10.3: grid_interp.F90, row 2856:
CALL hrel_coords_rect_unif: the parameters delxdat, delydat might be from the wrong grid (model vs. weather grid)?
**I think the code is correct here?

Also, nesting with multiple variables (novars>1) seems unstable and writing/reading multivariable NetCDF files doesn't seem to work properly.
**This part of the code has never been checked in a real application. Suggestions for improvement are welcome!

Regards,
- Janne
ropponen
Posts: 7
Joined: Thu Dec 04, 2014 11:45 am
Full name: Janne Ropponen

Re: Possible bugs in interpolation and nesting

Post by ropponen »

Thank you Patrick!

I totally understand your busy schedule. I'm very much looking forward to testing whether the new code release will work for us. It has been difficult to fix these issues ourselves, as one fix seems to lead to another problem (or maybe we have tried to fix the wrong place), so we've been rather stuck.

Working NetCDF reading/writing in nesting (in place of the A or U format) would be very convenient, because the data are then much more "readable" later. Currently the code accepts NetCDF as the format, but there seem to be some issues with actually using it. This can, however, be worked around by using the other formats.

Multivariable nesting is needed in applications using the sediment module, I think. We are also trying to use multivariable nesting with some of our biology module code. I'm not sure whether this falls into the feature request or the bug category, but I mention it here because the code seems to accept novars>1 internally but then doesn't work as expected.

I'll see if we can provide some example code.