Parcels with ECCO4v3 native grid #1012
Replies: 15 comments
-
Hi Louise,
First, I've seen in the link you provided that ECCO4v3 runs on a C-staggered grid, right? You can read C grids natively in Parcels: see for example https://nbviewer.jupyter.org/github/OceanParcels/parcels/blob/master/parcels/examples/tutorial_nemo_curvilinear.ipynb
Then there is the question of the 13 tiles: you mean this grid? https://ecco-v4-python-tutorial.readthedocs.io/_images/llc90.png
We haven't used such a grid with Parcels so far, so we haven't developed a specific way to handle it. You could, however, do it yourself. There are two options:
Option 1: You divide your field into 13 subfields. You can do this in Parcels by loading the field 13 times, using indices to load only part of it each time. Then, in the kernel defining the particle dynamics, you set which field to interpolate, like:
if particle.lon < 10:
    (u, v) = fieldset.UV_1[time, particle.depth, particle.lat, particle.lon]
elif ....:
    (u, v) = fieldset.UV_2[time, particle.depth, particle.lat, particle.lon]
elif ....:
    ...
Option 2: You keep the field in one single piece, but you tune the functions in index_search.h. Those functions determine in which cell a particle is located. This is a nicer but more complicated option than the previous one.
I hope those suggestions help!
Philippe
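To make the loading step of Option 1 concrete, here is a minimal, untested sketch; the file name, variable names, and the assumption that the 13 tiles are stacked in blocks of 90 rows along the y-axis are all hypothetical and would need adjusting to the real data:

from parcels import FieldSet

# Hypothetical collated file; adjust names and tile layout to the real data
filenames = {'U': 'ecco4v3_collate_faces.nc', 'V': 'ecco4v3_collate_faces.nc'}
variables = {'U': 'U', 'V': 'V'}
dimensions = {'lon': 'XG', 'lat': 'YG', 'time': 'time'}

tile_fieldsets = []
for tile in range(13):
    # Load only this tile's rows (assumes 13 tiles of 90 rows stacked along y)
    indices = {'lat': list(range(tile * 90, (tile + 1) * 90))}
    tile_fieldsets.append(
        FieldSet.from_netcdf(filenames, variables, dimensions, indices=indices))

The 13 pairs of fields would then still need to be combined into a single FieldSet (e.g. by renaming the fields and adding vector fields named UV_1, UV_2, ... so they match the kernel sketched above).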
-
Hi Philippe,
Thanks for the support. We are currently trying to collate all the tiles into one file to make it easier to use with Parcels.
I have several other questions though:
- For reading a C grid in Parcels, do we need to specifically use FieldSet.from_nemo as in the example? I ask because it is the only FieldSet method that has the option tracer_interp_method='cgrid_tracer' (at least from what I found in the documentation).
- Is it possible to deal with "lost" particles? Say, if a particle reaches a certain area, to stop its advection?
Thanks
Louise
-
Hi Louise,
You should use FieldSet.from_c_grid_dataset() (it's a new feature; if you use an older version of Parcels, you can use from_nemo). The only functionality of this function is calling from_netcdf and setting the correct interp_method for each variable, which should be cgrid_velocity for velocities and cgrid_tracer for tracers. See more details about the C grid in Parcels in this submitted paper: https://www.geosci-model-dev-discuss.net/gmd-2018-339/
You can deal with lost particles in many ways. You can for example write your own kernels, in a way like this:
def AdvectionEE(particle, fieldset, time):
    if particle.lat < 80:
        (u1, v1) = fieldset.UV[time, particle.depth, particle.lat, particle.lon]
        particle.lon += u1 * particle.dt
        particle.lat += v1 * particle.dt
Or you can also use recovery kernels to do something instead of crashing if the particle is out of the grid (see cells 4 and 5 here: https://github.com/OceanParcels/parcels/blob/master/parcels/examples/tutorial_Agulhasparticles.ipynb)
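For the specific case Louise asked about (stopping a particle once it reaches a given area), a minimal sketch of such a kernel might look like the following; the bounding box and run settings are made up for illustration:

from datetime import timedelta
from parcels import AdvectionRK4

def StopInBox(particle, fieldset, time):
    # Hypothetical target region: stop tracking the particle once it enters the box
    if particle.lon > 10. and particle.lon < 20. and particle.lat > -45. and particle.lat < -35.:
        particle.delete()

# Chain the check onto the advection kernel
pset.execute(pset.Kernel(AdvectionRK4) + pset.Kernel(StopInBox),
             runtime=timedelta(days=30), dt=timedelta(hours=1))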
-
Hi Philippe,
We have made some progress in collating the ECCO4v3 tiles into one file for easier use in Parcels. I tried to run a simple test with those fields, but I ran into this RuntimeWarning and then got an out-of-bounds error:
from parcels import (FieldSet, Field, ParticleSet, JITParticle, AdvectionRK4_3D,
                     Variable, plotTrajectoriesFile, ErrorCode)
from netCDF4 import Dataset
from datetime import timedelta, datetime
from os import path
from glob import glob
import numpy as np

dataDir = '/Users/rousselet.l/LOUISE/DATA/ECCO4v3/nctiles_climatology/'

# load netcdf data
var = ['U', 'V']

filenames = {'U': {'depth': dataDir + 'ecco4v3_collate_faces.nc',
                   'lat': dataDir + 'ecco4v3_collate_faces.nc',
                   'lon': dataDir + 'ecco4v3_collate_faces.nc',
                   'data': dataDir + 'ecco4v3_collate_faces.nc'},
             'V': {'depth': dataDir + 'ecco4v3_collate_faces.nc',
                   'lat': dataDir + 'ecco4v3_collate_faces.nc',
                   'lon': dataDir + 'ecco4v3_collate_faces.nc',
                   'data': dataDir + 'ecco4v3_collate_faces.nc'},
             'W': {'depth': dataDir + 'ecco4v3_collate_faces.nc',
                   'lat': dataDir + 'ecco4v3_collate_faces.nc',
                   'lon': dataDir + 'ecco4v3_collate_faces.nc',
                   'data': dataDir + 'ecco4v3_collate_faces.nc'}}
variables = {'U': 'U',
             'V': 'V',
             'W': 'W'}
dimensions = {'lon': 'XG', 'lat': 'YG', 'depth': 'RC', 'time': 'time_counter'}
field_set = FieldSet.from_c_grid_dataset(filenames, variables, dimensions,
                                         mesh='spherical',
                                         tracer_interp_method='cgrid_velocity')

This prints:

WARNING: Casting lon data to np.float32
WARNING: Casting lat data to np.float32
WARNING: Casting depth data to np.float32

Then, creating the ParticleSet:

pset = ParticleSet(fieldset=field_set, pclass=JITParticle, lon=0, lat=-40, depth=-50, time=0)

/Users/rousselet.l/miniconda3/envs/py3_parcels/lib/python3.7/site-packages/parcels/grid.py:139: RuntimeWarning: invalid value encountered in less
  dx = np.where(dx < -180, dx+360, dx)
/Users/rousselet.l/miniconda3/envs/py3_parcels/lib/python3.7/site-packages/parcels/grid.py:140: RuntimeWarning: invalid value encountered in greater
  dx = np.where(dx > 180, dx-360, dx)

And executing it:

pset.execute(AdvectionRK4_3D,
             runtime=timedelta(days=365),
             dt=timedelta(hours=6),
             output_file=pset.ParticleFile(name="test.nc", outputdt=timedelta(days=5)))

OutOfBoundsError: 0
Particle P[0](lon=0.000000, lat=-40.000000, depth=50.000000, time=0.000000)
Time: 1.0, timestep dt: 2.000000
Out-of-bounds sampling by particle at (0.000000, -40.000000, 50.000000)
Do you think this issue can come from the XG (lon) grid and the fact that it is not strictly increasing (see figure)? What would be your suggestion?
Thanks,
Louise
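As a quick way to check the monotonicity issue raised here, a hedged sketch reusing the file path and the XG variable name from the snippet above:

import numpy as np
import xarray as xr

ds = xr.open_dataset(dataDir + 'ecco4v3_collate_faces.nc')  # dataDir as defined above
xg = ds['XG'].values
# True only if longitude increases strictly along the x-axis everywhere;
# for a collated LLC grid this is expected to be False
print(bool(np.all(np.diff(xg, axis=-1) > 0)))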
-
Hi Louise, and sorry for the late response.
-
As explained above, a grid divided into multiple tiles looks more like an unstructured mesh, for which Parcels fails to find in which mesh cell the particle is located. A more stable solution might become available once we use trees to locate the particle (see #599).
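To illustrate the idea behind #599, here is a sketch of a tree-based cell search; this is not Parcels code, and the coordinates are random placeholders:

import numpy as np
from scipy.spatial import cKDTree

# Placeholder curvilinear cell-centre coordinates (shape ny x nx)
lon_c = np.random.uniform(-180., 180., size=(90, 90))
lat_c = np.random.uniform(-90., 90., size=(90, 90))

# Build a tree over all cell centres, then query the particle position
tree = cKDTree(np.column_stack([lon_c.ravel(), lat_c.ravel()]))
dist, idx = tree.query([12.5, -40.2])
j, i = np.unravel_index(idx, lon_c.shape)  # candidate cell indices for the particle
# (a real implementation would also handle longitude wrapping and the spherical metric)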
-
We'll soon be tackling the same issue, albeit on a much larger horizontal grid (13x4320x4320), and I'm trying to understand where the library stands with regard to this case. Is #599 mature enough to tackle such a problem? Are there any updates on this issue @delandmeterp? Pinging other people that may be interested in this issue: @fbriol, @rabernat, @xy6g13, @slgentil
-
Thanks @apatlpo for checking on the status of native ECCO grid support in Parcels. I am not aware that anyone has moved further with this; #599 is certainly not implemented yet. But now might be a good opportunity to start working on support for the ECCO grid, if there is sufficient interest? Could you make a start, @apatlpo? I'm happy to help out if you get stuck!
-
There is no rush; I am at the technical planning stage for future postdoc work, and Parcels is one option. What would be the roadmap for supporting ECCO grids in Parcels?
-
A potential solution might exist, but I don't have an idea yet how best to do this plumbing. It could be with a custom advection kernel such as described above, or it might even work out-of-the-box because of the NestedField functionality (although I'm not sure how that would work in the case where the nests are disjoint, so not embedded in each other).
Actually, I'd be happy to take a quick look to see how easy it would be to make this work. Could be a nice challenge 😉. Could you send me a link to some (public) velocity data on this grid?
-
I could certainly provide sample data in the form of a zarr archive containing velocity fields and grid information (the velocity would be 2D surface data).
-
Yes, that would certainly be useful. Only a few timesteps would be needed; I can loop over time if required.
-
The easiest way to access the ECCO LLC90 data is here: https://catalog.pangeo.io/browse/master/ocean/ECCOv4r3/
I recommend not bothering with LLC4320 until LLC90 is sorted out. The first challenge here is the topology of the grid faces; the "big data" aspect of LLC4320 just makes everything harder. The way xgcm handles these issues is described here, with an ECCO LLC90 example here.
If you don't care about the Arctic, a very easy solution is to rearrange everything south of ~60N into a quasi-lat-lon curvilinear grid (described a bit here: https://medium.com/pangeo/petabytes-of-ocean-data-part-1-nasa-ecco-data-portal-81e3c5e077be). Then you can forget about all this grid-topology business.
While I would love to see a solution with Parcels, our group is using MITgcm in offline mode to do particle advection on the LLC4320 domain. I believe this is both the technically easiest and most computationally scalable approach, since the logic for exchanges across faces doesn't have to be re-implemented. That's what we used for this paper.
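For reference, a rough sketch of opening that catalogue with intake; the entry name 'ECCOv4r3' is assumed from the catalogue URL, and this needs the intake-xarray plugin plus network access:

import intake

# Assumed entry name in the Pangeo ocean catalogue
cat = intake.open_catalog(
    'https://raw.githubusercontent.com/pangeo-data/pangeo-datastore/master/intake-catalogs/ocean.yaml')
ds = cat['ECCOv4r3'].to_dask()  # lazily opens the dataset as xarray + dask
print(ds)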
-
Thanks @rabernat for the information. Your advice about using LLC90 for development is very wise indeed.
-
OK, I've spent today seeing how far we could get with Parcels on the native ECCO grid 'out-of-the-box'. There's still a lot of work to be done, but it looks feasible. See below for some example code. I've downloaded two velocity fields from the ECCO portal because I couldn't get the Pangeo intake catalogue to work; anyway, that should be relatively easy to change later.

import math
import numpy as np
import xarray as xr
from datetime import timedelta as delta
from parcels import FieldSet, ParticleSet, JITParticle, plotTrajectoriesFile
from parcels import ErrorCode, NestedField, AdvectionEE

# Create an xarray Dataset
filenames = {'U': '/Users/erik/Downloads/EVEL_2016_01_01.nc',
             'V': '/Users/erik/Downloads/NVEL_2016_01_01.nc'}
ds = xr.open_mfdataset([filenames['U'], filenames['V']], combine='by_coords')

# Create a Parcels FieldSet by making a NestedField of each tile
U = []
V = []
for tile in range(13):
    fs = FieldSet.from_xarray_dataset(ds.sel(tile=tile, k=0),
                                      {'U': 'EVEL', 'V': 'NVEL'},
                                      {'lon': 'XC', 'lat': 'YC'})
    U.append(fs.U)
    V.append(fs.V)
fieldset = FieldSet(NestedField('U', U), NestedField('V', V))

# Define a ParticleSet
xv, yv = np.meshgrid(np.arange(-170, 170, 20), np.arange(-80, 80, 20))
pset = ParticleSet(fieldset=fieldset, pclass=JITParticle, lon=xv.flatten(), lat=yv.flatten())

# Make sure to remove the particles that start on land
def DeleteParticle(particle, fieldset, time):
    particle.delete()

def RemoveOnLand(particle, fieldset, time):
    u, v = fieldset.UV[time, particle.depth, particle.lat, particle.lon]
    if math.fabs(u) < 1e-12:
        particle.delete()

pset.execute(RemoveOnLand, dt=0, recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle})

# Now run the remaining ParticleSet (using less accurate Euler Forward for speedup)
pset.execute(AdvectionEE, runtime=delta(days=100), dt=delta(hours=4),
             output_file=pset.ParticleFile('ecco_particles.nc', outputdt=delta(days=1)),
             recovery={ErrorCode.ErrorOutOfBounds: DeleteParticle})

plotTrajectoriesFile('ecco_particles.nc')

I've used …
Any ideas from others on how to fix problem 2 in particular? Is there e.g. a natural way in the ECCO data itself to add this halo?
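On the halo question: Parcels does have a built-in FieldSet.add_periodic_halo, which only covers periodic (e.g. zonally global, rectilinear) boundaries rather than the LLC face-to-face connections, but it shows the general pattern. A hedged sketch, following the Parcels periodic-boundaries tutorial:

# Zonally periodic case only; the LLC face topology would need custom padding instead
fieldset.add_constant('halo_west', fieldset.U.grid.lon[0])
fieldset.add_constant('halo_east', fieldset.U.grid.lon[-1])
fieldset.add_periodic_halo(zonal=True)

def periodicBC(particle, fieldset, time):
    # Wrap particles that cross the zonal boundary back into the domain
    if particle.lon < fieldset.halo_west:
        particle.lon += fieldset.halo_east - fieldset.halo_west
    elif particle.lon > fieldset.halo_east:
        particle.lon -= fieldset.halo_east - fieldset.halo_west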
-
Hello everyone,
I have started to use Parcels with the ECCO4v3 climatology interpolated onto a regular lat-lon grid, but I was wondering whether Parcels could handle the native ECCO4v3 LLC90 grid (divided into 13 tiles) (https://mitgcm.readthedocs.io/en/latest/algorithm/horiz-grid.html#)?
If anyone has already been using Parcels with this kind of output, would it be possible to provide an example?
Thanks for your help
Louise