[Bug] - Derivation of MetaSWAP mappings fails with Dask >= 2025.2.0 #1436
Labels: bug
Just looked further into this; I think our issue is unrelated to the Dask issue. What I did never worked with Dask in the first place:

```python
import numpy as np

a = np.zeros((3, 3))
mask = np.array(
    [
        [True, True, False],
        [False, False, True],
        [False, True, False],
    ]
)
n_count = np.sum(mask)
a[mask] = np.arange(1, n_count + 1)
print(a)  # With numpy it works!

# %%
import dask.array

da = dask.array.from_array(a, chunks=(3, 3))
dmask = dask.array.from_array(mask, chunks=(3, 3))
da[dmask] = np.arange(1, n_count + 1)  # Throws an error with Dask
```
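As a side note, a minimal sketch of what does work with the same toy arrays (this mirrors the direction of the workaround described in the PR below, and is not code from the repository): compute the lazy arrays into writable numpy copies first, then do the boolean assignment eagerly.

```python
import dask.array
import numpy as np

# Minimal sketch (same toy arrays as above): computing the lazy arrays into
# writable numpy copies first makes the boolean assignment behave as expected.
a = np.zeros((3, 3))
mask = np.array([[True, True, False],
                 [False, False, True],
                 [False, True, False]])
da = dask.array.from_array(a, chunks=(3, 3))
dmask = dask.array.from_array(mask, chunks=(3, 3))

a_np = np.array(da)       # np.array(...) copies, so the result is writable
mask_np = np.array(dmask)
n_count = int(mask_np.sum())
a_np[mask_np] = np.arange(1, n_count + 1)
print(a_np)               # Same result as the plain numpy version
```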
github-merge-queue bot pushed a commit that referenced this issue on Feb 25, 2025:
Fixes #1436

# Description

Work around the Dask issue by forcing a load of idomain and svat into memory. These grids will never be as large as the boundary condition grids with a time axis, so I think it is acceptable; at least for the LHM it is. I couldn't get it to work with a ``where``: numpy only accepts a 1D array on a boolean-indexed 3D array as long as the 1D array has the length of one of the dimensions, in which case it does what we want to do here (see the small illustration after the checklist). However, when this is not the case, which is 95% of the time, numpy throws an error. I therefore had to resort to loading the arrays into memory.

Furthermore, I refactored ``test_coupler_mapping.py`` to reduce some code duplication and add a test case with a dask array. For this I had to make the well package used consistent (from 2 wells to 3 wells, where the second well now lies on y=2.0 instead of y=1.0), so I updated some of the assertions.

# Checklist

- [x] Links to correct issue
- [x] Update changelog, if changes affect users
- [x] PR title starts with ``Issue #nr``, e.g. ``Issue #737``
- [x] Unit tests were added
- [ ] **If feature added**: Added/extended example
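A small illustration of the numpy broadcasting behaviour mentioned in the description above (my reading of it; the array shapes are made up and this is not code from the PR):

```python
import numpy as np

# Boolean-indexing a 3D array with a 2D mask yields a (n_true, remaining_dim)
# view, so only a 1D value whose length equals the remaining dimension
# broadcasts; a value with one entry per True cell generally does not.
a = np.zeros((3, 4, 2))
mask = np.zeros((3, 4), dtype=bool)
mask[0, 0] = mask[1, 2] = mask[2, 3] = True   # n_true == 3

a[mask] = np.array([1.0, 2.0])                # OK: length 2 == remaining dim
# a[mask] = np.array([1.0, 2.0, 3.0])         # ValueError: cannot broadcast
print(a[mask])
```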
Bug description
By sheer coincidence, @WouterSwierstra and I ran into a bug that arose with the latest Dask release. Problematic are lines like this:

imod-python/imod/msw/coupler_mapping.py, line 120 in 4c4e6e1

If we load this into memory as numpy arrays, everything is fine; however, keeping them as lazy objects causes values not to be set, or causes an error like:

ValueError: cannot broadcast shape (10,) to shape (nan,)

I think I'm doing something here that was not intended to work with Dask and somehow magically worked. And even worse:

imod-python/imod/msw/grid_data.py, line 98 in 4c4e6e1

Here it tries to set a read-only array in case Dask is used: ``.values`` then returns a read-only array.

Related issue: dask/dask#11753, which provides a potential solution.

I therefore think using a ``where`` is a lot safer here. Operations are on 2D grids without a time dimension, so I don't foresee huge performance issues by doing this.

Furthermore: the fact that we missed this is probably caused by our test bench loading everything into memory, otherwise this bug would have surfaced earlier. Dask 2025.2.0 is in the present dev environment.
Refinement

Use a ``where`` instead.
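For the record, a sketch of how a lazy ``where``-based alternative could look for the 2D toy example from the comment above. This is purely illustrative: the cumulative-sum numbering trick is mine, not from the issue or the PR, and the merged fix loads the arrays into memory instead.

```python
import dask.array
import numpy as np

# Illustrative sketch of a lazy `where`-based alternative for the 2D toy case:
# number the True cells consecutively without assigning into the array.
mask = dask.array.from_array(
    np.array([[True, True, False],
              [False, False, True],
              [False, True, False]]),
    chunks=(3, 3),
)
# Running count of True cells (row-major), so each True cell gets 1, 2, 3, ...
numbering = dask.array.cumsum(mask.ravel()).reshape(mask.shape)
result = dask.array.where(mask, numbering, 0)
print(result.compute())  # [[1 2 0] [0 0 3] [0 4 0]]
```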