
Spurious concentrations in geochemistry #293

Open · smolins opened this issue Jan 30, 2025 · 6 comments

smolins commented Jan 30, 2025

I am confident that PRs #290, #291, and #292 successfully address the parallel issue with transport that was affecting reactive transport simulations. (The remaining issue is the overshoot issue #286, caused by interpolation of saturation when subcycling; it is still pending and can be temporarily worked around by not subcycling.)

Unfortunately, this does not completely resolve the problems with reactive transport simulations. I went back to running the hillslope problem (Molins et al., 2022, Water Resources Research), but only for 0.1 days, capping the time step at 70 s (to avoid subcycling) and also plotting every 70 seconds; see the attached input file: hillslope_calcite_crunch_sigmoid_100s_xml.txt

The issue does not seem to be specific to parallel runs, although parallel runs provide a clue that something is wrong with the very first cell in the subdomain handled by each MPI process, since the spurious concentrations appear at regular intervals across the domain. I wonder whether something is done differently for the first cell (at initialization or after solving transport) that affects not so much the total component concentration (tcc) as the primary ion concentration (or another variable handled via Alquimia). The issue does not appear for the non-reactive tracers (1 and 2), only for reactive species.

[Image: Time step 1, single processor (issue is at top left corner)]

[Image: Time step 1, parallel run (n=12) (issue at top left corner repeats itself downstream)]

[Image: Time step 10, single processor]

[Image: Time step 10, parallel run]
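
To help narrow this down, here is a minimal post-processing sketch that flags cells whose primary species concentration deviates from the otherwise uniform field, so the flagged indices can be compared against the subdomain decomposition. The file name, dataset path, and HDF5 layout below are placeholders, not the actual ATS output names, and would need to be adjusted to the vis files written by this run:

```python
# Minimal diagnostic sketch. ASSUMPTIONS: the vis file name, the dataset path,
# and the "group of time slices" layout are hypothetical placeholders --
# substitute the real names from the ATS visualization output of this run.
import h5py
import numpy as np

VIS_FILE = "ats_vis_data.h5"                        # hypothetical file name
FIELD = "total_component_concentration.cell.Ca++"   # hypothetical dataset path

with h5py.File(VIS_FILE, "r") as f:
    grp = f[FIELD]
    step = sorted(grp.keys())[0]          # first written time step
    conc = np.asarray(grp[step]).ravel()

# The field should be uniform in this example, so any cell far from the
# median value is suspect.
ref = np.median(conc)
tol = 1e-10 * max(abs(ref), 1.0)
outliers = np.flatnonzero(np.abs(conc - ref) > tol)
print(f"step {step}: {outliers.size} suspect cells out of {conc.size}")
print("suspect cell indices:", outliers)
```

If the suspect indices sit at the start of each rank's block of cells, that would support the first-cell hypothesis.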

smolins self-assigned this Jan 30, 2025

smolins commented Jan 30, 2025

@dasvyat: could you please have an early look at this and provide some clues as to what may be happening? Thanks.
It has me confused that it affects just the one cell at the beginning of each subdomain. The concentrations should all be the same in these examples (i.e., red).


smolins commented Jan 30, 2025

The data and mesh directories for this example are available in ats-demos under 13_integrated_hydro_reactive_transport.

levuvietphong commented:

@smolins: Are the plots you show from a parallel run? If yes, how many cores did you use?


smolins commented Jan 31, 2025

Plots 1 and 3 are from single-processor runs. They show the spurious concentrations in the top left corner.

Plots 2 and 4 are from parallel runs using 12 processors. They show the spurious concentrations in the top left corner and then twice more downstream. This is a zoomed-in view; if the entire domain were shown, the pattern would appear 12 times in total.
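
As a rough consistency check on the 12-repeat pattern, the suspect cell indices from the sketch in the first comment could be compared against the approximate subdomain size. This assumes cells are written to the vis file in rank-contiguous order, which may not hold for every output layout; `outliers` and `conc` refer to that earlier sketch:

```python
import numpy as np

def check_rank_pattern(outlier_indices, ncells, nranks=12):
    """Print the spacing between suspect cells and the expected ~ncells/nranks."""
    idx = np.sort(np.asarray(outlier_indices))
    print("spacing between suspect cells:", np.diff(idx))
    print("expected spacing if one bad cell per rank: ~", ncells // nranks)

# usage with the arrays from the earlier sketch (names come from that sketch):
# check_rank_pattern(outliers, ncells=conc.size, nranks=12)
```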

levuvietphong commented:

@smolins I re-ran your input file, and it appears that:

  • The problem may be experiencing a dry condition at the surface, since surface-water_source (rainfall) = 0 for the test period (0.1 day).
  • The colorbar range is very narrow (e.g., for Ca++, the difference between red and blue is ~1e-7), meaning the variation is small. This could be due to the dry conditions, and the tcc calculation may be sensitive to them.

I added some rainfall at the beginning, and this seems to eliminate the weird pattern. Can you play with the rainfall and verify this?
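
To make the colorbar observation quantitative, a small sketch like this (again with hypothetical file and dataset names, to be adjusted to the actual vis output) prints the spread of the Ca++ field at each output step, so the dry run and the run with added rainfall can be compared directly:

```python
# Sketch to quantify the spread of the Ca++ field per output step.
# ASSUMPTIONS: file name, dataset path, and layout are hypothetical placeholders.
import h5py
import numpy as np

with h5py.File("ats_vis_data.h5", "r") as f:                    # hypothetical name
    grp = f["total_component_concentration.cell.Ca++"]          # hypothetical path
    for step in sorted(grp.keys()):
        c = np.asarray(grp[step]).ravel()
        print(f"step {step}: min={c.min():.6e}  max={c.max():.6e}  "
              f"range={c.max() - c.min():.3e}")
```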


dasvyat commented Jan 31, 2025

I've re-run this case on multiple cores. There is a parallel pattern that points to a bug rather than to dry cells or boundary conditions.
I'll take a look at this problem.
