
Conversation

@glesur glesur (Contributor) commented Nov 27, 2025

This PR fixes a long-standing issue causing the face-centered magnetic field component B (referred to as BXs) to drift at MPI subdomain boundaries during long integrations. The drift occurs because neighboring subdomains may compute slightly different edge EMFs—likely due to roundoff differences—resulting in inconsistent BXs values across shared boundaries.
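To see how roundoff alone can produce such a mismatch, here is a minimal standalone sketch (plain C++, not Idefix code): two subdomains accumulating the same physical contributions to a shared edge EMF in a different order can disagree at machine precision, because floating-point addition is not associative.

```cpp
#include <cstdio>

int main() {
  // Same three contributions, summed in the order each subdomain sees them.
  double a = 1.0e16, b = -1.0e16, c = 1.0;
  double leftDomain  = (a + b) + c;   // evaluates to 1.0
  double rightDomain = a + (b + c);   // evaluates to 0.0 in double precision
  std::printf("left=%g right=%g diff=%g\n",
              leftDomain, rightDomain, leftDomain - rightDomain);
  return 0;
}
```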

This bug is subtle and can manifest in several ways:

  • A sudden increase in div(B) after restarting from a dump file. Because the dump stores only one of the two possible BXs values at each subdomain boundary, any accumulated mismatch becomes visible upon restart (a discrete div(B) sketch follows this list).
  • Unexpected or non-physical BXs values at subdomain edges when using vector potentials.
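For context on the first symptom, here is a minimal sketch of the face-centered (constrained-transport) divergence; the accessor type is invented for illustration and is not the Idefix API:

```cpp
#include <functional>

// Hypothetical face-field accessor: f(k, j, i) returns the field on a face.
using FaceField = std::function<double(int, int, int)>;

// Discrete div(B) in cell (k,j,i) on a uniform grid: built from the six
// face-centered values, so a mismatch in a single shared-face BXs value
// shows up directly as a nonzero divergence in the adjacent cell.
double cellDivB(const FaceField& Bx, const FaceField& By, const FaceField& Bz,
                int k, int j, int i, double dx, double dy, double dz) {
  return (Bx(k, j, i + 1) - Bx(k, j, i)) / dx
       + (By(k, j + 1, i) - By(k, j, i)) / dy
       + (Bz(k + 1, j, i) - Bz(k, j, i)) / dz;
}
```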

Historically, these edge zones were not exchanged, under the assumption that each subdomain’s computed values were sufficient. In practice, this allowed roundoff-level discrepancies to accumulate over time.

This PR introduces a consistent synchronization rule: the BXs value from the left side (=start) of each subdomain is treated as the authoritative value and overwrites the corresponding right-side value of the left-side neighbour. This ensures deterministic and stable edge values. The change increases the communication cost by roughly 5% due to the larger exchange buffer.
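A minimal 1D MPI sketch of this rule, with invented names rather than the actual Idefix routines: each subdomain sends its start-face BXs to its left neighbour, which overwrites its own end-face value instead of averaging.

```cpp
#include <mpi.h>
#include <vector>

// Hypothetical 1D exchange (not the Idefix API): the BXs value at the start
// of each subdomain is authoritative and overwrites the end-face value of
// the left neighbour.
void syncEdgeBXs(std::vector<double>& BXs, int leftRank, int rightRank,
                 MPI_Comm comm) {
  double sendVal = BXs.front();  // my start-face value, sent to my left neighbour
  double recvVal = 0.0;          // my right neighbour's start-face value
  MPI_Sendrecv(&sendVal, 1, MPI_DOUBLE, leftRank, 0,
               &recvVal, 1, MPI_DOUBLE, rightRank, 0,
               comm, MPI_STATUS_IGNORE);
  if (rightRank != MPI_PROC_NULL) {
    BXs.back() = recvVal;        // overwrite, never average
  }
}
```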

Finally, note that a similar issue can also arise in serial runs with periodic boundary conditions, and this PR fixes that case as well.

Because of these changes, this PR represents a major refactoring of the MPI exchange routines and boundary-condition logic, which now all rely on pre-defined bounding boxes. The new edge zones are handled only for internal and strictly periodic (i.e., non–shearing-box) boundary conditions, simplifying and unifying the overall approach.
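As an illustration of the bounding-box idea (names invented for this sketch, not the Idefix API), an exchange region is described by per-direction index bounds, and the pack/unpack loops iterate over the box instead of hard-coding ghost-zone offsets:

```cpp
#include <vector>

// Invented types for this sketch, not the Idefix API.
struct BoundingBox {
  int beg[3];  // inclusive start index per direction (k, j, i ordering)
  int end[3];  // exclusive end index per direction
};

// Pack every point of a box into a linear buffer; unpacking mirrors this
// loop, so any exchange region is fully described by its BoundingBox.
template <typename Field>
void pack(const Field& f, const BoundingBox& box, std::vector<double>& buf) {
  buf.clear();
  for (int k = box.beg[0]; k < box.end[0]; ++k)
    for (int j = box.beg[1]; j < box.end[1]; ++j)
      for (int i = box.beg[2]; i < box.end[2]; ++i)
        buf.push_back(f(k, j, i));
}
```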

@glesur glesur changed the base branch from master to develop November 27, 2025 13:25
…averaging) to be coherent with the BXs exchange routine
- check that the vector potential follows the same boundary logic as the EMF
  (left domain is the reference)

Note: BCs on the vector potential are only applied after boundary
conditions, since the EMF boundary conditions ensure that the vector
potential will always be consistent later.

Todo:
- enforce these boundary conditions on the vector potential with periodic
  boundary conditions and without MPI (as is done for EMFs)
- check that the normal BXs is also consistent when MPI is off with
  periodic BCs
…s we should not reset the surface field in this case.
In the future, this will allow exchange routines not to exchange the
normal field's last active zone when the shearing box is enabled.

For this, the refactoring includes an optional parameter "overwriteBXn"
for each exchange direction (see the sketch after these commit notes):
overwrite the normal BXs only with periodic boundary conditions, to
preserve the shearing box.
- fix gauge for AmbipolarCShock to be compatible with periodic boundary
  conditions along x2
- fix missing fences with shearing-box BCs
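A hypothetical sketch of the "overwriteBXn" option described in these commit notes (all names invented, not the Idefix API):

```cpp
#include <iostream>

// Invented placeholder for the field containers.
struct Fields { /* face- and cell-centered arrays would live here */ };

void exchangeGhostZones(int dir, Fields&) {
  std::cout << "ghost-zone exchange along direction " << dir << "\n";
}
void overwriteSharedNormalFace(int dir, Fields&) {
  std::cout << "overwrite shared normal face along direction " << dir << "\n";
}

// Each directional exchange takes an optional flag: periodic/internal
// boundaries pass true, shearing-box boundaries pass false so that their
// own boundary condition fills the last active face instead.
void exchange(int dir, Fields& f, bool overwriteBXn = true) {
  exchangeGhostZones(dir, f);
  if (overwriteBXn) {
    overwriteSharedNormalFace(dir, f);
  }
}
```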
@glesur glesur added the bug Something isn't working label Dec 15, 2025
@glesur glesur requested a review from Copilot December 15, 2025 10:17
@glesur glesur self-assigned this Dec 15, 2025
Copilot AI left a comment

Pull request overview

This PR resolves a critical issue causing face-centered magnetic field components (BXs) to drift at MPI subdomain boundaries during long simulations. The fix implements a consistent synchronization strategy where the left-side subdomain's BXs value overwrites the right neighbor's value, ensuring deterministic edge values. The solution also handles periodic boundary conditions in serial runs.

Key Changes

  • Major refactoring of MPI exchange routines with new Exchanger class and bounding box-based logic
  • EMF exchange now overwrites rather than averages to eliminate accumulated roundoff errors (see the sketch after this list)
  • Extended test coverage to include MPI and improved tolerance for numerical precision
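A simplified contrast of the averaging vs. overwriting rule from the second bullet (invented names, not the Idefix implementation):

```cpp
// Edge EMF shared by two subdomains; refVal is the EMF computed by the
// side taken as the reference.
double syncSharedEdgeEmf(double refVal, double otherVal, bool overwrite) {
  if (overwrite) {
    return refVal;                   // new: one deterministic value on both sides
  }
  return 0.5 * (refVal + otherVal);  // old: averaging leaves roundoff drift
}
```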

Reviewed changes

Copilot reviewed 32 out of 33 changed files in this pull request and generated 5 comments.

Summary per file:

  • src/mpi/*.{cpp,hpp}: complete refactoring, moving MPI/Buffer classes to a dedicated directory with the new Exchanger class
  • src/fluid/constrainedTransport/EMFexchange.hpp: changed EMF exchange from averaging to overwriting left-side values
  • src/fluid/constrainedTransport/enforceEMFBoundary.hpp: enabled ENFORCE_EMF_CONSISTENCY and simplified the periodic boundary logic
  • src/fluid/boundary/boundary.hpp: added bounding-box infrastructure for ghost regions
  • test/MHD/AmbipolarCshock3D/*.{cpp,ini}: updated test setup, fixing the vector potential initialization and changing boundary conditions
  • test/MHD/AmbipolarCshock3D/testme.py: relaxed tolerance from 2e-14 to 3e-14
  • test/MHD/ShearingBox/testme.py: added MPI test coverage
Comments suppressed due to low confidence (2)

  • test/MHD/AmbipolarCshock3D/testme.py:1: The tolerance was relaxed from 2e-14 to 3e-14. While this may be necessary due to the new MPI exchange logic, ensure that this increased tolerance does not mask potential numerical accuracy issues. Consider documenting why this change was needed.
  • test/MHD/AmbipolarCshock3D/setup.cpp:1 (context: #include "idefix.hpp"): This debug output line is very long. Consider breaking it into multiple lines for better readability, or using formatted output with appropriate field widths.


glesur and others added 5 commits December 15, 2025 11:25
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
…:idefix-code/idefix into fixBxSConsistencyAccrossMPIDecomposition
@glesur glesur merged commit 3a8e37b into develop Dec 15, 2025
38 checks passed
@glesur glesur deleted the fixBxSConsistencyAccrossMPIDecomposition branch December 15, 2025 12:36
