Conversation

@Kss2k Kss2k commented Nov 8, 2025

No description provided.

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Comment on lines 1289 to 1293

for (std::size_t k = 0; k < idxPsi.n_elem; ++k) {
  const std::size_t pos = idxPsi[k];
  grad[pos] = gradPsi(row[pos], col[pos]);
}

P1: Adjust analytic gradient for symmetric covariances

The analytic branch fills the gradient vector by reading a single matrix entry, e.g. grad[pos] = gradPsi(row[pos], col[pos]);, without taking symmetric[pos] into account. For blocks like psi, thetaDelta, and omegaXiXi, the optimiser treats each off-diagonal parameter as a single value that perturbs (i,j) and (j,i) simultaneously (the finite-difference fallback does this by updating both entries). Returning the derivative for only one matrix entry therefore underestimates the directional derivative by a factor of two for off-diagonal parameters, so the gradients are inconsistent with the parameterization and can mislead gradient-based optimisation whenever covariance matrices have free off-diagonal elements. The analytic mapping should sum the symmetric pair before writing to grad.

Useful? React with 👍 / 👎.

@Kss2k Kss2k marked this pull request as draft January 6, 2026 09:20