Bug in anisotropic optimizer with recent update on numpy 2.4 #33

@PFLeget

Description

@rmjarvis reported to me that after updating to numpy 2.4, treegp appears to fail a unit test in Piff with this error:

$ pytest test_gp_interp.py -k aniso
======================================= test session starts ========================================
platform darwin -- Python 3.12.2, pytest-7.4.3, pluggy-1.3.0
rootdir: /Users/Mike/rmjarvis/Piff
configfile: pyproject.toml
plugins: timeout-2.2.0, xdist-3.3.1, nbval-0.11.0, anyio-4.6.2
collected 3 items / 2 deselected / 1 selected                                                      

test_gp_interp.py F                                                                          [100%]

============================================= FAILURES =============================================
____________________________________ test_gp_interp_anisotropic ____________________________________

    @timer
    def test_gp_interp_anisotropic():
    
        if __name__ == "__main__":
            atol = 4e-2
            rtol = 3e-1
            nstars = [1600, 4000, 1600, 4000]
        else:
            atol = 4e-1
            rtol = 5e-1
            nstars = [160, 500, 160, 500]
    
        noise_level = 1e-4
    
        L1 = get_correlation_length_matrix(4., 0.3, 0.3)
        invL1 = np.linalg.inv(L1)
        L2 = get_correlation_length_matrix(20., 0.3, 0.3)
        invL2 = np.linalg.inv(L2)
    
        kernels = ["4e-4 * AnisotropicRBF(invLam={0!r})".format(invL1),
                   "4e-4 * AnisotropicRBF(invLam={0!r})".format(invL1),
                   "4e-4 * AnisotropicVonKarman(invLam={0!r})".format(invL2),
                   "4e-4 * AnisotropicVonKarman(invLam={0!r})".format(invL2)]
    
        optimizer = ['none',
                     'anisotropic',
                     'none',
                     'anisotropic']
    
        for i in range(len(kernels)):
    
            stars_training, stars_validation = make_gaussian_random_fields(
                    kernels[i], nstars[i], xlim=-20, ylim=20,
                    seed=30352010, vmax=4e-2,
                    noise_level=noise_level)
>           check_gp(stars_training, stars_validation, kernels[i],
                     optimizer[i], min_sep=0., max_sep=5., nbins=11,
                     l0=20., atol=atol, rtol=rtol, plotting=False)

test_gp_interp.py:385: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test_gp_interp.py:187: in check_gp
    interp.solve(stars=stars_training, logger=None)
../piff/gp_interp.py:260: in solve
    self._fit(X, y, y_err=y_err, logger=logger)
../piff/gp_interp.py:177: in _fit
    self.gps[i].solve()
../../../miniforge3/envs/py3.12/lib/python3.12/site-packages/treegp/gp_interp.py:253: in solve
    self.kernel = self._fit(
../../../miniforge3/envs/py3.12/lib/python3.12/site-packages/treegp/gp_interp.py:136: in _fit
    kernel = self._optimizer.optimizer(kernel)
../../../miniforge3/envs/py3.12/lib/python3.12/site-packages/treegp/two_pcf.py:443: in optimizer
    robust.minimize_minuit(p0=self.p0_robust_fit)
../../../miniforge3/envs/py3.12/lib/python3.12/site-packages/treegp/two_pcf.py:186: in minimize_minuit
    self._minimize_minuit(p0=p0)
../../../miniforge3/envs/py3.12/lib/python3.12/site-packages/treegp/two_pcf.py:160: in _minimize_minuit
    self.m.migrad()
../../../miniforge3/envs/py3.12/lib/python3.12/site-packages/iminuit/minuit.py:789: in migrad
    fm = _robust_low_level_fit(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

fcn = <iminuit._core.FCN object at 0x1342f7fb0>
state = <iminuit._core.MnUserParameterState object at 0x1342f5130>, ncall = 0
strategy = <iminuit._core.MnStrategy object at 0x1348770b0>, tolerance = 0.1, precision = None
iterate = 5, use_simplex = True

    def _robust_low_level_fit(
        fcn: FCN,
        state: MnUserParameterState,
        ncall: int,
        strategy: MnStrategy,
        tolerance: float,
        precision: Optional[float],
        iterate: int,
        use_simplex: bool,
    ) -> FunctionMinimum:
        # Automatically call Migrad up to `iterate` times if minimum is not valid.
        # This simple heuristic makes Migrad converge more often. Optionally,
        # one can interleave calls to Simplex and Migrad, which may also help.
        migrad = MnMigrad(fcn, state, strategy)
        if precision is not None:
            migrad.precision = precision
>       fm = migrad(ncall, tolerance)
E       RuntimeError: Unable to cast Python instance of type <class 'numpy.ndarray'> to C++ type 'double'

../../../miniforge3/envs/py3.12/lib/python3.12/site-packages/iminuit/minuit.py:2782: RuntimeError
===================================== short test summary info ======================================
FAILED test_gp_interp.py::test_gp_interp_anisotropic - RuntimeError: Unable to cast Python instance of type <class 'numpy.ndarray'> to C++ type 'double'
================================= 1 failed, 2 deselected in 1.35s ==================================

I need to reproduce this error and figure out what is going on.
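The failure mode ("Unable to cast Python instance of type <class 'numpy.ndarray'> to C++ type 'double'") is typical of a size-1 numpy array being passed where iminuit's pybind11 layer expects a plain float; recent numpy versions have tightened implicit array-to-scalar conversion, which can surface exactly this kind of error. As a sketch only (the function names below are hypothetical, not treegp code), this is how a cost function can accidentally return a shape-(1,) array instead of a scalar, and how coercing with `.item()` avoids the problem:

```python
import numpy as np

def chi2_array(params):
    # Hypothetical cost function: a reduction along one axis of a
    # column vector of residuals yields a shape-(1,) array, not a
    # Python float.
    residuals = np.asarray(params, dtype=float).reshape(-1, 1)
    return np.sum(residuals**2, axis=0)  # shape (1,) ndarray

def chi2_scalar(params):
    # Coercing the reduction with .item() guarantees a plain Python
    # float, which can always be cast to a C++ double.
    residuals = np.asarray(params, dtype=float)
    return np.sum(residuals**2).item()

print(type(chi2_array([1.0, 2.0])))   # <class 'numpy.ndarray'>
print(type(chi2_scalar([1.0, 2.0])))  # <class 'float'>
```

If this is the cause, the fix would be to audit the values treegp hands to iminuit (the cost function's return value and the entries of `p0_robust_fit`) and coerce any size-1 arrays to scalars before they reach `migrad()`.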
