Conversation

@nyo16 nyo16 commented Jan 1, 2026

Summary

This PR adds Scholar.Optimize.Brent, implementing Brent's method for scalar function minimization. This is the second optimization algorithm following the Golden Section merge in #327, continuing the incremental approach discussed in #323.

Brent's method combines the reliability of golden section search with faster convergence from parabolic interpolation, making it the recommended choice for scalar optimization (as noted in https://docs.scipy.org/doc/scipy/tutorial/optimize.html).
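The parabolic-interpolation step that gives Brent's method its speed can be sketched in plain Elixir (the module name `ParabolicStep` and the `trial/4` signature are illustrative, not from this PR, and the real implementation works on Nx tensors inside `defn`): fit a parabola through three bracketing points and jump to the abscissa of its vertex.

```elixir
defmodule ParabolicStep do
  # Hypothetical plain-float sketch: given a bracket a < b < c and a
  # function f, fit a parabola through (a, f(a)), (b, f(b)), (c, f(c))
  # and return the x-coordinate of its minimum. Brent's method takes
  # this trial point when it falls safely inside the bracket, and
  # falls back to a golden-section step otherwise.
  def trial(a, b, c, f) do
    {fa, fb, fc} = {f.(a), f.(b), f.(c)}
    p = (b - a) * (b - a) * (fb - fc) - (b - c) * (b - c) * (fb - fa)
    q = (b - a) * (fb - fc) - (b - c) * (fb - fa)
    b - 0.5 * p / q
  end
end
```

For an exactly quadratic function such as (x-3)², a single parabolic step lands on the minimizer, which is why the evaluation counts below are so much lower than pure golden section.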

Changes

  • lib/scholar/optimize/brent.ex - Pure defn implementation of Brent's method
  • test/scholar/optimize/brent_test.exs - Comprehensive test suite (11 tests + 2 doctests)
  • notebooks/optimize.livemd - Updated to present Brent as recommended method with performance comparisons
  • agents.md - Documents best practices from @josevalim's review feedback on #327 (Add Golden Section optimization algorithm)

Features

  • JIT/GPU compatible - Implemented as pure defn with while loops and Nx.select
  • Same API as Golden Section - Brent.minimize(a, b, fun, opts \\ [])
  • Significantly faster convergence - ~3-5x fewer function evaluations than Golden Section

Performance Comparison

| Function | Bracket | Brent | Golden Section |
| --- | --- | --- | --- |
| (x-3)² | [0, 5] | ~8 evals | ~45 evals |
| (x-50)² | [0, 100] | ~10 evals | ~50 evals |
| sin(x) | [3, 5] | ~11 evals | ~42 evals |

Example

alias Scholar.Optimize.Brent

fun = fn x -> Nx.pow(Nx.subtract(x, 3), 2) end
result = Brent.minimize(0.0, 5.0, fun)

Nx.to_number(result.x) # => 3.0
Nx.to_number(result.fun_evals) # => ~8 (vs ~45 for Golden Section)

Implementation Notes

Following the patterns established in #327 and @josevalim's feedback:

  • defn entry point (not deftransform)
  • Bounds as explicit function arguments
  • while loop with state map (not recursion)
  • Nx.select for all conditionals (no Nx.to_number in defn)
  • u32 for non-negative counters
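The while/state-map pattern can be illustrated in plain Elixir (module name `LoopSketch` and the `select/3` helper are hypothetical; recursion stands in for Nx's `while` macro here, since `while` only exists inside `defn`, and plain floats stand in for tensors). Each iteration rebuilds the whole state map, and every branch is expressed as data selection rather than control flow, mirroring how `Nx.select` replaces `if` in the real implementation.

```elixir
defmodule LoopSketch do
  # Golden-section ratio, precomputed at compile time.
  @phi (3.0 - :math.sqrt(5.0)) / 2.0

  # Stand-in for Nx.select/3: both branches are evaluated, one is kept.
  defp select(pred, on_true, on_false), do: if(pred, do: on_true, else: on_false)

  def minimize(a, b, f, tol \\ 1.0e-6) do
    loop(%{a: a, b: b, evals: 0}, f, tol)
  end

  # Shrink the bracket [a, b] until it is narrower than tol, carrying
  # all loop variables in a single state map (as the defn while loop does).
  defp loop(%{a: a, b: b, evals: n}, f, tol) when b - a > tol do
    x1 = a + @phi * (b - a)
    x2 = b - @phi * (b - a)
    shrink_left = f.(x1) < f.(x2)

    state = %{
      a: select(shrink_left, a, x1),
      b: select(shrink_left, x2, b),
      evals: n + 2
    }

    loop(state, f, tol)
  end

  defp loop(%{a: a, b: b, evals: n}, _f, _tol), do: %{x: (a + b) / 2.0, fun_evals: n}
end
```

In the actual `defn` code the predicate and both candidate values are tensors, so `Nx.select` keeps the loop body free of host-side branching and the whole loop JIT-compiles to a single kernel.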

I also added agents.md with the learnings from the last two PRs. It really helps the models follow the best practices.

nyo16 and others added 4 commits January 1, 2026 09:18
Implements Brent's method for scalar function minimization, combining
golden section search with parabolic interpolation for faster convergence.

Key features:
- Pure defn implementation (JIT/GPU compatible)
- ~3-5x fewer function evaluations than Golden Section
- Same API pattern as GoldenSection.minimize/4

Also updates notebooks/optimize.livemd to present Brent as the
recommended method with performance comparisons.

Adds agents.md documenting best practices from José Valim's review
feedback on PR elixir-nx#327 for future optimization algorithm contributors.
@josevalim josevalim merged commit e285ad6 into elixir-nx:main Jan 11, 2026
1 of 2 checks passed
@josevalim
@nyo16 can you please send a separate PR for agents.md? Thank you!

@josevalim
💚 💙 💜 💛 ❤️
