Chapter 5: SciPy Optimizers

This is one of the most powerful and most-used submodules in all of SciPy. If you’ve ever needed to find the best parameters, fit a model to data, solve nonlinear equations, or minimize a cost function, this is where the magic lives.

Think of scipy.optimize as your Swiss Army knife for finding minima (or maxima) of functions, roots of equations, and more — all in a clean, well-tested Python interface.

First — What does “optimizers” mean here?

In scientific computing, optimization usually means:

  • Find the values of variables (x, y, parameters…) that make a function as small as possible (minimization) → most common case
  • Or as large as possible (maximization — just flip the sign)
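Since SciPy's optimizers minimize by convention, maximizing a function just means minimizing its negation. A minimal sketch with a made-up quadratic:

```python
from scipy.optimize import minimize_scalar

# Maximize f(x) = 3 - (x - 2)^2 by minimizing its negation
def f(x):
    return 3 - (x - 2) ** 2

res = minimize_scalar(lambda x: -f(x))
print(res.x)     # maximizer, close to 2.0
print(-res.fun)  # maximum value, close to 3.0
```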

scipy.optimize handles both unconstrained and constrained problems, with or without derivatives, local & global search, linear & nonlinear, scalar & vector-valued objectives, etc.

Current status (Feb 2026): SciPy 1.17.0 is the latest stable release (released Jan 2026). The optimization tools are very mature — many algorithms come from decades-old, battle-tested Fortran/C libraries, wrapped beautifully.

Main entrance doors — the functions you’ll use 90% of the time

| What you want to do | Main function / class | Best for / when to reach for it | Typical import style |
|---|---|---|---|
| Minimize a scalar function of many variables | minimize() | General-purpose minimization (your daily driver) | from scipy.optimize import minimize |
| Fit model parameters to data (curve fitting) | curve_fit() | The single most-used function in labs and papers | from scipy.optimize import curve_fit |
| Nonlinear least squares | least_squares() | When you have residuals (errors), bounds, or a robust loss | from scipy.optimize import least_squares |
| Find roots of nonlinear equations | root() / root_scalar() | Solve f(x) = 0 (systems or a single variable) | from scipy.optimize import root, root_scalar |
| Global optimization | differential_evolution(), shgo(), dual_annealing() | When local minima are a problem, or for expensive black-box functions | from scipy.optimize import differential_evolution |
| Linear programming | linprog() | Optimize a linear objective under linear constraints | from scipy.optimize import linprog |
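As a taste of the global optimizers listed above, a sketch running differential_evolution on the 2-D Rastrigin function, a standard multimodal test problem (the function is written out by hand here):

```python
import numpy as np
from scipy.optimize import differential_evolution

# 2-D Rastrigin: many local minima, one global minimum of 0 at the origin
def rastrigin(x):
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12), (-5.12, 5.12)]
result = differential_evolution(rastrigin, bounds, seed=42)
print(result.x, result.fun)  # x near [0, 0], fun near 0
```

A local method like BFGS started at a random point would typically get stuck in one of the many local basins; the evolutionary search avoids that at the cost of more function evaluations.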

1. The superstar: scipy.optimize.minimize()

This is the unified interface — you pick a method and it calls the right solver.


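A simplified sketch of the call signature, plus a minimal call on a made-up quadratic (the real signature has more keyword arguments; see the SciPy docs):

```python
from scipy.optimize import minimize

# Simplified signature:
# minimize(fun, x0, args=(), method=None, jac=None, bounds=None,
#          constraints=(), tol=None, options=None) -> OptimizeResult

# Minimal usage: a 1-D quadratic with its minimum at x = 3
res = minimize(lambda x: (x[0] - 3.0) ** 2, x0=[0.0], method="BFGS")
print(res.x, res.fun, res.success)
```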
Popular methods (the ones people actually remember and use):

| Method string | Needs gradient? | Handles bounds? | Handles constraints? | Best for / notes |
|---|---|---|---|---|
| 'Nelder-Mead' | No | Yes (simple) | No | Derivative-free, robust; the classic simplex method |
| 'BFGS' | Auto or provide | No | No | Quasi-Newton; excellent default for smooth functions |
| 'L-BFGS-B' | Auto or provide | Yes | No | Memory-efficient BFGS plus box bounds |
| 'TNC' | Auto or provide | Yes | No | Truncated Newton with bounds |
| 'SLSQP' | Auto or provide | Yes | Yes (ineq + eq) | Sequential Least Squares Programming; a good all-rounder |
| 'trust-constr' | Recommended | Yes | Yes | Modern, reliable choice for constrained problems |
| 'COBYLA' | No | Yes | Yes (ineq only) | Constrained Optimization BY Linear Approximation |
| 'Powell' | No | Yes | No | Derivative-free; good for noisy functions |
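To make the bounds/constraints columns concrete, a small sketch with 'SLSQP' on a made-up problem: minimize x**2 + y**2 subject to x + y >= 1, whose analytic optimum is (0.5, 0.5):

```python
from scipy.optimize import minimize

# Inequality constraints follow the convention g(x) >= 0
constraints = ({"type": "ineq", "fun": lambda v: v[0] + v[1] - 1.0},)

res = minimize(
    lambda v: v[0] ** 2 + v[1] ** 2,  # objective
    x0=[2.0, 0.0],                    # feasible starting point
    method="SLSQP",
    constraints=constraints,
)
print(res.x)  # close to [0.5, 0.5]
```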

Quick realistic example — minimize the Rosenbrock function (the banana-shaped classic test function)


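A sketch of that example, using the rosen and rosen_der helpers SciPy ships for exactly this purpose:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])   # a standard starting point
res = minimize(rosen, x0, method="BFGS", jac=rosen_der)

print(res.x)    # close to [1, 1, 1, 1, 1], the known global minimum
print(res.fun)  # close to 0
```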
→ BFGS/L-BFGS-B usually win on smooth functions like this.

2. The absolute most-used function: curve_fit()

Fits a model y = f(x; params) to data points (xdata, ydata) and returns the best-fit parameters plus their covariance matrix.


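A sketch with a made-up exponential-decay model and synthetic noisy data (the model, seed, and true parameters are all invented for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c):
    """Exponential decay with offset: y = a * exp(-b x) + c."""
    return a * np.exp(-b * x) + c

# Synthetic data: true parameters (2.5, 1.3, 0.5) plus Gaussian noise
rng = np.random.default_rng(0)
xdata = np.linspace(0, 4, 50)
ydata = model(xdata, 2.5, 1.3, 0.5) + 0.05 * rng.normal(size=xdata.size)

popt, pcov = curve_fit(model, xdata, ydata, p0=[2.0, 1.0, 1.0])
perr = np.sqrt(np.diag(pcov))  # one-sigma uncertainty on each parameter

print(popt)  # close to [2.5, 1.3, 0.5]
print(perr)
```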
→ This is the function most experimental scientists and engineers reach for daily.

3. Quick root finding example

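A sketch covering both entry points: root_scalar for a single variable and root for a small made-up system:

```python
import numpy as np
from scipy.optimize import root, root_scalar

# Single equation: solve cos(x) = x on the bracket [0, 1]
sol = root_scalar(lambda x: np.cos(x) - x, bracket=[0, 1], method="brentq")
print(sol.root)  # close to 0.739085

# System of two equations (invented for illustration):
#   x + 2y = 3
#   x^2 + y^2 = 5
def equations(v):
    x, y = v
    return [x + 2 * y - 3, x**2 + y**2 - 5]

sol2 = root(equations, x0=[1.0, 1.0])
print(sol2.x, sol2.success)
```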

Teacher’s cheat sheet — which one to pick first?

  • Have data + want to fit model? → curve_fit
  • General smooth function minimization, no constraints? → minimize(…, method='BFGS' or 'L-BFGS-B')
  • Need bounds? → 'L-BFGS-B' or 'trust-constr'
  • No derivatives available? → 'Nelder-Mead' or 'Powell'
  • Global minimum needed? → try differential_evolution first
  • Nonlinear equation(s) f(x)=0? → root or root_scalar
  • Linear problem? → linprog
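For the last bullet, a tiny made-up LP: maximize x + 2y subject to x + y ≤ 4, x ≤ 2, x, y ≥ 0 (linprog minimizes, so the objective is negated):

```python
from scipy.optimize import linprog

c = [-1, -2]              # minimize -x - 2y  (i.e. maximize x + 2y)
A_ub = [[1, 1],           # x + y <= 4
        [1, 0]]           # x     <= 2
b_ub = [4, 2]
bounds = [(0, None), (0, None)]  # x, y >= 0

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, res.fun)  # optimum at [0, 4] with objective value -8
```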

The official tutorial is excellent: https://docs.scipy.org/doc/scipy/tutorial/optimize.html

Which kind of optimization problem are you facing right now (or planning to)? Curve fitting noisy data? Constrained minimization? Global search? Root of a system?

Tell me and we’ll build a detailed, copy-paste-ready example tailored to that. 🚀
