Path: /proc/thread-self/root/opt/alt/python35/lib64/python3.5/site-packages/scipy/optimize/__init__.py

File Content:
""" ===================================================== Optimization and root finding (:mod:`scipy.optimize`) ===================================================== .. currentmodule:: scipy.optimize Optimization ============ Local Optimization ------------------ .. autosummary:: :toctree: generated/ minimize - Unified interface for minimizers of multivariate functions minimize_scalar - Unified interface for minimizers of univariate functions OptimizeResult - The optimization result returned by some optimizers OptimizeWarning - The optimization encountered problems The `minimize` function supports the following methods: .. toctree:: optimize.minimize-neldermead optimize.minimize-powell optimize.minimize-cg optimize.minimize-bfgs optimize.minimize-newtoncg optimize.minimize-lbfgsb optimize.minimize-tnc optimize.minimize-cobyla optimize.minimize-slsqp optimize.minimize-dogleg optimize.minimize-trustncg The `minimize_scalar` function supports the following methods: .. toctree:: optimize.minimize_scalar-brent optimize.minimize_scalar-bounded optimize.minimize_scalar-golden The specific optimization method interfaces below in this subsection are not recommended for use in new scripts; all of these methods are accessible via a newer, more consistent interface provided by the functions above. General-purpose multivariate methods: .. autosummary:: :toctree: generated/ fmin - Nelder-Mead Simplex algorithm fmin_powell - Powell's (modified) level set method fmin_cg - Non-linear (Polak-Ribiere) conjugate gradient algorithm fmin_bfgs - Quasi-Newton method (Broydon-Fletcher-Goldfarb-Shanno) fmin_ncg - Line-search Newton Conjugate Gradient Constrained multivariate methods: .. 
.. autosummary::
   :toctree: generated/

   fmin_l_bfgs_b - Zhu, Byrd, and Nocedal's constrained optimizer
   fmin_tnc - Truncated Newton code
   fmin_cobyla - Constrained optimization by linear approximation
   fmin_slsqp - Minimization using sequential least-squares programming
   differential_evolution - Stochastic minimization using differential evolution

Univariate (scalar) minimization methods:

.. autosummary::
   :toctree: generated/

   fminbound - Bounded minimization of a scalar function
   brent - 1-D function minimization using Brent method
   golden - 1-D function minimization using Golden Section method

Equation (Local) Minimizers
---------------------------

.. autosummary::
   :toctree: generated/

   leastsq - Minimize the sum of squares of M equations in N unknowns
   least_squares - Feature-rich least-squares minimization
   nnls - Linear least-squares problem with non-negativity constraint
   lsq_linear - Linear least-squares problem with bound constraints

Global Optimization
-------------------

.. autosummary::
   :toctree: generated/

   basinhopping - Basinhopping stochastic optimizer
   brute - Brute force searching optimizer
   differential_evolution - Stochastic minimization using differential evolution

Rosenbrock function
-------------------

.. autosummary::
   :toctree: generated/

   rosen - The Rosenbrock function
   rosen_der - The derivative of the Rosenbrock function
   rosen_hess - The Hessian matrix of the Rosenbrock function
   rosen_hess_prod - Product of the Rosenbrock Hessian with a vector

Fitting
=======

.. autosummary::
   :toctree: generated/

   curve_fit -- Fit curve to a set of points

Root finding
============

Scalar functions
----------------

.. autosummary::
   :toctree: generated/

   brentq - quadratic interpolation Brent method
   brenth - Brent method, modified by Harris with hyperbolic extrapolation
   ridder - Ridder's method
   bisect - Bisection method
   newton - Secant method or Newton's method

Fixed point finding:
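The `curve_fit` routine listed under Fitting wraps nonlinear least squares
for fitting a model to data. A minimal sketch (the exponential-decay model,
true parameters, and noise level here are invented for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model: exponential decay with an offset.
def model(x, a, b, c):
    return a * np.exp(-b * x) + c

# Synthetic data generated from known parameters plus Gaussian noise.
rng = np.random.RandomState(0)
xdata = np.linspace(0, 4, 50)
ydata = model(xdata, 2.5, 1.3, 0.5) + 0.05 * rng.standard_normal(xdata.size)

# popt holds the fitted (a, b, c); pcov their estimated covariance.
popt, pcov = curve_fit(model, xdata, ydata, p0=(1.0, 1.0, 1.0))
print(np.round(popt, 2))
```

The square roots of the diagonal of `pcov` give one-standard-deviation
uncertainties on the fitted parameters.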
.. autosummary::
   :toctree: generated/

   fixed_point - Single-variable fixed-point solver

Multidimensional
----------------

General nonlinear solvers:

.. autosummary::
   :toctree: generated/

   root - Unified interface for nonlinear solvers of multivariate functions
   fsolve - Non-linear multi-variable equation solver
   broyden1 - Broyden's first method
   broyden2 - Broyden's second method

The `root` function supports the following methods:

.. toctree::

   optimize.root-hybr
   optimize.root-lm
   optimize.root-broyden1
   optimize.root-broyden2
   optimize.root-anderson
   optimize.root-linearmixing
   optimize.root-diagbroyden
   optimize.root-excitingmixing
   optimize.root-krylov
   optimize.root-dfsane

Large-scale nonlinear solvers:

.. autosummary::
   :toctree: generated/

   newton_krylov
   anderson

Simple iterations:

.. autosummary::
   :toctree: generated/

   excitingmixing
   linearmixing
   diagbroyden

:mod:`Additional information on the nonlinear solvers <scipy.optimize.nonlin>`

Linear Programming
==================

Simplex Algorithm:

.. autosummary::
   :toctree: generated/

   linprog -- Linear programming using the simplex algorithm
   linprog_verbose_callback -- Sample callback function for linprog

The `linprog` function supports the following methods:

.. toctree::

   optimize.linprog-simplex

Assignment problems:

.. autosummary::
   :toctree: generated/

   linear_sum_assignment -- Solves the linear-sum assignment problem

Utilities
=========
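The unified `root` interface for multidimensional solvers mirrors `minimize`.
A minimal sketch on an assumed two-equation system (the system and starting
point are illustrative, not taken from this file):

```python
import numpy as np
from scipy.optimize import root

# Illustrative nonlinear system:
#   x0 * cos(x1) = 4
#   x0 * x1 - x1 = 5
def fun(x):
    return [x[0] * np.cos(x[1]) - 4.0,
            x[0] * x[1] - x[1] - 5.0]

# 'hybr' is the default method (a modified Powell hybrid from MINPACK).
sol = root(fun, x0=[1.0, 1.0], method='hybr')
print(sol.success, np.round(sol.x, 4))
```

As with `minimize`, the `method` string selects among the solvers listed in
the toctree above, and the result is returned as an `OptimizeResult`.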
.. autosummary::
   :toctree: generated/

   approx_fprime - Approximate the gradient of a scalar function
   bracket - Bracket a minimum, given two starting points
   check_grad - Check the supplied derivative using finite differences
   line_search - Return a step that satisfies the strong Wolfe conditions
   show_options - Show specific options for optimization solvers
   LbfgsInvHessProduct - Linear operator for L-BFGS approximate inverse Hessian

"""
from __future__ import division, print_function, absolute_import

from .optimize import *
from ._minimize import *
from ._root import *
from .minpack import *
from .zeros import *
from .lbfgsb import fmin_l_bfgs_b, LbfgsInvHessProduct
from .tnc import fmin_tnc
from .cobyla import fmin_cobyla
from .nonlin import *
from .slsqp import fmin_slsqp
from .nnls import nnls
from ._basinhopping import basinhopping
from ._linprog import linprog, linprog_verbose_callback
from ._hungarian import linear_sum_assignment
from ._differentialevolution import differential_evolution
from ._lsq import least_squares, lsq_linear

__all__ = [s for s in dir() if not s.startswith('_')]

from numpy.testing import Tester
test = Tester().test
bench = Tester().bench
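Among the utilities above, `check_grad` is handy when supplying an analytic
gradient to a solver: it compares the gradient against a finite-difference
approximation of the function. A minimal sketch using the built-in Rosenbrock
helpers (the test point is chosen arbitrarily for illustration):

```python
import numpy as np
from scipy.optimize import check_grad, rosen, rosen_der

# check_grad returns the 2-norm of the difference between the supplied
# gradient and a finite-difference estimate; a small value indicates the
# analytic derivative is consistent with the function.
x0 = np.array([1.3, 0.7, 0.8])
err = check_grad(rosen, rosen_der, x0)
print(err)  # small (on the order of finite-difference truncation error)
```

A large return value here usually points to a bug in the hand-written
gradient rather than in the objective itself.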
Directory listing of scipy/optimize/:

Name                                            Size          Permission
__pycache__/                                    ---           0755
_lsq/                                           ---           0755
tests/                                          ---           0755
__init__.py                                     6526 bytes    0644
_basinhopping.py                                26053 bytes   0644
_cobyla.cpython-35m-x86_64-linux-gnu.so         128896 bytes  0755
_differentialevolution.py                       32558 bytes   0644
_group_columns.cpython-35m-x86_64-linux-gnu.so  141704 bytes  0755
_hungarian.py                                   9379 bytes    0644
_lbfgsb.cpython-35m-x86_64-linux-gnu.so         133600 bytes  0755
_linprog.py                                     37305 bytes   0644
_minimize.py                                    26435 bytes   0644
_minpack.cpython-35m-x86_64-linux-gnu.so        131488 bytes  0755
_nnls.cpython-35m-x86_64-linux-gnu.so           46656 bytes   0755
_numdiff.py                                     21209 bytes   0644
_root.py                                        26007 bytes   0644
_slsqp.cpython-35m-x86_64-linux-gnu.so          100568 bytes  0755
_spectral.py                                    7986 bytes    0644
_trustregion.py                                 8598 bytes    0644
_trustregion_dogleg.py                          4449 bytes    0644
_trustregion_ncg.py                             4646 bytes    0644
_tstutils.py                                    1322 bytes    0644
_zeros.cpython-35m-x86_64-linux-gnu.so          15760 bytes   0755
cobyla.py                                       9915 bytes    0644
lbfgsb.py                                       18069 bytes   0644
linesearch.py                                   24201 bytes   0644
minpack.py                                      30534 bytes   0644
minpack2.cpython-35m-x86_64-linux-gnu.so        47448 bytes   0755
moduleTNC.cpython-35m-x86_64-linux-gnu.so       40336 bytes   0755
nnls.py                                         1423 bytes    0644
nonlin.py                                       46681 bytes   0644
optimize.py                                     101006 bytes  0644
setup.py                                        3267 bytes    0644
slsqp.py                                        17983 bytes   0644
tnc.py                                          16533 bytes   0644
zeros.py                                        19912 bytes   0644