http://www.scipy-lectures.org
==Finding the minimum of a function==
*Methods based on conjugate gradient are named with ‘cg’ in scipy. The simple conjugate gradient method to minimize a function is scipy.optimize.fmin_cg() (see the first sketch after this list).
*In scipy, scipy.optimize.fmin() implements the Nelder-Mead approach, which depends very little on derivatives (see the sketch below).
*Brute force: a grid search.
:scipy.optimize.brute() evaluates the function on a given grid of parameters and returns the parameters corresponding to the minimum value. The parameters are specified with ranges given to numpy.mgrid. By default, 20 steps are taken in each direction (see the sketch below).
*Non-linear least squares: the Levenberg–Marquardt algorithm is implemented in scipy.optimize.leastsq() (see the sketch below).
*If the function is linear in its parameters, this is a linear-algebra problem and should be solved with scipy.linalg.lstsq() (see the last sketch below).
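
A minimal sketch of scipy.optimize.fmin_cg(); the test function f and the starting point [2, 2] are illustrative assumptions, not taken from this page:

<syntaxhighlight lang="python">
from scipy import optimize

# Smooth test function with a single minimum at (1, 1) (illustrative choice).
def f(x):
    return 0.5 * (1 - x[0])**2 + (x[1] - x[0]**2)**2

# Conjugate-gradient minimization from an arbitrary starting guess.
x_min = optimize.fmin_cg(f, [2, 2])
print(x_min)  # approximately [1., 1.]
</syntaxhighlight>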
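
A minimal sketch of the Nelder-Mead simplex search via scipy.optimize.fmin(); the same made-up test function and starting point are reused for illustration:

<syntaxhighlight lang="python">
from scipy import optimize

# Illustrative test function; Nelder-Mead needs only function values,
# no gradient information.
def f(x):
    return 0.5 * (1 - x[0])**2 + (x[1] - x[0]**2)**2

x_min = optimize.fmin(f, [2, 2])
print(x_min)  # close to [1., 1.]
</syntaxhighlight>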
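
A minimal sketch of a grid search with scipy.optimize.brute(); the parameter ranges below are arbitrary, and since no step is given each direction is sampled with the default 20 points:

<syntaxhighlight lang="python">
from scipy import optimize

def f(x):
    return 0.5 * (1 - x[0])**2 + (x[1] - x[0]**2)**2

# One (min, max) range per parameter; with the default Ns=20 the grid has
# 20 points in each direction, and the best grid point is then polished.
ranges = ((-2, 2), (-2, 2))
x_min = optimize.brute(f, ranges)
print(x_min)
</syntaxhighlight>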
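
A minimal sketch of non-linear least squares with scipy.optimize.leastsq(); the exponential model, the synthetic noisy data, and the initial guess are all made-up for illustration:

<syntaxhighlight lang="python">
import numpy as np
from scipy import optimize

# Synthetic data: y = a*exp(-b*t) plus noise (illustrative values a=2.5, b=1.3).
t = np.linspace(0, 5, 50)
rng = np.random.default_rng(0)
y = 2.5 * np.exp(-1.3 * t) + 0.05 * rng.standard_normal(t.size)

# leastsq minimizes the sum of squares of this residual vector
# using the Levenberg–Marquardt algorithm.
def residuals(params):
    a, b = params
    return y - a * np.exp(-b * t)

p_opt, ier = optimize.leastsq(residuals, [1.0, 1.0])
print(p_opt)  # should be close to [2.5, 1.3]
</syntaxhighlight>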
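
A minimal sketch of the linear case with scipy.linalg.lstsq(); the straight-line model and the synthetic data are made-up for illustration:

<syntaxhighlight lang="python">
import numpy as np
from scipy import linalg

# Synthetic data from the line y = 1 + 2*t plus noise (illustrative values).
t = np.linspace(0, 5, 50)
rng = np.random.default_rng(0)
y = 1.0 + 2.0 * t + 0.1 * rng.standard_normal(t.size)

# The model is linear in its coefficients, so the fit is a linear-algebra
# problem: solve A @ c ~= y in the least-squares sense.
A = np.column_stack([np.ones_like(t), t])
coeffs, residues, rank, sv = linalg.lstsq(A, y)
print(coeffs)  # close to [1.0, 2.0]
</syntaxhighlight>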