How do you use scipy.optimize.minimize when you want to compute the gradient along with the objective function?

Updated: 2022-12-01 14:46:55

You totally can. Just use jac=True:

In [1]: import numpy as np

In [2]: from scipy.optimize import minimize

In [3]: def f_and_grad(x):
   ...:     return x**2, 2*x
   ...: 

In [4]: minimize(f_and_grad, [1], jac=True)
Out[4]: 
      fun: 1.8367099231598242e-40
 hess_inv: array([[ 0.5]])
      jac: array([  2.71050543e-20])
  message: 'Optimization terminated successfully.'
     nfev: 4
      nit: 2
     njev: 4
   status: 0
  success: True
        x: array([  1.35525272e-20])
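
The same pattern pays off for multivariate problems, where the objective and its analytic gradient can share intermediate computations instead of scipy estimating the gradient by finite differences. A minimal sketch (the Rosenbrock function, the starting point, and the method choice are illustrative, not from the original answer):

import numpy as np
from scipy.optimize import minimize

def rosen_and_grad(x):
    # Objective: sum of 100*(x[i+1] - x[i]**2)**2 + (1 - x[i])**2
    diff = x[1:] - x[:-1]**2
    f = np.sum(100.0 * diff**2 + (1.0 - x[:-1])**2)
    # Analytic gradient, reusing the residuals computed above
    grad = np.zeros_like(x)
    grad[:-1] = -400.0 * x[:-1] * diff - 2.0 * (1.0 - x[:-1])
    grad[1:] += 200.0 * diff
    return f, grad

res = minimize(rosen_and_grad, np.zeros(5), jac=True, method='BFGS')
print(res.x)  # converges toward [1, 1, 1, 1, 1]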

It's actually documented:

jac : bool or callable, optional Jacobian (gradient) of objective function. Only for CG, BFGS, Newton-CG, L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg. If jac is a Boolean and is True, fun is assumed to return the gradient along with the objective function. If False, the gradient will be estimated numerically. jac can also be a callable returning the gradient of the objective. In this case, it must accept the same arguments as fun.

(emphasis mine)
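
For comparison, here is a sketch of the alternative described in the last sentence of that quote, with the gradient supplied as a separate callable rather than bundled with the objective (this just splits the f_and_grad example above in two; the names are mine):

import numpy as np
from scipy.optimize import minimize

def f(x):
    return np.sum(x**2)   # objective only

def grad(x):
    return 2 * x          # gradient; must accept the same arguments as f

res = minimize(f, [1], jac=grad)  # jac as a callable instead of jac=True
print(res.x)  # near 0

With this form, the objective and the gradient are evaluated in separate calls, so any work they share is done twice; returning both from a single function with jac=True avoids that duplication.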