
scipy.optimize.line_search

scipy.optimize.line_search(f, myfprime, xk, pk, gfk=None, old_fval=None, old_old_fval=None, args=(), c1=0.0001, c2=0.9, amax=50)

Find alpha that satisfies strong Wolfe conditions.

Parameters:

f : callable f(x,*args)

Objective function.

myfprime : callable f'(x,*args)

Objective function gradient.

xk : ndarray

Starting point.

pk : ndarray

Search direction.

gfk : ndarray, optional

Gradient value for x=xk (xk being the current parameter estimate). Will be recomputed if omitted.

old_fval : float, optional

Function value for x=xk. Will be recomputed if omitted.

old_old_fval : float, optional

Function value for the point preceding x=xk. Will be recomputed if omitted.

args : tuple, optional

Additional arguments passed to objective function.

c1 : float, optional

Parameter for Armijo condition rule.

c2 : float, optional

Parameter for curvature condition rule.

amax : float, optional

Maximum step size.

Returns:

alpha : float or None

Alpha for which x_new = xk + alpha * pk, or None if the line search algorithm did not converge.

fc : int

Number of function evaluations made.

gc : int

Number of gradient evaluations made.

new_fval : float or None

New function value f(x_new) = f(xk + alpha * pk), or None if the line search algorithm did not converge.

old_fval : float

Old function value f(xk).

new_slope : float or None

The local slope along the search direction at the new value <myfprime(x_new), pk>, or None if the line search algorithm did not converge.

Notes

Uses the line search algorithm to enforce strong Wolfe conditions. See Wright and Nocedal, ‘Numerical Optimization’, 1999, pg. 59-60.
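
Concretely, with phi(alpha) = f(xk + alpha*pk) and derphi(alpha) = <myfprime(xk + alpha*pk), pk>, the strong Wolfe conditions on the step length alpha are the standard pair (stated here for reference):

phi(alpha) <= phi(0) + c1 * alpha * derphi(0)    (sufficient decrease / Armijo condition, controlled by c1)
|derphi(alpha)| <= c2 * |derphi(0)|              (curvature condition, controlled by c2)

with 0 < c1 < c2 < 1; the defaults are c1 = 1e-4 and c2 = 0.9.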

For the zoom phase it uses an algorithm by [...].
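
Examples

A minimal usage sketch; the quadratic objective, starting point, and search direction below are illustrative choices, not part of this reference.

>>> import numpy as np
>>> from scipy.optimize import line_search
>>> def obj_func(x):
...     return x[0]**2 + x[1]**2
>>> def obj_grad(x):
...     return np.array([2*x[0], 2*x[1]])
>>> xk = np.array([1.8, 1.7])    # current parameter estimate
>>> pk = np.array([-1.0, -1.0])  # a descent direction: obj_grad(xk).dot(pk) < 0
>>> alpha, fc, gc, new_fval, old_fval, new_slope = line_search(obj_func, obj_grad, xk, pk)
>>> alpha
1.0
>>> x_new = xk + alpha * pk      # accepted step: array([0.8, 0.7])

For this simple quadratic the first trial step already satisfies both Wolfe conditions, so alpha = 1.0 is returned; in a descent method the line search would be repeated at each iterate with the current xk and pk.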
