The two Wolfe conditions are used in inexact line search to determine a step size that balances making significant progress in minimizing the objective function against maintaining stability. They lead to faster convergence than fixed-step-size methods, but they are more computationally expensive than backtracking line search.

Conditions

Armijo’s condition (sufficient decrease)

This condition guarantees that the step size results in a meaningful reduction of the objective function, avoiding steps that are too small, which could lead to very slow convergence:

$$f(x_k + \alpha p_k) \le f(x_k) + c_1 \alpha \nabla f(x_k)^\top p_k$$

Where:

  • $\alpha$ is the step size.
  • $p_k$ is the descent direction (e.g., the negative gradient).
  • $\nabla f(x_k)^\top p_k$ represents the directional derivative (the inner product of the gradient with the descent direction, essentially the slope at $x_k$ along $p_k$).
  • $c_1 \in (0, 1)$ is a constant that controls the strictness of the condition (a common choice is $c_1 = 10^{-4}$).
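
As a quick sketch, the Armijo check can be written directly from the inequality above (the helper name `armijo_satisfied` is illustrative, and `f`, `grad_f` are assumed to be callables on NumPy-style vectors):

```python
def armijo_satisfied(f, grad_f, x, p, alpha, c1=1e-4):
    """Sufficient decrease check: f(x + a*p) <= f(x) + c1 * a * grad_f(x)^T p."""
    slope = grad_f(x) @ p  # directional derivative at x along p (negative for descent)
    return f(x + alpha * p) <= f(x) + c1 * alpha * slope
```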

Curvature condition

This condition ensures that the step size is not too large, preventing overshooting the minimum. It requires that the slope at the new point, $\nabla f(x_k + \alpha p_k)^\top p_k$, is not too steep in the descent direction:

$$\nabla f(x_k + \alpha p_k)^\top p_k \ge c_2 \nabla f(x_k)^\top p_k$$

(The strong Wolfe variant bounds the magnitude instead: $|\nabla f(x_k + \alpha p_k)^\top p_k| \le c_2 |\nabla f(x_k)^\top p_k|$.)

Where:

  • $\nabla f(x_k + \alpha p_k)$ is the gradient after taking the step.
  • $c_2 \in (c_1, 1)$ is a constant that controls how much of the slope must be preserved (typical values are $0.9$ for quasi-Newton methods and $0.1$ for nonlinear conjugate gradient).
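
Putting the two inequalities together, a minimal combined check might look like this sketch (names are illustrative; `x` and `p` are assumed to be NumPy-style vectors):

```python
def wolfe_satisfied(f, grad_f, x, p, alpha, c1=1e-4, c2=0.9):
    """Check both Wolfe conditions for a trial step size alpha."""
    slope0 = grad_f(x) @ p  # slope at the current point (negative for descent)
    x_new = x + alpha * p
    sufficient_decrease = f(x_new) <= f(x) + c1 * alpha * slope0
    flat_enough = grad_f(x_new) @ p >= c2 * slope0  # new slope is less negative
    return sufficient_decrease and flat_enough
```

Note that the curvature check costs an extra gradient evaluation at the trial point, which is the expense mentioned below.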

Checking the curvature condition for a complicated function is expensive, since it requires an additional gradient evaluation at every trial step, so backtracking line search (which checks only the sufficient decrease condition) is often preferred.
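
As a sketch of that cheaper alternative, a backtracking search enforces only the Armijo condition by shrinking the step geometrically (the function name and constants are illustrative):

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, rho=0.5, c1=1e-4):
    """Shrink alpha until the sufficient decrease condition holds."""
    alpha = alpha0
    slope = grad_f(x) @ p  # must be negative for a descent direction
    while f(x + alpha * p) > f(x) + c1 * alpha * slope:
        alpha *= rho  # backtrack
    return alpha

# Example: one steepest descent step on f(x) = ||x||^2
f = lambda x: x @ x
grad_f = lambda x: 2 * x
x = np.array([3.0, -4.0])
p = -grad_f(x)  # steepest descent direction
alpha = backtracking_line_search(f, grad_f, x, p)  # 0.5 here, landing at the minimum
```

For a line search that enforces both (strong) Wolfe conditions, SciPy provides scipy.optimize.line_search, which exposes the two constants as its c1 and c2 parameters.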