Backtracking line search is an inexact line search method that finds a suitable step size by iteratively checking the first Wolfe condition, i.e. the sufficient decrease condition, also known as Armijo's condition.

Algorithm

Inputs: objective function $f$, gradient $\nabla f$, current point $x_i$, descent direction $d$, initial step size $\alpha_0$, scaling constant $p \in (0, 1)$, and constant $c \in (0, 1)$

Iteratively reduce the step size by multiplying by the constant $p$ until the sufficient decrease condition is met:

$$f(x_i + \alpha d) \le f(x_i) + c\,\alpha\,\nabla f(x_i)^\top d$$

where $c \in (0, 1)$ is a constant that controls the strictness of the sufficient decrease condition: $c$ close to 1 demands nearly the full decrease predicted by the linear model (stricter), while $c$ close to 0 accepts almost any decrease.

Notation: Armijo's condition and the sufficient decrease condition are two names for the same inequality.

Output: step size $\alpha$ satisfying the sufficient decrease condition

Code

import numpy as np
 
def backtracking_line_search(f, grad_f, x_i, d, alpha=1.0, p=0.5, c=1e-4):
    """
    Backtracking line search algorithm to find the optimal step size alpha.
    
    Inputs:
    - f: Objective function
    - grad_f: Gradient of the objective function
    - x_i: Current point (numpy array)
    - d: Descent direction (numpy array)
    - alpha: Initial step size (default is 1.0)
    - p: Scaling factor to reduce alpha (default is 0.5)
    - c: Constant for the Armijo condition (default is 1e-4)
    
    Output:
    - alpha: Step size that satisfies the Armijo condition
    """
    
    while f(x_i + alpha * d) > f(x_i) + c * alpha * np.dot(grad_f(x_i), d):
        alpha *= p  # Reduce alpha by factor p
        
    return alpha
 
def f(x):
    return x[0]**2 + x[1]**2  # f(x, y) = x^2 + y^2
 
def grad_f(x):
    return np.array([2*x[0], 2*x[1]])  # Gradient of f(x, y) = [2x, 2y]
 
x_i = np.array([2, 1])
d = -grad_f(x_i)
 
alpha = backtracking_line_search(f, grad_f, x_i, d)
print(f"Accepted step size: {alpha}")  # backtracking returns an acceptable, not optimal, alpha
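As a sketch of how the line search is typically used (the `gradient_descent` driver below is illustrative and not part of the original notes), it can pick the step size at every iteration of steepest descent:

```python
import numpy as np

def backtracking_line_search(f, grad_f, x_i, d, alpha=1.0, p=0.5, c=1e-4):
    # Shrink alpha until the Armijo (sufficient decrease) condition holds
    while f(x_i + alpha * d) > f(x_i) + c * alpha * np.dot(grad_f(x_i), d):
        alpha *= p
    return alpha

def gradient_descent(f, grad_f, x0, tol=1e-8, max_iter=1000):
    """Minimize f from x0, choosing each step size by backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:  # gradient ~ 0: stop
            break
        d = -g                       # steepest-descent direction
        alpha = backtracking_line_search(f, grad_f, x, d)
        x = x + alpha * d
    return x

f = lambda x: x[0]**2 + x[1]**2
grad_f = lambda x: np.array([2 * x[0], 2 * x[1]])
x_star = gradient_descent(f, grad_f, [2.0, 1.0])
print(x_star)
```

Because each iteration restarts from the initial `alpha`, the search adapts the step size to the local geometry at every point rather than committing to one global learning rate.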