The step size is a component of many numerical approximation methods, including descent methods for minimizing a function. It can vary from iteration to iteration of the method.
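
As an illustration of a step size that changes at each iteration, the sketch below picks the step by backtracking (Armijo) line search, so it depends on the function's values at trial points. The helper name backtracking_step and the parameters t0, beta, and c are illustrative choices, not part of the text above.

```python
import numpy as np

def backtracking_step(f, grad_x, x, t0=1.0, beta=0.5, c=1e-4):
    """Choose a step size by backtracking (Armijo) line search.

    The chosen step depends on the function's values near x, so it can
    differ at every iteration of the descent method.
    """
    t = t0
    # Shrink t until f decreases by at least c * t * ||grad||^2.
    while f(x - t * grad_x) > f(x) - c * t * np.dot(grad_x, grad_x):
        t *= beta
    return t

# One descent iteration on f(x) = x^2 starting from x = 5.
f = lambda x: float(np.dot(x, x))
x = np.array([5.0])
g = 2 * x                        # gradient of f at x
t = backtracking_step(f, g, x)   # step size chosen for this iteration
x_next = x - t * g
print(t, x_next)
```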

The step size is commonly called the learning rate when it is a fixed step length, or when it is scaled in a way that does not depend on the function. The term learning rate is common in the context of gradient descent.
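
A minimal sketch of gradient descent with a fixed learning rate, in contrast to the line-search example above; the function name gradient_descent and the parameters learning_rate and num_iters are illustrative, and the quadratic objective is only an example.

```python
import numpy as np

def gradient_descent(grad, x0, learning_rate=0.1, num_iters=100):
    """Minimize a function by following its negative gradient.

    grad          -- callable returning the gradient at a point
    x0            -- starting point (NumPy array)
    learning_rate -- fixed step size, the same at every iteration
    num_iters     -- number of iterations to run
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        # The step length does not depend on the function being minimized.
        x = x - learning_rate * grad(x)
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3.0), x0=np.array([0.0]))
print(minimum)  # approaches 3.0
```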