Overfitting often comes from having too many (or too large) parameters. There are broadly two regularization strategies: shrinking the magnitude of the parameters (Ridge) or driving some parameters exactly to zero (Lasso).
Mathematically, Lasso adds a penalty term to the cost function of ordinary regression: the sum of the absolute values of the coefficients, multiplied by a regularization parameter λ.
Because of the shape of the L1 penalty, Lasso can shrink a slope exactly to 0 and thus encourages zero coefficients (visualization).
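The zeroing behavior can be illustrated with soft-thresholding, the operation that coordinate-descent Lasso solvers apply to each coefficient (this is an illustrative sketch, not the full Lasso fitting procedure):

```python
import numpy as np

def soft_threshold(w, lam):
    # Shrink every coefficient toward 0 by lam, and set any
    # coefficient whose magnitude is <= lam exactly to 0.
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

w = np.array([3.0, -0.2, 0.05, -1.5])
print(soft_threshold(w, 0.5))  # small coefficients become exactly 0.0
```

Unlike plain shrinking, the two small coefficients here land at exactly zero, which is why Lasso produces sparse models.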
Ridge, by contrast, adds a penalty term that is the sum of the squared coefficients, multiplied by a regularization parameter λ.
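Ridge regression has a closed-form solution, w = (XᵀX + λI)⁻¹Xᵀy, which makes the shrinking effect easy to see: as λ grows, the coefficients shrink toward 0 but (unlike Lasso) do not become exactly 0. A minimal sketch with synthetic data (the true coefficients here are made up for illustration):

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: w = (X^T X + lam * I)^{-1} X^T y
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([3.0, -2.0, 1.0]) + 0.1 * rng.normal(size=50)

for lam in (0.0, 1.0, 100.0):
    print(lam, np.round(ridge_fit(X, y, lam), 3))  # norm shrinks as lam grows
```

With λ = 0 this reduces to ordinary least squares; larger λ pulls all three coefficients toward zero.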
Increasing the regularization parameter λ reduces overfitting by shrinking the parameters; for parameters near zero, this effectively removes the influence of the associated features. However, an extremely large λ can lead to underfitting, while a very small λ leaves the overfitting unsolved.
Try different values for λ, each doubling the previous:
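One way to run that sweep is to fit at each λ on a training split and pick the λ with the lowest validation error. A sketch using the ridge closed form (the data, split sizes, and starting value 0.01 are arbitrary choices for illustration):

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: w = (X^T X + lam * I)^{-1} X^T y
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 5))
w_true = np.array([2.0, -1.0, 0.0, 0.0, 0.5])
y = X @ w_true + 0.5 * rng.normal(size=80)

# Hold out the last 20 rows as a validation set.
X_train, X_val = X[:60], X[60:]
y_train, y_val = y[:60], y[60:]

lams = [0.01 * 2 ** k for k in range(12)]  # 0.01, 0.02, 0.04, ..., each doubling
val_errors = []
for lam in lams:
    w = ridge_fit(X_train, y_train, lam)
    val_errors.append(np.mean((X_val @ w - y_val) ** 2))

best_lam = lams[int(np.argmin(val_errors))]
print("best lambda:", best_lam)
```

Doubling covers several orders of magnitude with few fits, which is why geometric grids are the usual choice for regularization sweeps.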