Polynomial Regression models the relationship between predictors and outcome using a degree-n polynomial. To do so, it adds powers of the original variables (squares, cubes, and so on) as additional predictors; the form is otherwise the same as linear regression, since the model remains linear in its coefficients.

The most common polynomial regression is a parabolic model of the mean. In other words, the mean is a quadratic function of the feature, while still being a linear function of the coefficients.
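Because the mean is linear in the coefficients, a parabolic model can be fit with ordinary least squares. A minimal sketch with made-up data (the true coefficients below are illustrative):

```python
import numpy as np

# Hypothetical data: y has a parabolic mean in x, plus noise
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0, 0.5, size=x.size)

# Design matrix [1, x, x^2]: the mean is quadratic in x,
# but linear in the coefficients (b0, b1, b2)
X = np.column_stack([np.ones_like(x), x, x**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # close to the true coefficients (1.0, 2.0, -1.5)
```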

where the parameters can themselves be given a distribution, such as a normal distribution. Such priors can be checked with the prior-predictive distribution.
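A minimal sketch of a prior-predictive check for a parabolic mean model, assuming normal priors on the coefficients (the prior scales and variable names here are purely illustrative): draw parameters from the prior, simulate outcomes, and inspect whether the simulated data land on a plausible scale.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 30)

# Illustrative normal priors on the quadratic-model coefficients
n_draws = 1000
b0 = rng.normal(0, 5, n_draws)
b1 = rng.normal(0, 5, n_draws)
b2 = rng.normal(0, 5, n_draws)
sigma = np.abs(rng.normal(0, 1, n_draws))  # half-normal noise scale

# Prior-predictive simulation: parameters from the prior,
# then outcomes given those parameters
y_sim = (b0[:, None] + b1[:, None] * x + b2[:, None] * x**2
         + rng.normal(0, 1, (n_draws, x.size)) * sigma[:, None])

# Check the scale of simulated outcomes against domain knowledge
print(y_sim.shape)
print(np.percentile(y_sim, [5, 95]))
```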

Standardize features before fitting Polynomial Regression: raising a feature to higher powers puts the resulting columns on very different scales, which can make the fit ill-conditioned.
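One way to standardize before the polynomial expansion is a scikit-learn pipeline; the data below are made up for illustration.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler, PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Illustrative data with a cubic signal
rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(40, 1))
y = 2 - X[:, 0] + 0.5 * X[:, 0] ** 3 + rng.normal(0, 0.3, 40)

# Standardize first, so the feature (and hence its powers)
# enter the polynomial expansion on a comparable scale
model = make_pipeline(StandardScaler(),
                      PolynomialFeatures(degree=3),
                      LinearRegression())
model.fit(X, y)
print(model.score(X, y))  # R^2 on the training data
```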

Code example

degree of the features: controls how many new features are created. A degree of 3 adds the square and the cube, i.e. 2 new variables for each input variable (PolynomialFeatures also prepends a constant bias column).

import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
 
# Assign the data to predictor and outcome variables
train_data = pd.DataFrame(np.array([[-0.33532, 6.66854],
                                    [0.02160, 3.86398],
                                    [-1.19438, 5.16161],
                                    [-0.65046, 8.43823],
                                    [-0.28001, 5.57201],
                                    [1.93258, -11.13270],
                                    [1.22620, -5.31226],
                                    [0.74727, -4.63725],
                                    [3.32853, 3.80650],
                                    [2.87457, -6.06084],
                                    [-1.48662, 7.22328],
                                    [0.37629, 2.38887],
                                    [1.43918, -7.13415],
                                    [0.24183, 2.00412],
                                    [-2.79140, 4.29794],
                                    [1.08176, -5.86553],
                                    [2.81555, -5.20711],
                                    [0.54924, -3.52863],
                                    [2.36449, -10.16202],
                                    [-1.01925, 5.31123]]),
                          columns=["Var_X", "Var_Y"])
 
X = train_data["Var_X"].values.reshape(-1, 1)
y = train_data["Var_Y"].values
 
# Create polynomial features
poly_feat = PolynomialFeatures(degree=4)
X_poly = poly_feat.fit_transform(X)
 
# Make and fit the polynomial regression model.
# fit_intercept=False because PolynomialFeatures already adds a bias column
poly_model = LinearRegression(fit_intercept=False).fit(X_poly, y)
poly_model.coef_
 
# Output: array([ 3.37563501, -6.28126025, -2.3787942 , 0.55307182, 0.22699807])
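A point that is easy to miss when predicting with the fitted model: new inputs must pass through the same fitted polynomial transform. A self-contained sketch of this pattern, using made-up data rather than the dataset above:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Illustrative data with a roughly cubic trend
X = np.array([[-1.0], [0.0], [1.0], [2.0], [3.0]])
y = np.array([2.0, 1.0, 2.0, 9.0, 28.0])

poly_feat = PolynomialFeatures(degree=3)
X_poly = poly_feat.fit_transform(X)
poly_model = LinearRegression(fit_intercept=False).fit(X_poly, y)

# New inputs go through the SAME fitted transform before predict
X_new = np.array([[0.5], [1.5]])
preds = poly_model.predict(poly_feat.transform(X_new))
print(preds)
```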