The difference between linear regression and polynomial regression is that in the latter we transform the original explanatory variables so as to create a polynomial relationship between $Y$ and $X$. For simplicity, consider a single feature (explanatory variable) $X$; polynomial regression of degree 2 then looks like this:
$Y = \beta_0 + \beta_1 \cdot X + \beta_2 \cdot X^2$
In sklearn we can do this by instantiating PolynomialFeatures(degree=2) and calling its fit_transform method. Then, to actually train the model, we use sklearn's LinearRegression on the transformed features.
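For concreteness, here is a minimal sketch of that workflow (the data and coefficient values are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# toy data with a single explanatory variable
X = np.arange(1, 11, dtype=float).reshape(-1, 1)
y = 2.0 + 0.5 * X.ravel() + 0.3 * X.ravel() ** 2

# expand X into [X, X^2]; include_bias=False drops the constant column,
# since LinearRegression fits its own intercept
poly = PolynomialFeatures(degree=2, include_bias=False)
X_poly = poly.fit_transform(X)

# ordinary linear regression on the expanded features
model = LinearRegression().fit(X_poly, y)
print(model.intercept_, model.coef_)  # recovers beta_0, beta_1, beta_2
```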
However, one of the assumptions of linear regression is that the features should not be correlated with one another (no multicollinearity). $X$ and $X^2$, on the other hand, are strongly correlated, since one is a deterministic function of the other. So, doesn't this cause any issues with the training process?
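To make the concern concrete, one can check the sample correlation directly (the exact value depends on the chosen range of $X$; this is just a sketch):

```python
import numpy as np

# over a positive range, X and X^2 are very strongly (though not perfectly)
# linearly correlated
X = np.linspace(1, 10, 100)
print(np.corrcoef(X, X ** 2)[0, 1])  # ~0.98 for this range
```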