This question is about applying power transformations to an input time series.
In the literature, the stated motivation for transforming an input time series is to stabilize its variance. However, I have also seen that power transformations can make a time series approximately normally distributed.
My question: for linear models (such as ARIMA or Prophet), can we argue that making the input series approximately normally distributed increases the probability that the residuals will also be normally distributed? If so, stabilizing the variance would not be the only reason for applying a transformation.
Or are making the input series normal and stabilizing its variance related to each other in some way?
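For concreteness, here is a minimal sketch (Python) of the comparison I have in mind. The data are made up (a trend with multiplicative, level-dependent noise) and the ARIMA order is arbitrary; the idea is simply to Box-Cox the series, fit the same model to the raw and transformed series, and compare residual normality.

```python
import numpy as np
from scipy import stats
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

# Toy series: upward trend with noise whose scale grows with the level,
# so the raw series has non-constant variance and a right-skewed distribution.
n = 300
level = np.exp(0.01 * np.arange(n))
y = level * np.exp(0.2 * rng.standard_normal(n))  # strictly positive

# Power transformation: Box-Cox with lambda chosen by maximum likelihood.
y_bc, lam = stats.boxcox(y)
print(f"estimated Box-Cox lambda: {lam:.3f}")

# Fit the same linear model to the raw and the transformed series,
# then test the residuals for normality (Jarque-Bera).
for name, series in [("raw", y), ("box-cox", y_bc)]:
    res = ARIMA(series, order=(1, 1, 1)).fit()
    jb = stats.jarque_bera(res.resid)
    print(f"{name:8s}  Jarque-Bera p-value on residuals: {jb.pvalue:.3f}")
```

In this toy setup the transformation does both things at once: it removes the level-dependence of the noise (variance stabilization) and makes the marginal distribution much closer to normal, which is partly why I wonder whether the two effects are really separate.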