I'm working with data where I can expect both multiplicative and additive error sources in a linear model. In an ideal world I'd like to be able to say that I'm looking at an additive error which is normal with variance $a$ and a multiplicative error which is normal with variance $b$, so that my variance at a given predictor level $x$ is $a + by^2$.
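To write it out explicitly, the generative model I have in mind is something like the following (assuming the multiplicative noise scales the mean rather than the observed $y$, which is my own choice of formalization):

$$
y_i = (\beta_0 + \beta_1 x_i)(1 + \varepsilon_{m,i}) + \varepsilon_{a,i},
\qquad \varepsilon_{a,i} \sim N(0, a),\quad \varepsilon_{m,i} \sim N(0, b),
$$

so that

$$
\operatorname{Var}(y_i \mid x_i) = a + b\,(\beta_0 + \beta_1 x_i)^2 \approx a + b\,y_i^2 .
$$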
I've done some looking and can't come up with a good way of dealing with this. Going back to basics and optimizing over $\beta_0$, $\beta_1$, $a$, and $b$ using the z-scores as a proxy for likelihood fails to converge: pushing $a$ and $b$ toward infinity makes every z-score small, and nothing in that objective penalizes large variances. I'm also concerned that if most of my data has $y > 1$, optimizing over the ratio $a/b$ instead would run into a similar problem and push the ratio to zero, and not for the reasons I want. Is there a better way to compute the likelihood that would help me? Or is what I'm trying to do just not tractable?
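For concreteness, here is a minimal sketch (in Python, with placeholder data and starting values) of the kind of fit I'm after, written with the full Gaussian negative log-likelihood rather than the raw z-score sum, so that the $\tfrac{1}{2}\log(\text{variance})$ term is kept. I'm not sure this is the right formulation, which is really what I'm asking:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, x, y):
    """Full Gaussian negative log-likelihood for a linear mean with
    variance a + b * mu^2 (a and b parameterized on the log scale so
    they stay positive)."""
    b0, b1, log_a, log_b = params
    mu = b0 + b1 * x
    var = np.exp(log_a) + np.exp(log_b) * mu**2
    # The log(var) term is what stops the variances from running off to
    # infinity; a sum of squared z-scores alone omits it.
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (y - mu) ** 2 / var)

# Placeholder data with both additive and multiplicative noise
rng = np.random.default_rng(0)
x = np.linspace(0.1, 10, 200)
mu_true = 2.0 + 0.5 * x
y = mu_true + rng.normal(scale=np.sqrt(0.2 + 0.01 * mu_true**2))

result = minimize(neg_log_likelihood, x0=[0.0, 1.0, 0.0, 0.0],
                  args=(x, y), method="Nelder-Mead")
print(result.x)  # beta0, beta1, log(a), log(b)
```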