
I'm working with data where I expect both multiplicative and additive error sources in a linear model. Ideally I'd like to say that the additive error is normal with variance $a$ and the multiplicative error is normal with standard deviation proportional to $y$, so my variance at a given predictor level $x$ is $a+by^2$.
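To make the model concrete, here is a quick simulation of this error structure (a sketch only: the parameter values are hypothetical, and the multiplicative term is driven by the linear mean):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values, chosen only to illustrate the model
beta0, beta1 = 1.0, 2.0
a, b = 0.25, 0.04          # additive and multiplicative variance components

x = np.linspace(0.0, 10.0, 200)
mu = beta0 + beta1 * x                 # linear mean
sigma = np.sqrt(a + b * mu**2)         # total SD from both error sources
y = rng.normal(mu, sigma)              # noise grows with the mean
```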

I've done some looking and I can't come up with a good way of dealing with this. Going back to basics and optimizing over $\beta_0$, $\beta_1$, $a$, and $b$ using the z-score as a proxy for likelihood would fail to converge, since it would push $a$ and $b$ infinitely large. I'm also concerned that if most of my data has $y>1$, then optimizing over the ratio $a/b$ would be pushed to zero by similar issues, and not for the reasons I want. Is there a better way to compute the likelihood that would help me? Or is what I'm trying to do simply not tractable?

1 Answer


This is a difficult problem, but I know of two ways you might proceed:

1) Attempt a Bayesian approach. That is, specify a prior for $a$ and $b$ and any other parameters, and determine their posterior distribution given the data (you will probably have to use a Monte Carlo technique to do that).
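As a sketch of option 1, here is a minimal random-walk Metropolis sampler for the posterior of $(\beta_0, \beta_1, a, b)$ under the $a+by^2$ variance model. The priors, step size, iteration count, and simulated "true" values are all assumptions chosen for illustration, not a recommendation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data under assumed "true" values (for illustration only)
beta0, beta1, a_true, b_true = 1.0, 2.0, 0.25, 0.04
x = np.linspace(0.0, 10.0, 200)
mu_true = beta0 + beta1 * x
y = rng.normal(mu_true, np.sqrt(a_true + b_true * mu_true**2))

def log_post(theta):
    """Unnormalised log-posterior. a and b are sampled on the log scale
    so positivity is automatic."""
    b0, b1, la, lb = theta
    a, b = np.exp(la), np.exp(lb)
    mu = b0 + b1 * x
    var = a + b * mu**2
    loglik = -0.5 * np.sum(np.log(2 * np.pi * var) + (y - mu) ** 2 / var)
    # Weakly informative priors (an assumption): N(0, 10^2) on beta0, beta1
    # and N(0, 2^2) on log a, log b
    logprior = -(b0**2 + b1**2) / 200 - (la**2 + lb**2) / 8
    return loglik + logprior

def metropolis(n_iter=30000, step=0.05, seed=1):
    rng = np.random.default_rng(seed)
    theta = np.zeros(4)
    lp = log_post(theta)
    draws = np.empty((n_iter, 4))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(4)
        lp_prop = log_post(prop)
        # Accept uphill moves always, downhill moves with probability
        # exp(lp_prop - lp)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        draws[i] = theta
    return draws

draws = metropolis()
post_mean = draws[15000:].mean(axis=0)   # discard burn-in
```

In practice a probabilistic-programming tool with an adaptive sampler would be a better choice than this hand-rolled random walk, but the structure of the posterior is the same.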

2) Penalised regression. Attach a penalty to the size of $a$ and $b$ to keep them from diverging during fitting. You might be able to adapt the LASSO to suit your needs.
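For option 2, a minimal sketch of a penalised fit: minimise the Gaussian negative log-likelihood for the $a+by^2$ variance model plus a ridge-style penalty on $a$ and $b$ (a hypothetical penalty choice; adapting the LASSO's $\ell_1$ penalty would mean $\lambda(a+b)$ instead). The simulated data and the value of $\lambda$ are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated data under assumed "true" values (for illustration only)
beta0, beta1, a_true, b_true = 1.0, 2.0, 0.25, 0.04
x = np.linspace(0.0, 10.0, 200)
mu_true = beta0 + beta1 * x
y = rng.normal(mu_true, np.sqrt(a_true + b_true * mu_true**2))

def penalised_nll(theta, lam):
    """Gaussian negative log-likelihood plus an L2 penalty on the
    variance components; a and b live on the log scale for positivity."""
    b0, b1, la, lb = theta
    a, b = np.exp(la), np.exp(lb)
    mu = b0 + b1 * x
    var = a + b * mu**2
    nll = 0.5 * np.sum(np.log(2 * np.pi * var) + (y - mu) ** 2 / var)
    return nll + lam * (a**2 + b**2)   # penalty keeps a and b from diverging

# Start the mean parameters at the OLS fit, the variance parameters at 1
slope, intercept = np.polyfit(x, y, 1)
res = minimize(penalised_nll, x0=[intercept, slope, 0.0, 0.0],
               args=(1.0,), method="Nelder-Mead",
               options={"maxiter": 5000})
b0_hat, b1_hat = res.x[:2]
a_hat, b_hat = np.exp(res.x[2:])
```

Note that $\lambda$ would still need to be chosen, e.g. by cross-validation, just as in ordinary penalised regression.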

JDL