
What I'm trying to do is construct a linear model of the form

$$ Y = \beta_0X_0-\beta_1X_1+\beta_2X_2 + \beta_3 $$

where $\beta_0$, $\beta_1$ and $\beta_2$ are the coefficients of the predictors $X_0$, $X_1$ and $X_2$ respectively, and all of them are positive. In other words, my model assumes that $X_1$ has a negative effect on the response $Y$. Perhaps I misunderstand the concept of regression, but if anyone has an idea how to achieve this in R, please enlighten me — or suggest any other approach apart from a regression model. Thanks in advance.
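One way to impose the sign structure directly is to fit the model by nonlinear least squares with box constraints on the parameters. This is only a sketch on simulated data (the variable names and true coefficient values below are made up for illustration); `nls` with `algorithm = "port"` accepts `lower`/`upper` bounds:

```r
# Simulate data consistent with the assumed model:
# Y = b0*X0 - b1*X1 + b2*X2 + b3, with b0, b1, b2 > 0
set.seed(1)
n  <- 200
X0 <- rnorm(n); X1 <- rnorm(n); X2 <- rnorm(n)
Y  <- 1.5 * X0 - 0.8 * X1 + 2.0 * X2 + 0.5 + rnorm(n, sd = 0.3)

# Constrained least squares: the "port" algorithm of nls()
# allows lower bounds, so b0, b1, b2 are forced nonnegative
# while the intercept b3 stays unconstrained.
fit <- nls(Y ~ b0 * X0 - b1 * X1 + b2 * X2 + b3,
           start = c(b0 = 1, b1 = 1, b2 = 1, b3 = 0),
           lower = c(0, 0, 0, -Inf),
           algorithm = "port")
coef(fit)
```

Because $-\beta_1$ multiplies $X_1$ in the model formula, the nonnegativity bound on `b1` is exactly the assumption that $X_1$ affects $Y$ negatively. Note, though, that (as the comments below point out) forcing the sign makes it impossible to *test* that assumption from the fit itself.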

  • The usual approach is to fit a model in your predictors and let the data tell you whether coefficients are positive or negative. Why would you want anything else? It's harder to test an assumption satisfied by force. By the way, you need very strong grounds not to include an intercept. – Nick Cox Nov 26 '14 at 02:58
  • Related: http://stats.stackexchange.com/questions/104890/is-it-possible-in-r-or-in-general-to-force-regression-coefficients-to-be-a-cer?rq=1 – Andrew M Nov 26 '14 at 06:03
  • If you really want to check the assumption that your coefficient is negative, you could use Bayesian estimation with an informative prior "pointing" in the negative direction (the prior assumption you want to check) and compare the results with those obtained using a weakly informative prior. – Tim Nov 26 '14 at 08:05
  • Frequentists too are very happy that they have machinery to test hypotheses (or produce confidence intervals) which bear on the sign of coefficients. – Nick Cox Nov 26 '14 at 09:56
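The unconstrained approach the comments recommend — fit an ordinary linear model and let the t-tests and confidence intervals speak to the sign of each coefficient — can be sketched as follows (again on simulated data with made-up true values):

```r
# Fit an unconstrained linear model and inspect the sign of each
# coefficient, rather than forcing signs by construction.
set.seed(1)
n  <- 200
X0 <- rnorm(n); X1 <- rnorm(n); X2 <- rnorm(n)
Y  <- 1.5 * X0 - 0.8 * X1 + 2.0 * X2 + 0.5 + rnorm(n, sd = 0.3)

fit <- lm(Y ~ X0 + X1 + X2)   # intercept is included by default
summary(fit)                  # t-statistics bear on each coefficient's sign
confint(fit)                  # a CI for X1 entirely below zero supports
                              # the assumed negative effect
```

If the interval for `X1` lies entirely below zero, the data support the assumed negative effect; if it straddles zero, the assumption is not supported — information a sign-constrained fit would have hidden.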

0 Answers