
What are some good books or other material that explain how to do analysis (hypothesis testing, confidence intervals, consistency) of a regression with constraints? For example, $$y \sim \alpha x_1 + \beta x_2$$ subject to $$f(\alpha) + g(\beta) = 1.$$ In principle, I am also interested in non-linear regressions. I am aware of the following pdf about regression with Lagrange multipliers, but I was more interested in a book.

  • The standard analysis wouldn't change; the only difference would be using constrained optimisers to find the values of $\alpha, \beta$, e.g., quadratic programming. – patagonicus May 22 '22 at 10:05
  • It depends on the kind of constraint. A linear equality like the example here doesn't change anything. You can see this by substituting $\beta=1-\alpha$ in the model, which yields a comparable model with one fewer parameter. What matters is whether (1) any constraints are nonlinear and (2) any are inequalities. Finally, "non-linear regression" has several different meanings: please see https://stats.stackexchange.com/questions/148638 for an account of the distinctions. – whuber May 22 '22 at 12:37
  • Thank you. But how exactly can I perform statistical inference under a non-linear constraint on the parameters? – dart_kaide May 22 '22 at 17:03
  • If you want to study the problem in such generality, then seek out books on constrained optimization. – whuber May 22 '22 at 20:40
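The substitution trick from the comments can be made concrete. The sketch below (not from the thread; the data and coefficient values are illustrative assumptions) fits $y = \alpha x_1 + \beta x_2$ subject to the linear equality $\alpha + \beta = 1$ by substituting $\beta = 1 - \alpha$, which turns the constrained problem into ordinary one-parameter least squares on $y - x_2 = \alpha(x_1 - x_2)$, so the usual t-based inference applies directly:

```python
# Sketch: constrained regression y = a*x1 + b*x2 with a + b = 1,
# handled by substituting b = 1 - a (the trick mentioned in the comments).
# Data-generating values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
a_true, b_true = 0.3, 0.7          # chosen so that a_true + b_true = 1
y = a_true * x1 + b_true * x2 + rng.normal(scale=0.1, size=n)

# Substituted model: regress (y - x2) on z = (x1 - x2), no intercept.
z = x1 - x2
a_hat = np.sum(z * (y - x2)) / np.sum(z * z)
b_hat = 1.0 - a_hat                # constraint recovers the second coefficient

# Standard one-parameter OLS inference: residual variance and SE of a_hat.
# Since b_hat = 1 - a_hat, its standard error equals that of a_hat.
resid = (y - x2) - a_hat * z
sigma2 = np.sum(resid**2) / (n - 1)
se_a = np.sqrt(sigma2 / np.sum(z * z))

print(a_hat, b_hat, se_a)
```

For a genuinely non-linear equality constraint, the same idea of reparameterising can sometimes apply; otherwise one typically maximises the likelihood under the constraint with a constrained optimiser (e.g. `scipy.optimize.minimize` with an equality constraint) and bases inference on constrained asymptotics, which is where the books on constrained optimisation suggested above come in.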

0 Answers