I went through this thread on degrees of freedom: How to understand degrees of freedom?, and the great answers in it, but then I read the following on Wikipedia in the article on regression:
> **Statistical assumptions**
>
> When the number of measurements, N, is larger than the number of unknown parameters, k, and the measurement errors εi are normally distributed then the excess of information contained in (N − k) measurements is used to make statistical predictions about the unknown parameters. This excess of information is referred to as the degrees of freedom of the regression.
Given this definition, if $N$ increases, the degrees of freedom increase as well, but intuitively that should make the problem *more* constrained (we have more information per parameter). Why is $N - k$ then called the degrees of freedom, rather than the other way around, e.g. $k - N$?
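To make the quantity concrete, here is a minimal sketch (hypothetical data, using NumPy) of where $N - k$ shows up in practice: fitting a line to $N$ points estimates $k = 2$ parameters, and the unbiased estimate of the error variance divides the residual sum of squares by $N - k$, not $N$.

```python
import numpy as np

# Hypothetical example: fit y = b0 + b1*x by least squares.
rng = np.random.default_rng(0)
N = 20                                  # number of measurements
x = np.linspace(0.0, 1.0, N)
y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=N)

X = np.column_stack([np.ones(N), x])    # design matrix
k = X.shape[1]                          # number of unknown parameters (here 2)

beta, residual_ss, rank, _ = np.linalg.lstsq(X, y, rcond=None)

df = N - k                              # residual degrees of freedom
# Dividing by df (not N) corrects for the k parameters already
# "used up" in fitting the model:
sigma2_hat = residual_ss[0] / df
print(f"N={N}, k={k}, degrees of freedom={df}")
```

So the $N - k$ "excess" measurements are what is left over to assess the error after the fit has absorbed $k$ of them; more data means more leftover freedom for that assessment, not less.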