I'm trying to compare the statistical evaluation of neural networks and linear regression. From some articles I found, the degrees of freedom (DOF) increase with model complexity for neural networks:
"Degrees of freedom, defined as parameter counts, have been frequently used in model selection."-- https://auai.org/uai2016/proceedings/papers/257.pdf
https://deeplearning.web.unc.edu/files/2016/10/Degrees-of-Freedom-in-Deep-Neural-Networks-PPT.pdf
"It is much smaller than the total number of parameters..." -- https://link.springer.com/chapter/10.1007/978-3-540-70981-7_26
while in linear regression it decreases with model complexity (residual DOF = number of observations minus number of fitted terms, n - p), which many sources confirm. "In general, the degrees of freedom of an estimate of a parameter are equal to the number of independent scores that go into the estimate minus the number of parameters used as intermediate steps in the estimation of the parameter itself" -- https://en.wikipedia.org/wiki/Degrees_of_freedom_(statistics)
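To make the linear-regression side concrete, here is a small NumPy sketch (the data and shapes are made up for illustration) showing the two counts that coexist there: the model DOF tr(H) = p, which grows with complexity, and the residual DOF n - p from the Wikipedia definition, which shrinks:

```python
import numpy as np

# Hypothetical toy data: n = 50 observations, p = 4 coefficients
# (including the intercept column).
rng = np.random.default_rng(0)
n, p = 50, 4
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
y = X @ rng.standard_normal(p) + rng.standard_normal(n)

# Hat matrix H = X (X^T X)^{-1} X^T maps y to the fitted values.
H = X @ np.linalg.solve(X.T @ X, X.T)

model_df = np.trace(H)        # model DOF: trace(H) = p = 4
residual_df = n - model_df    # residual DOF: n - p = 46 (Wikipedia's notion)
print(model_df, residual_df)  # ~4.0, ~46.0
```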
So given the same amount of training data, a larger DOF means less complexity for linear regression but more complexity for a neural network. Why does the same term have two different meanings? This makes it hard to compare the two modeling methods.