Say theory tells me that
$$ y = f(x_1,x_2|\theta_f) $$
where $\theta_f$ is a set of parameters.
Similarly, theory tells me that
$$ y = g(x_1,x_3|\theta_g) $$
where $\theta_g$ is another set of parameters. There might be common parameters between the two sets.
Estimating both equations jointly is informative. But I can also combine them to produce a third equation:
$$ y = h(x_2,x_3|\theta_h)$$
with $\theta_h$ being a combination of the other two sets.
Can I also estimate the third equation, in order to add further overidentifying restrictions to the problem? Or is the third equation uninformative? How can you tell what is informative and what is not?
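To fix ideas, joint estimation of two equations sharing a parameter can be sketched as stacked least squares with a cross-equation restriction. This is a minimal simulated sketch: the linear functional forms and all parameter values are illustrative assumptions, not part of the question.

```python
import numpy as np

# Hypothetical linear forms sharing the parameter theta:
#   eq. f:  y = theta*x1 + c2*x2 + noise
#   eq. g:  y = theta*x1 + c3*x3 + noise
# All names and values are illustrative assumptions.
rng = np.random.default_rng(3)
theta, c2, c3 = 0.5, 1.2, -0.7
n = 500
x1, x2, x3 = rng.normal(size=(3, n))
y_f = theta * x1 + c2 * x2 + 0.1 * rng.normal(size=n)
y_g = theta * x1 + c3 * x3 + 0.1 * rng.normal(size=n)

# Stack the two equations and impose the cross-equation restriction
# that theta is the same in both row blocks.
X = np.block([
    [x1[:, None], x2[:, None], np.zeros((n, 1))],
    [x1[:, None], np.zeros((n, 1)), x3[:, None]],
])
coef, *_ = np.linalg.lstsq(X, np.concatenate([y_f, y_g]), rcond=None)
print(coef)  # approximately [theta, c2, c3]
```

Pooling both equations uses the information about $\theta$ in each; that is what makes estimating both of them informative.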
For example, imagine you want to estimate a production function of the form:
$$ y = x^a_1 x^b_2 $$
Say theory tells you that the first-order condition with respect to $x_1$ sets the marginal product equal to the known factor price, $p_1$. That is:
$$ p_1 = \frac{\partial y}{\partial x_1} = a\frac{y}{x_1} $$
Rearranging, we have a new equation:
$$ y = \frac{p_1 x_1}{a} $$
The system is now overidentified because I can estimate two parameters that are functions of $a$.
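As an illustration, the first-order condition alone already identifies $a$: since $p_1 = a y / x_1$, we have $\log y = -\log a + \log(p_1 x_1)$, which can be estimated by OLS. This is a minimal simulated sketch; the parameter values and variable names are assumptions.

```python
import numpy as np

# Simulated sketch: all names and parameter values are illustrative assumptions.
rng = np.random.default_rng(0)
a_true, b_true = 0.3, 0.6
n = 1000

x1 = rng.uniform(1.0, 5.0, n)
x2 = rng.uniform(1.0, 5.0, n)
y = x1**a_true * x2**b_true
p1 = a_true * y / x1          # factor price implied by the FOC

# The FOC p1 = a*y/x1 implies log(y) = -log(a) + log(p1*x1),
# so an OLS regression of log(y) on log(p1*x1) should return
# a slope of 1 and an intercept of -log(a).
X = np.column_stack([np.ones(n), np.log(p1 * x1)])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
a_hat = np.exp(-coef[0])
print(a_hat)  # recovers a_true up to numerical precision
```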
Furthermore, substituting $x_1 = a y / p_1$ from the first-order condition into the production function gives
$$ y = a^{\frac{a}{1-a}} p_1^{\frac{-a}{1-a}} x_2^{\frac{b}{1-a}} $$
Thus, my system has now three equations:
$$ y = x^a_1 x^b_2 $$ $$ y = \frac{p_1 x_1}{a} $$ $$ y = a^{\frac{a}{1-a}} p_1^{\frac{-a}{1-a}} x_2^{\frac{b}{1-a}} $$
where I have even more overidentifying restrictions.
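One way to probe whether the derived equation carries new information is to note that it is an algebraic identity whenever the first two equations hold. A quick numerical check (simulated, with assumed parameter values) makes this concrete:

```python
import numpy as np

# Simulated sketch with assumed parameter values.
rng = np.random.default_rng(1)
a, b = 0.3, 0.6
x1 = rng.uniform(1.0, 5.0, 500)
x2 = rng.uniform(1.0, 5.0, 500)

y = x1**a * x2**b           # equation 1: production function
p1 = a * y / x1             # equation 2: FOC, defines the factor price

# Equation 3, obtained by substituting x1 = a*y/p1 into equation 1:
y3 = a**(a / (1 - a)) * p1**(-a / (1 - a)) * x2**(b / (1 - a))

# If equations 1 and 2 hold exactly, equation 3 holds exactly as well:
print(np.max(np.abs(y3 - y)))  # ~0 up to floating-point error
```

When the data satisfy the first two equations exactly, the third fits with zero residual by construction, which is one sense in which it may add no independent information.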
Moreover, one might combine the first two through $y$, in order to produce
$$ p_1 = a x_1^{a-1} x_2^b $$
which can also be estimated (?), producing further overidentifying restrictions.
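This last equation makes the overidentification tangible: in logs, $\log p_1 = \log a + (a-1)\log x_1 + b \log x_2$, so an unrestricted OLS fit returns $a$ twice, once through the intercept and once through the slope on $\log x_1$. A minimal simulated sketch (parameter values are assumptions):

```python
import numpy as np

# Simulated sketch; parameter values are assumptions for illustration.
rng = np.random.default_rng(2)
a, b = 0.3, 0.6
n = 1000
x1 = rng.uniform(1.0, 5.0, n)
x2 = rng.uniform(1.0, 5.0, n)
p1 = a * x1**(a - 1) * x2**b   # the combined equation, here data-generating

# In logs: log(p1) = log(a) + (a-1)*log(x1) + b*log(x2).
# An unrestricted OLS fit returns a twice: once through the intercept
# and once through the slope on log(x1) -- an overidentifying restriction.
X = np.column_stack([np.ones(n), np.log(x1), np.log(x2)])
coef, *_ = np.linalg.lstsq(X, np.log(p1), rcond=None)
a_from_intercept = np.exp(coef[0])
a_from_slope = coef[1] + 1
print(a_from_intercept, a_from_slope)  # both recover a when the model is true
```

Testing whether the two estimates of $a$ agree is exactly an overidentification test of the restriction.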
The question is then: when is this process informative? Does "informative" mean producing independent (valid?) equations for the parameters?