I am trying to understand an example in my stats course notes; the example relates to calculating the best value of $x$ at which to run the next experiment.
The model is very simple:
$$\ln(Y_i) = \ln(\theta^*_1 + \theta^*_2x_i) + \epsilon_i$$
The example builds on a previous one, in which point estimates have already been obtained for $\theta^*_1$ and $\theta^*_2$. These are:
$$\theta = \begin{pmatrix} 3.9963\\ 2.3792 \end{pmatrix}$$
The values already given are:
$$\begin{array}{c|c|c}
i & y_i & x_i \\ \hline
1 & 4.11 & 0 \\
2 & 6.32 & 1 \\
3 & 8.21 & 2 \\
4 & 10.43 & 3 \\
5 & 14.29 & 4 \\
6 & 16.78 & 5
\end{array}$$
So the question is: how do we get the optimal value for $x_7$?
The equation that needs to be used is given, but it doesn't make a lot of sense to me, particularly the last line, where some seemingly random numbers appear out of nowhere.
Here are the notes provided:
Let the $p\times 1$ vector $$r_{n+1} = \frac{\partial f(x_{n+1}, \theta)}{\partial\theta}$$
$$ C_{n+1} = X_{n+1}'X_{n+1} = \begin{pmatrix}X_n\\ r_{n+1}' \end{pmatrix}' \begin{pmatrix}X_n\\ r_{n+1}' \end{pmatrix} $$
$$ \phi = |C_{n+1}| = \left|\begin{pmatrix}X_n\\ r_{n+1}' \end{pmatrix}' \begin{pmatrix}X_n\\ r_{n+1}' \end{pmatrix}\right| = |X_n'X_n+r_{n+1}r_{n+1}'| $$ $$ \therefore \phi = |C_n + r_{n+1}r_{n+1}'| = |C_n|(1 + r_{n+1}'C_n^{-1}r_{n+1}) $$
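(Aside: the last identity at least checks out numerically as a rank-one determinant update; here is a quick test I ran with made-up matrices, numpy assumed:)

```python
import numpy as np

# Sanity check of the rank-one determinant update used in the notes:
# |C_n + r r'| = |C_n| * (1 + r' C_n^{-1} r)
rng = np.random.default_rng(0)
A = rng.normal(size=(6, 2))
C_n = A.T @ A                    # plays the role of X_n' X_n (invertible 2x2)
r = rng.normal(size=2)           # plays the role of r_{n+1}

lhs = np.linalg.det(C_n + np.outer(r, r))
rhs = np.linalg.det(C_n) * (1 + r @ np.linalg.inv(C_n) @ r)
print(lhs, rhs)                  # the two values agree to rounding error
```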
Then comes the example: "At which value of $x$, $0 \leq x \leq 2$, should the next experiment be carried out?"
$$\ln(Y_i) = \ln(\theta^*_1 + \theta^*_2x_i) + \epsilon_i$$
$$\theta = \begin{pmatrix} 3.9963\\ 2.3792 \end{pmatrix}$$
$$ X'X = \begin{bmatrix}0.11777 & 0.11660\\ 0.11660 & 0.33500 \end{bmatrix} $$
$$ r_7' = \begin{bmatrix} \frac{1}{\theta_1 + \theta_2 x_7} & \frac{x_7}{\theta_1 + \theta_2 x_7}\end{bmatrix} $$
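(To check my reading of $r_7'$, I compared the analytic gradient against a finite-difference one; numpy assumed, and the value $x_7 = 1.5$ is arbitrary:)

```python
import numpy as np

theta = np.array([3.9963, 2.3792])   # point estimates from the previous example
x7 = 1.5                             # arbitrary trial value of x_7

def f(th, x):
    # mean function on the log scale: f(x, theta) = ln(theta_1 + theta_2 * x)
    return np.log(th[0] + th[1] * x)

# gradient as written in the notes: [1/(theta_1 + theta_2 x), x/(theta_1 + theta_2 x)]
r7_analytic = np.array([1.0, x7]) / (theta[0] + theta[1] * x7)

# central finite differences in each component of theta
eps = 1e-6
r7_numeric = np.array([
    (f(theta + np.array([eps, 0.0]), x7) - f(theta - np.array([eps, 0.0]), x7)) / (2 * eps),
    (f(theta + np.array([0.0, eps]), x7) - f(theta - np.array([0.0, eps]), x7)) / (2 * eps),
])
print(r7_analytic, r7_numeric)       # these agree to high precision
```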
Now, while I don't exactly understand all of this, I can work out what it's saying. What gets me is the next bit.
$$ \phi = \frac{12.9358 - 8.9781x_7 + 4.534x_7^2}{(3.9963 + 2.3792x_7)^2} $$
Where did 12.9358, 8.9781, 4.534 come from?
Also, where did the division come from? Above, $\phi$ was calculated without any division at all.
Finally, where did the $^2$ on the denominator come from?
Any help here would be greatly appreciated. I get the feeling it's going to be something very simple that I'm just missing.
Thanks
EDIT: After much deliberation, I have worked out the following:
$$ \phi = \frac{(X'X)^{-1}_{1,1} + \left((X'X)^{-1}_{1,2} + (X'X)^{-1}_{2,1}\right)x_7 + (X'X)^{-1}_{2,2}\,x_7^2}{(\theta_1 + \theta_2 x_7)^2} $$
It would be really fantastic if someone could confirm whether this holds for ALL nonlinear regressions, or only for this one, and whether it is something different for other equations.
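(For what it's worth, here is the quick numerical check I ran on this; numpy assumed, and the name `phi_var` and the grid over $[0, 2]$ are just my own scratch choices:)

```python
import numpy as np

theta = np.array([3.9963, 2.3792])
XtX = np.array([[0.11777, 0.11660],
                [0.11660, 0.33500]])
C_inv = np.linalg.inv(XtX)

# coefficients of the quadratic numerator in my formula above
a = C_inv[0, 0]                  # constant term
b = C_inv[0, 1] + C_inv[1, 0]    # coefficient of x_7
c = C_inv[1, 1]                  # coefficient of x_7^2
print(a, b, c)                   # roughly 12.96, -9.02, 4.55 -- close to the notes'
                                 # 12.9358, -8.9781, 4.534; the small gaps presumably
                                 # come from the rounded X'X I copied above

def phi_var(x7):
    # r_7' C^{-1} r_7: the only part of phi that varies with x_7
    r7 = np.array([1.0, x7]) / (theta[0] + theta[1] * x7)
    return r7 @ C_inv @ r7

xs = np.linspace(0, 2, 201)
print(xs[np.argmax([phi_var(x) for x in xs])])   # where on [0, 2] this is largest
```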