I am studying a research paper on iterative methods for computing generalized inverses of an arbitrary matrix $A$. The paper considers the following iterative method:
$$Y_{k+1} = Y_{k} + Y_{k}(I - AY_{k}),$$ given an initial approximation $Y_{0}$. I want to measure the computational time this method takes over 15 iterations. I wrote MATLAB code and used tic and toc for this, but every time I run the program it reports a different elapsed time.
Why is the computational time different on different runs?
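To get a more stable number, I was thinking of repeating the measurement many times and looking at the minimum and the mean, along these lines (just a sketch; it reuses A, Y0 and I from my code below, and nRuns = 100 is an arbitrary choice of mine). Would that be the right approach?

nRuns = 100;                 % number of timing repetitions (arbitrary choice)
t = zeros(nRuns, 1);
for r = 1:nRuns
    Y = Y0;                  % restart from the same initial approximation
    tic
    for n = 1:15
        Y = Y + Y*(I - A*Y); % same update as in the code below
    end
    t(r) = toc;              % store this run's elapsed time
end
fprintf('min %.3g s, mean %.3g s over %d runs\n', min(t), mean(t), nRuns)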
Here is my code:
A = [1 4 0
     2 3 0
     2 0 1
     0 0 0];                   % given 4x3 matrix
Y0 = [0.0101 0.0202 0.0202 0
      0.0404 0.0303 0      0
      0      0      0.0101 0]; % initial 3x4 approximation (equal to 0.0101*A')
I = eye(4);
tic
for n = 1:15
    % next term of the sequence: Y_{k+1} = Y_k + Y_k*(I - A*Y_k)
    Y1 = Y0 + Y0*(I - A*Y0);   % semicolon suppresses display, which would otherwise dominate the timing
    Y0 = Y1;
end
toc
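As a sanity check (my own addition, not something from the paper): as far as I understand, with this $Y_{0}$ the iterates should approach the Moore–Penrose inverse of $A$, so after the loop one could compare the final iterate against MATLAB's pinv:

% after the loop Y0 holds the 15th iterate; it should be close to pinv(A)
err = norm(Y0 - pinv(A));
fprintf('||Y_15 - pinv(A)|| = %.3g\n', err)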