First, the algorithm you cite is not from Rosenblatt (1958). Second, the cited algorithm is, at best, unclear, if not incorrect. What is "s.t." supposed to mean? "Subject to", as in constrained optimisation? What happens if there is more than one $i$ for which the if condition is satisfied? Does the algorithm ever stop?
I'll try to answer your question under the following assumptions:
- that we are actually talking about the following algorithm:

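Something like this sketch of the standard perceptron learning rule (in Python for concreteness; this is my reconstruction, and the comments mark the algorithm's line numbers referred to below):

```python
import numpy as np

def perceptron(X, y, max_passes=1000):
    """Standard perceptron; the labels y are +1 or -1."""
    w = np.zeros(X.shape[1])              # line 1: w <- 0
    for _ in range(max_passes):           # line 2: while not converged
        converged = True                  # line 3
        for i in range(len(y)):           # line 4: for every sample i
            if y[i] * (w @ X[i]) <= 0:    # line 5: is x_i misclassified?
                w = w + y[i] * X[i]       # line 6: update w
                converged = False         # line 7: keep looping
        if converged:
            break
    return w
```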
and
- that you want to know what happens if the if condition (line 5) is removed, so that lines 6 and 7 are executed unconditionally:

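That is, the loop body degenerates to (same sketch, condition removed):

```python
import numpy as np

def perceptron_unconditional(X, y, n_passes):
    """Same loop, but lines 6 and 7 now run for every i (line 5 removed)."""
    w = np.zeros(X.shape[1])
    for _ in range(n_passes):             # never converges: line 7 would
        for i in range(len(y)):           # always reset the flag
            w = w + y[i] * X[i]           # former line 6, now unconditional
    return w
```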
In that case, the algorithm obviously never converges. But since exactly the same thing happens in each pass through the while loop, $\vec{w}$ changes by the same amount in every pass. Thus, assuming $\vec{w}$ is initialised to $\vec{0}$, after pass $t$ we have $\vec{w}(t) = t \cdot \vec{w}(1)$.
Note:
- The direction of $\vec{w}$ never changes after the first pass, only its length. Consequently, the number of wrong predictions, plotted against the number of passes, will be flat (i.e. constant) after the first pass.
- In each pass, you add to $\vec{w}$ all elements $\vec{x}_i : y_i = +1$ and subtract all $\vec{x}_i : y_i = -1$. In other words:
$$
\vec{w}(t+1) = \vec{w}(t) + \sum_{i : y_i = +1} \vec{x}_i - \sum_{i : y_i = -1} \vec{x}_i
$$
Now, the first sum is just the mean of all $\vec{x}_i : y_i = +1$, multiplied by the number of such points; the same, but with $y_i = -1$, holds for the second sum. Consequently, $\vec{w}$ points in the direction of the weighted difference between the means of the two classes.
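With $n_+$ and $n_-$ denoting the class sizes and $\vec{\mu}_+$, $\vec{\mu}_-$ the class means, the per-pass change reads:
$$
\vec{w}(t+1) - \vec{w}(t) = n_+ \vec{\mu}_+ - n_- \vec{\mu}_-
$$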
The figure below shows two classes, the line (blue) connecting their means and the direction of $\vec{w}$ (dashed, orange):

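If you want to verify this numerically, here is a small self-contained check (synthetic Gaussian classes; the sizes, means and numbers of passes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic Gaussian classes (sizes and means chosen arbitrarily)
X_pos = rng.normal(loc=[2.0, 2.0], size=(60, 2))
X_neg = rng.normal(loc=[-1.0, -1.0], size=(40, 2))
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(60), -np.ones(40)])

# One pass of the unconditional update changes w by sum_i y_i * x_i ...
delta = (y[:, None] * X).sum(axis=0)

# ... which equals the weighted difference of the class means
weighted_mean_diff = 60 * X_pos.mean(axis=0) - 40 * X_neg.mean(axis=0)
print(np.allclose(delta, weighted_mean_diff))    # True

# w(t) = t * delta, so the error count is the same for every pass t >= 1
for t in (1, 2, 10):
    errors = np.sum(np.sign(X @ (t * delta)) != y)
    print(t, errors)                             # constant in t
```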