The parameters of the simple linear regression model $B = \alpha + \beta A + \varepsilon$ can be obtained from the standard deviations of the two variables $s_A$ and $s_B$, the correlation coefficient $r_{AB}$, and the means $\bar{A}$ and $\bar{B}$:
$$\begin{align}
\hat\beta &= r_{AB} \frac{s_B}{s_A} \\
\hat\alpha &= \bar{B} - (\hat\beta \bar{A})
\end{align}$$
In such a case, you can use the fitted model to make predictions:
$$
\hat B = \hat\alpha + \hat\beta A
$$
If you don't have this additional information, the correlation coefficient alone won't let you make predictions: it is unitless, so a high absolute correlation tells you that $B$ varies strongly with $A$, but not by how much in the units of $B$ per unit of $A$. That is what the standard deviations supply. The means are needed to pin down the intercept; without them, your predictions could be off by a constant offset $\hat\alpha$.
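A minimal sketch of the above in NumPy, using made-up example data: it computes $\hat\beta$ and $\hat\alpha$ purely from the summary statistics and cross-checks them against an ordinary least-squares fit (the two are mathematically identical for simple linear regression).

```python
import numpy as np

# Hypothetical example data: B depends linearly on A plus noise
rng = np.random.default_rng(0)
A = rng.normal(10.0, 2.0, size=200)
B = 3.0 + 0.5 * A + rng.normal(0.0, 1.0, size=200)

# Summary statistics: standard deviations, correlation, means
s_A, s_B = A.std(ddof=1), B.std(ddof=1)
r_AB = np.corrcoef(A, B)[0, 1]
A_bar, B_bar = A.mean(), B.mean()

# Coefficients from the summary statistics alone
beta_hat = r_AB * s_B / s_A
alpha_hat = B_bar - beta_hat * A_bar

# Cross-check against a degree-1 least-squares fit on the raw data
beta_ols, alpha_ols = np.polyfit(A, B, deg=1)
assert np.isclose(beta_hat, beta_ols)
assert np.isclose(alpha_hat, alpha_ols)

# Prediction for a new value of A
B_pred = alpha_hat + beta_hat * 12.0
```

Note that the `ddof` choice cancels in the ratio $s_B / s_A$, so the slope does not depend on whether you use the sample or population standard deviation, as long as both use the same convention.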
Notice that the above assumes the relationship between $A$ and $B$ is linear. If it is not, and cannot be reasonably approximated by a linear function, this approach will not work. It also assumes the errors are independent; with time-series data, for example, that assumption is often violated and the results may be off.