Let $x^{\ast}$ be the solution of minimizing $f(x)+\lambda R(x)$, where $R(x)$ is the regularization term and $\lambda > 0$.
Now consider the constrained problem of minimizing $f(x)$ subject to $R(x)\le k$ with $k > 0$, and set $k=k^{\ast}:=R(x^{\ast})$. Let $x^{\prime}$ be a solution of this constrained problem. Since $x^{\ast}$ satisfies $R(x^{\ast})\le k^{\ast}$, it is feasible for the constrained problem, so $f(x^{\prime})\le f(x^{\ast})$. Feasibility of $x^{\prime}$ gives $R(x^{\prime})\le k^{\ast}=R(x^{\ast})$, or equivalently $\lambda R(x^{\prime})\le \lambda R(x^{\ast})$.
But if either of the two inequalities $f(x^{\prime})\le f(x^{\ast})$ and $\lambda R(x^{\prime}) \le \lambda R(x^{\ast})$ were strict, then adding them would give $f(x^{\prime})+\lambda R(x^{\prime}) < f(x^{\ast})+\lambda R(x^{\ast})$, contradicting the assumption that $x^{\ast}$ minimizes $f(x)+\lambda R(x)$. Thus both hold with equality, in particular $f(x^{\prime}) = f(x^{\ast})$, so $x^{\ast}$ is also a solution of the constrained problem, and $x^{\prime}$ is also a solution of the penalized problem.
This shows that solving a minimization problem of the first (penalized) kind also solves a particular minimization problem of the second (constrained) kind, and vice versa.
The relationship is $k=R(x^{\ast})$.
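The argument can be checked numerically on a hypothetical 1-D example (my choice of $f$, $R$, and $\lambda$, not from the text above): take $f(x)=(x-2)^2$, $R(x)=x^2$, and $\lambda=1$. The penalized minimizer is $x^{\ast}=1$, so $k^{\ast}=R(x^{\ast})=1$, and the constrained problem with that $k$ recovers the same point. A minimal grid-search sketch:

```python
import numpy as np

# Hypothetical example: f(x) = (x-2)^2, R(x) = x^2, lambda = 1.
f = lambda x: (x - 2.0) ** 2
R = lambda x: x ** 2
lam = 1.0

# Fine grid over a range that contains both minimizers.
xs = np.linspace(-3.0, 3.0, 600001)

# Penalized problem: minimize f(x) + lam * R(x).
# Analytically: 2(x-2) + 2x = 0  =>  x* = 1.
x_star = xs[np.argmin(f(xs) + lam * R(xs))]

# Constrained problem with k = R(x*): minimize f(x) subject to R(x) <= k.
k = R(x_star)
feasible = xs[R(xs) <= k]
x_prime = feasible[np.argmin(f(feasible))]

# Both problems yield (numerically) the same minimizer and the same f-value.
print(x_star, x_prime, f(x_star), f(x_prime))
```

On this grid, `x_star` and `x_prime` agree to within the grid spacing, illustrating that with $k=R(x^{\ast})$ the two formulations pick out the same solution.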