Suppose we want to minimize a convex functional $J(u)$, where $u$ lives in a Banach space $V$. The classical Newton method
$$\mathrm d^2J(u_n)(u_{n + 1} - u_n) = -\mathrm dJ(u_n)$$
can be viewed as an explicit Euler discretization of the dynamical system
$$\tau\;\mathrm d^2J(u)\frac{\mathrm du}{\mathrm dt} = -\mathrm dJ(u),$$
with a constant timestep $\tau$. You can think of a primitive damped Newton method as taking a timestep that is some fraction of $\tau$, and of a globally convergent Newton line-search method as using a variable timestep tuned so that the objective decreases at every iteration. Several papers have remarked on this (see below).
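To make the correspondence concrete, here is a minimal sketch (the objective $J(u) = u^4/4 + u^2/2 + u_0 u_1$-free 1D toy is my own choice, not from any of the papers): one explicit Euler step of the Newton flow with $\Delta t = \tau$ reproduces the classical Newton update, and a fractional timestep reproduces damped Newton.

```python
# Toy 1D convex objective J(u) = u**4/4 + u**2/2 (chosen for illustration only)
dJ  = lambda u: u**3 + u        # first derivative (gradient)
d2J = lambda u: 3*u**2 + 1      # second derivative (Hessian)

def newton_step(u):
    # Classical Newton update: d2J(u) * (u_next - u) = -dJ(u)
    return u - dJ(u) / d2J(u)

def euler_step(u, dt):
    # Explicit Euler step of the Newton flow du/dt = -dJ(u) / d2J(u)
    return u + dt * (-dJ(u) / d2J(u))

u0 = 2.0
print(newton_step(u0))           # classical Newton step
print(euler_step(u0, dt=1.0))    # identical to the Newton step
print(euler_step(u0, dt=0.5))    # a "damped" Newton step (fractional timestep)
```

The point is only that the two update rules coincide term by term; nothing here depends on the particular objective.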
Are there improved, practical solution methods based on viewing Newton's method as a dynamical system? For example, what happens if, instead of doing a line search at every iteration, you were to integrate the flow with a classical adaptive Runge–Kutta method with stepsize control, such as an embedded RK1(2) pair or similar? More generally, is there some respect in which this dynamical-systems viewpoint is practically relevant, or is it just a mathematical curiosity?
References: