Is there any technique that allows applying a lag of less than one sample to a time series?
There sure is:
Highly recommended reading on the topic: https://ieeexplore.ieee.org/document/482137. Code can be found at http://legacy.spa.aalto.fi/software/fdtools/.
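If you just want something quick to experiment with, here is a minimal sketch of the classic windowed-sinc fractional-delay FIR approach in Python. The function name, tap count, and Blackman window are my own choices, not taken from the fdtools package:

```python
import numpy as np

def fractional_delay(x, delay, num_taps=41):
    """Delay x by a (possibly fractional) number of samples using a
    windowed-sinc fractional-delay FIR filter."""
    # Tap indices centered on the filter midpoint
    n = np.arange(num_taps) - (num_taps - 1) / 2
    # Ideal fractional-delay impulse response: a shifted sinc
    h = np.sinc(n - delay)
    # Taper with a Blackman window to reduce ripple
    h *= np.blackman(num_taps)
    # Normalize for unity DC gain
    h /= h.sum()
    # 'same' keeps the original length and removes the filter's integer
    # bulk delay of (num_taps - 1) // 2 samples, leaving only `delay`
    return np.convolve(x, h, mode='same')

# Example: delay a 5 Hz sine sampled at 100 Hz by 0.3 samples
fs = 100.0
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 5 * t)
y = fractional_delay(x, 0.3)
```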
> Is there any technique that allows estimating a lag smaller than one time step?
If the lag is constant, you can simply up-sample a "large enough" chunk of both sequences (using the SAME up-sampling method for both) and then run a cross-correlation.
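A minimal sketch of that idea in Python/SciPy follows; the up-sampling factor, the use of `resample_poly`, and the helper name `estimate_subsample_lag` are my own choices:

```python
import numpy as np
from scipy.signal import resample_poly, correlate

def estimate_subsample_lag(x, y, up=16):
    """Estimate the lag of y relative to x with sub-sample resolution
    by up-sampling both sequences and cross-correlating."""
    # Up-sample both chunks with the SAME method and factor
    xu = resample_poly(x, up, 1)
    yu = resample_poly(y, up, 1)
    # Full cross-correlation of the up-sampled chunks
    c = correlate(yu, xu, mode='full')
    # Peak position relative to zero lag, converted back to original samples
    peak = np.argmax(c) - (len(xu) - 1)
    return peak / up

# Example: y is x delayed by 2.25 samples
n = np.arange(400)
x = np.sin(2 * np.pi * 0.05 * n)
y = np.sin(2 * np.pi * 0.05 * (n - 2.25))
print(estimate_subsample_lag(x, y, up=16))  # expect roughly 2.25
```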
An alternative would be to estimate the lag in the frequency domain. Take a chunk of each signal, window it, apply the FFT, and divide. Then fit a straight line to the phase of the ratio over the "good" (well-excited) frequencies. If the fit is good, you can use the slope as the delay. Basically you are assuming that
$$Y(\omega) \approx X(\omega) e^{-j\omega \tau}$$
where $\tau$ is your lag.
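A rough sketch of that phase-slope fit in Python; the magnitude threshold used to pick the "good" frequencies and the function name are assumptions of mine:

```python
import numpy as np

def phase_slope_delay(x, y, fs=1.0, mag_floor=0.05):
    """Estimate the delay of y relative to x from the slope of the
    phase of Y(w)/X(w), fitted over well-excited frequencies."""
    n = len(x)
    w = np.hanning(n)                        # window both chunks identically
    X = np.fft.rfft(x * w)
    Y = np.fft.rfft(y * w)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Keep only bins where the input has significant energy ("good" frequencies)
    good = np.abs(X) > mag_floor * np.abs(X).max()
    phase = np.unwrap(np.angle(Y[good] / X[good]))
    omega = 2 * np.pi * freqs[good]
    # Least-squares line: phase ~= -omega * tau, so tau = -slope
    slope, intercept = np.polyfit(omega, phase, 1)
    return -slope                            # seconds (samples if fs = 1)

# Example: y is x delayed by 0.4 samples, built from a few sinusoids
n = np.arange(1024)
x = sum(np.sin(2 * np.pi * f * n) for f in (0.02, 0.07, 0.13))
y = sum(np.sin(2 * np.pi * f * (n - 0.4)) for f in (0.02, 0.07, 0.13))
print(phase_slope_delay(x, y))  # expect roughly 0.4
```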
If the lag is drifting with time, things are more complicated.