
I have two time series with a time step of 1. I suspect there is a small lag between them, and I want to bridge that gap. The lag is smaller than 1, so the ACF doesn't help.

Is there any technique to estimate a lag smaller than one time step? Is there any technique to apply a lag smaller than one time step to a time series?

Lucas Morin

1 Answer


Is there any technique to apply a lag smaller than one time step to a time series?

There sure is. Highly recommended reading on the topic: https://ieeexplore.ieee.org/document/482137. Code can be found at http://legacy.spa.aalto.fi/software/fdtools/
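As a concrete illustration, one standard design from that literature is a windowed-sinc fractional-delay FIR filter. A minimal NumPy sketch (this is my own illustration, not the fdtools code; the tap count and the Hamming window are arbitrary choices):

```python
import numpy as np

def fractional_delay(x, tau, num_taps=81):
    """Delay x by tau samples (tau may be fractional) with a
    windowed-sinc fractional-delay FIR filter."""
    # Ideal fractional-delay impulse response, centered on the filter
    n = np.arange(num_taps) - (num_taps - 1) / 2
    h = np.sinc(n - tau)
    # Truncate the infinite sinc with a Hamming window
    h *= np.hamming(num_taps)
    # Normalize so the DC gain is exactly 1
    h /= h.sum()
    # "same" mode absorbs the filter's integer group delay,
    # leaving only the fractional shift tau (edges are distorted)
    return np.convolve(x, h, mode="same")
```

Away from the edges, the output closely matches the analytically delayed signal for frequencies well below Nyquist; accuracy degrades near Nyquist, which is the trade-off the linked paper discusses.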

Is there any technique to estimate a lag smaller than one time step?

If the lag is constant you can just up-sample a "large enough" chunk of both sequences (using the SAME up-sampling method for both) and then run a cross-correlation: the peak location, divided by the up-sampling factor, gives the lag in original samples.
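A minimal sketch of that idea in NumPy, using FFT zero-padding for the up-sampling (band-limited interpolation, similar in spirit to `scipy.signal.resample`). The factor of 16 is an arbitrary choice that sets the lag resolution to 1/16 of a sample:

```python
import numpy as np

def upsample_fft(x, factor):
    """Up-sample by zero-padding the spectrum (band-limited interpolation)."""
    X = np.fft.rfft(x)
    m = len(x) * factor
    Xpad = np.zeros(m // 2 + 1, dtype=complex)
    Xpad[: len(X)] = X
    return np.fft.irfft(Xpad, n=m) * factor

def estimate_lag_upsampled(x, y, factor=16):
    """Estimate the lag of y relative to x to 1/factor of a sample:
    up-sample both with the SAME method, cross-correlate, locate the peak."""
    xu = upsample_fft(x, factor)
    yu = upsample_fft(y, factor)
    # Remove means so the peak reflects shape, not offset
    c = np.correlate(yu - yu.mean(), xu - xu.mean(), mode="full")
    # In "full" mode, zero lag sits at index len(xu) - 1
    lag = np.argmax(c) - (len(xu) - 1)
    return lag / factor
```

Note that both chunks go through the identical interpolation, so any interpolation bias cancels in the relative lag; a pure sinusoid is a poor test signal here because its correlation is periodic, so broadband content works best.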

An alternative would be to estimate the lag in the frequency domain. Take a chunk of each signal, window it, apply the FFT, and divide one spectrum by the other. Do a linear approximation of the phase function using "good" frequencies (those where both signals have enough energy). If the fit is good, you can use the slope as the delay. Basically you are assuming that $$Y(\omega) \approx X(\omega) e^{-j\omega \tau}$$

where $\tau$ is your lag.
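A sketch of that estimator in NumPy. Instead of dividing the spectra directly, it takes the phase of the cross-spectrum $Y(\omega)\overline{X(\omega)}$ (same phase as $Y/X$, but better behaved near spectral nulls), and the `coherence_floor` threshold is my ad-hoc stand-in for selecting the "good" frequencies:

```python
import numpy as np

def estimate_lag_phase_slope(x, y, coherence_floor=0.1):
    """Estimate a sub-sample lag from the slope of the cross-spectrum
    phase, assuming Y(w) ~ X(w) * exp(-j*w*tau)."""
    n = len(x)
    win = np.hanning(n)
    X = np.fft.rfft(x * win)
    Y = np.fft.rfft(y * win)
    # Phase of the cross-spectrum is approximately -omega * tau
    cross = Y * np.conj(X)
    omega = 2 * np.pi * np.arange(len(X)) / n
    # Keep only "good" frequencies: bins with significant energy
    # (ad-hoc amplitude threshold, not a true coherence estimate)
    good = np.abs(X) > coherence_floor * np.abs(X).max()
    phase = np.unwrap(np.angle(cross[good]))
    # Least-squares line through the origin: phase ~ -omega * tau
    return -np.sum(omega[good] * phase) / np.sum(omega[good] ** 2)
```

Unlike the up-sampling approach, this gives a continuous-valued lag estimate rather than one quantized to the up-sampled grid, at the cost of needing the phase fit to actually be linear over the selected bins.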

If the lag is drifting with time, things are more complicated.

Hilmar
  • "If the lag is constant..." and if the contribution of any higher-frequency signals that have been aliased into baseband is either minimal, or uncorrelated with the sampling, then yes, there would be sufficient information there. Such correlated aliases could come from something as simple as the signal passing through a memoryless nonlinearity before being sampled. The smaller the lag being measured, the less tolerance there would be for correlated aliases (and the longer the observation interval needed for uncorrelated aliases). – TimWescott Oct 23 '21 at 15:18