
I want to present time series data from an R analysis using flexdashboard. At the moment, the dataset is unnecessarily large. The obvious solution is to downsample the dataset, which seems like it should be fundamental. But apparently not? Surely there is a function in the standard packages that does this. Do data scientists have some weird name for down-sampling or decimation? To be crystal clear, I'm talking about downsampling in the sense of signal processing. It is not a random sample. It is an explicit change in the sample rate of the dataset, achieved by discarding all but every nth element to give a downsample rate of n.
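Discarding all but every nth element is just strided indexing in base R. A minimal sketch of what I mean (the function name `decimate_keep_nth` is my own, not from any package):

```r
# Decimation in the signal-processing sense: keep every nth sample,
# starting from the first. Works on any vector; for a data frame you
# would index the rows instead.
decimate_keep_nth <- function(x, n) {
  x[seq(1, length(x), by = n)]
}

x <- 1:20
decimate_keep_nth(x, 5)
# [1]  1  6 11 16
```

This changes the effective sample rate by a factor of n, but note it does no anti-alias filtering first, unlike a proper signal-processing decimator.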

The only function I did find (`downsampling`) seems to do something entirely different. It's about balancing the class proportions of a mixed dataset.

??downsampling
??decimation
??downsample

All give me nothing. How can I achieve this? I assume this is not a commonly used function in R, which surprises me.

To whoever closed the question saying it has already been answered: downsampling is NOT a random operation, so the answer you linked is not correct. Would you mind removing your input?

Edit

While I was typing the edit, someone added another link to an existing answer, which is actually correct. But neither that question nor its answer makes any reference to downsampling, so I'm impressed they found it.

monkey
    If the `downsampling` you meant is in the signal processing domain, then maybe you can try `v[seq_along(v)%%10 == 1]` or `v[seq(1, length(v), by = 10)]` – ThomasIsCoding Sep 04 '21 at 07:54

0 Answers