I have a sensor producing (more or less) bandlimited data with a cut-off of about 45Hz, with a roll-off and AWGN. I have an ADC that samples said signal at 800Hz, with a single-pole anti-aliasing filter at about 200Hz. The problem is, I only have enough communication bandwidth to send samples at 100Hz and therefore some decimation is necessary.
Currently, I simply have an 8-sample moving average filter and send every 8th sample. This feels dirty and suboptimal. Surely there must be a better way.
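To be concrete, what I have now is equivalent to block-averaging each group of 8 samples (an 8-tap moving average whose output is kept only every 8th sample). A minimal sketch of that, assuming 16-bit samples and a hypothetical function name:

```c
#include <stdint.h>
#include <stddef.h>

#define DECIM 8  /* 800 Hz in, 100 Hz out */

/* Boxcar-average each non-overlapping block of DECIM samples and emit one
   output per block -- identical to an 8-tap moving average followed by
   keeping every 8th result. Returns the number of output samples. */
size_t decimate_boxcar(const int16_t *in, size_t n_in, int16_t *out)
{
    size_t n_out = 0;
    for (size_t i = 0; i + DECIM <= n_in; i += DECIM) {
        int32_t acc = 0;                  /* 32-bit accumulator, no overflow */
        for (size_t k = 0; k < DECIM; ++k)
            acc += in[i + k];
        out[n_out++] = (int16_t)(acc / DECIM);
    }
    return n_out;
}
```

The worry is that the moving average's sinc-shaped frequency response rolls off well below 50Hz and has poor stopband attenuation around the aliasing frequencies.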
Is there an accepted "best" thing to do in this instance? Should I, for example, use a low-pass FIR filter to squeeze out as close to 50Hz of signal bandwidth as possible? Or is there some sort of optimal estimation scheme that will do better?
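By the FIR option I mean something like the following sketch: a decimating FIR where the convolution is evaluated only at the decimated instants, so the cost is one full set of MACs per 100Hz output rather than per 800Hz input. The tap count, the Q15 format, and the function name are illustrative assumptions; the actual coefficients would come from a filter design tool:

```c
#include <stdint.h>
#include <stddef.h>

#define DECIM 8  /* 800 Hz in, 100 Hz out */

/* FIR decimator: evaluate the convolution only at every DECIM-th input
   sample, so the per-channel cost is ntaps MACs at the 100 Hz output rate.
   Taps are assumed to be in Q15 format (32768 == 1.0), produced offline by
   a design tool. Returns the number of output samples. */
size_t fir_decimate(const int16_t *in, size_t n_in,
                    const int16_t *taps, size_t ntaps,
                    int16_t *out)
{
    size_t n_out = 0;
    for (size_t i = ntaps - 1; i < n_in; i += DECIM) {
        int32_t acc = 0;
        for (size_t k = 0; k < ntaps; ++k)
            acc += (int32_t)taps[k] * in[i - k];
        out[n_out++] = (int16_t)(acc >> 15);  /* back to Q15 sample scale */
    }
    return n_out;
}
```

With, say, a 32-tap equiripple design that would be 9 × 100 × 32 ≈ 29k MACs per second across all channels, which seems cheap even for an M4, but I'd like to know if that's the right direction before committing.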
The aim is to implement this for nine channels on a smallish microcontroller (an ARM Cortex-M4, for example), so the computationally cheaper the better!