Decimation / Downsampling

We took some accelerometer data at 75 Hz with a different device than the one we will deploy, which samples at 50 Hz or slower. So I think the data should be decimated and the model retrained. But I read here that decimation is only available in the Enterprise plan. That is a shame, because it seems like a very basic feature: anyone would want to decimate their data and A/B test it to see whether ML would still be effective at lower sample rates, without collecting the data all over again.
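As a workaround until the feature lands, the resampling can be done offline before uploading. Here is a minimal sketch (assuming NumPy/SciPy and a synthetic signal standing in for the real accelerometer data): since 75 → 50 Hz is a rational ratio of 2/3, `scipy.signal.resample_poly` can upsample by 2 and downsample by 3 with anti-alias filtering applied for you.

```python
import numpy as np
from scipy.signal import resample_poly

fs_in, fs_out = 75, 50                      # source and target sample rates (Hz)
t = np.arange(0, 2, 1 / fs_in)              # 2 s of timestamps at 75 Hz
x = np.sin(2 * np.pi * 5 * t)               # hypothetical 5 Hz test tone

# Polyphase resampling: upsample by 50, downsample by 75 (reduces to 2/3),
# with a built-in low-pass filter to prevent aliasing.
y = resample_poly(x, up=fs_out, down=fs_in)

print(len(x), len(y))                       # 150 input samples -> 100 output samples
```

Plain `scipy.signal.decimate` only supports integer factors, so it would not handle 75 → 50 Hz directly; `resample_poly` covers any rational ratio. Just make sure the frequencies of interest stay below the new Nyquist limit (25 Hz here).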

Hi @Panometric,

Thank you for the feature request! As it happens, our engineering team is working on a downsampling block for time-series data in the Studio as we speak. I will update this post when the feature is pushed to production.



This is now released, @Panometric: