Problems with exporting accelerometer data

Continuing the discussion from Export your datasets:

Hi,
I need to export a dataset from Edge Impulse with data obtained from an accelerometer.
When I dump this information and export it in JSON format, I see that the “payload” records do not all have the same length. In other words, the dataset I need to rebuild contains inconsistent information.
Let me explain: I collect data samples over a period of 2 seconds in Edge Impulse. When I export these samples, the JSON files differ in size, so at first glance their contents do not seem consistent.
What could this be due to?
Thanks in advance.

Jordi.

Hi @jpozoc,

Could you provide a project ID that we can look at so we can try to replicate the problem?

Hi Shawn

This is the ID of my project: 108222
The project collects gestures through an IMU sensor connected to a Raspberry Pi Pico.
Most of the samples (in JSON format) have a payload with 130 elements, but some have 131 elements. Is this normal?
I need to transform it into a dataset to use in my own Python code to test various classifier models.
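
For reference, this is roughly how I am counting the elements (just a quick sketch; the folder name is a placeholder, and it assumes each exported file follows the data acquisition JSON format with a `payload.values` array of readings):

```python
import json
from pathlib import Path

# Folder name is a placeholder; point it at the exported JSON files.
for path in sorted(Path("export").glob("*.json")):
    sample = json.loads(path.read_text())
    # Assumes the data acquisition JSON format: payload.values holds the readings.
    print(path.name, len(sample["payload"]["values"]))
```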

Thanks in advance.

Jordi.

Hi @jpozoc,

How are you collecting the dataset? If you are using the Edge Impulse CLI or Data Forwarder, then there is a chance that slight variations in the collection length may happen (as the collection is controlled remotely from the Studio). In the Studio, the “Window size” takes care of this by truncating any samples that are too long.

So, if you download your original data, some variation is expected. If you want all of your samples to be the same size, you might need to write a quick program to curate the data (i.e. make all samples the same length), e.g. in Python or on Google Colab, as in the sketch below.
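
For example, a minimal sketch (the folder name is a placeholder, and it assumes each exported JSON file has a `payload.values` array of readings) that truncates every sample to the shortest length and stacks them into a NumPy array:

```python
import json
from pathlib import Path

import numpy as np

# Folder name is a placeholder; point it at your exported JSON files.
files = sorted(Path("export").glob("*.json"))
samples = [json.loads(p.read_text())["payload"]["values"] for p in files]

# Truncate every sample to the shortest length so they all match,
# then stack them into a single array for use in your own Python code.
min_len = min(len(values) for values in samples)
data = np.array([values[:min_len] for values in samples])
print(data.shape)  # (num_samples, min_len, num_axes)
```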


Hi again Shawn,

Thanks for the reply. I collect the data using the Data Forwarder. I'll take your suggestion and write a Python routine.

Another question, by the way: in the feature extraction screen (spectral features), is there a way to download the processed features of all samples in bulk?

Thanks again!

Jordi

Hi @jpozoc,

If you go to the dashboard of your project, you can download “Raw data training data” and “Raw data training labels” in NumPy format. These contain the processed features (output of the processing/DSP block) of your training data. There’s also an option to get the processed features of the test set.
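
Once downloaded, they can be loaded directly in Python, for example (the filenames below are placeholders; use whatever names the dashboard download gives you):

```python
import numpy as np

# Filenames are placeholders; use the names from the dashboard download.
X = np.load("X_train_features.npy")  # processed features, one row per window
Y = np.load("y_train.npy")           # corresponding labels

print(X.shape, Y.shape)
```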