Question/Issue: My data is in NumPy array format with dimensions (960000, 1024, 2). I am using the code below to upload the data to Edge Impulse:
```python
assert len(response.fails) == 0, "Could not upload some files"

# Save the sample IDs, as we will need these to retrieve file information and delete samples
ids = []
for sample in response.successes:
    ids.append(sample.sample_id)
```
lbl_train and snr_train have dimensions (960000, 1). I am not sure whether the way I included snr_train in the metadata is correct; it holds the SNR value of each signal in X_train.
While running the code, I get the error: `TypeError: Object of type ndarray is not JSON serializable`
You're seeing that error because the metadata cannot be expanded to all samples. It must be a single dictionary that applies to all samples being uploaded in that one call (see: data package - Python SDK 1.0.9). Your labels should also be a list of strings (rather than a NumPy array).
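For reference, a minimal sketch of that pattern, assuming `upload_numpy()` from the data package; the exact module path (`ei.data` vs. `ei.experimental.data`) and parameter names may vary slightly by SDK version, and the metadata value shown is just a placeholder:

```python
import edgeimpulse as ei
import numpy as np

ei.API_KEY = "ei_..."  # your Edge Impulse API key

# X_train: (N, 1024, 2) NumPy array, lbl_train: (N, 1) NumPy array of labels
labels = [str(lbl) for lbl in lbl_train.ravel()]  # labels as a list of strings

response = ei.data.upload_numpy(
    sample_rate_ms=1,                   # interval between readings, in ms
    data=X_train,
    labels=labels,
    category="training",
    metadata={"source": "my_dataset"},  # one dict applied to every sample in this call
)
assert len(response.fails) == 0, "Could not upload some files"
```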
Thanks, @shawn_edgeimpulse. I converted the labels to a list of strings and it works. However, when I include the metadata, it gives an error: "Connection pool is full, discarding connection." The metadata is defined as `metadata = {'snr': snrs}`, where `snrs` is a list of integers with the same number of elements as the labels and the data. Am I doing anything wrong?
Also, while computing features, I am getting the following error: "Created window size should be under 4096 MiB but was 122071 MiB. Set a larger window increase to reduce the number of windows."
How do I set a larger window increase? I couldn't locate the option in Edge Impulse.
The metadata can't be a list; it must be a single label/value for the entire set you are uploading. If you want the SNR metadata to be different for each sample, then you'll need to construct the `Sample` objects yourself and upload them using `upload_samples()`. With that, you can have different metadata labels/values per sample.
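A rough sketch of that per-sample approach, assuming the `Sample` class and `upload_samples()` from the data package; the module path, field names, and the CSV serialization below are assumptions that may need adjusting for your SDK version and project settings:

```python
import io
import edgeimpulse as ei
import numpy as np

ei.API_KEY = "ei_..."  # your Edge Impulse API key

def make_sample(signal, label, snr, idx):
    # Serialize one (1024, 2) signal to an in-memory CSV file-like object;
    # any ingestion-supported format (CSV/JSON/CBOR) would work the same way.
    buf = io.BytesIO()
    np.savetxt(buf, signal, delimiter=",")
    buf.seek(0)
    return ei.data.Sample(
        filename=f"signal_{idx:06d}.csv",
        data=buf,
        label=str(label),
        category="training",
        metadata={"snr": str(snr)},  # per-sample metadata, different for each Sample
    )

flat_labels = lbl_train.ravel()

# Build and upload in batches rather than creating all 960k Sample objects
# at once; smaller batches also avoid flooding the HTTP connection pool.
batch_size = 256
for start in range(0, len(X_train), batch_size):
    end = min(start + batch_size, len(X_train))
    batch = [make_sample(X_train[i], flat_labels[i], snrs[i], i) for i in range(start, end)]
    response = ei.data.upload_samples(batch)
    assert len(response.fails) == 0, "Could not upload some files"
```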
My guess for the "created window" size error is that your sampling rate is too high. Could you try setting `sample_rate_ms` to 1 or 1000, uploading the data again, and seeing if that fixes the error?
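For example, something along these lines when re-uploading, assuming `sample_rate_ms` is the interval between consecutive readings in milliseconds (worth double-checking against the SDK docs):

```python
import edgeimpulse as ei

ei.API_KEY = "ei_..."  # your Edge Impulse API key
labels = [str(lbl) for lbl in lbl_train.ravel()]  # labels as a list of strings

# With 1024 readings per sample, sample_rate_ms=1000 implies each sample
# spans ~1024 s of data, while sample_rate_ms=1 implies ~1.024 s.
response = ei.data.upload_numpy(
    sample_rate_ms=1000,  # or 1, as suggested above
    data=X_train,
    labels=labels,
    category="training",
)
assert len(response.fails) == 0, "Could not upload some files"
```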