Hi,
I’m doing a gesture-detection project on an nRF52833 board with SEGGER Embedded Studio. I’ve connected the device to Edge Impulse through the data forwarder and trained a model. Now I’m integrating the C++ library into my project. I’ve forwarded the data to the features array, but one gesture always comes out with a high confidence.
This is the log that I’m getting. I don’t know what value that feature line is printing or where it comes from, but the values are always the same, and one prediction is always high.
How can I solve this? Thanks in advance.
This is the function that runs the classifier. I set if (result.classification[0].value > 1) because that class always comes out at 0.99 and I wanted to force the else branch to execute.
This function is called every time the controller fetches the sensor data.
I am not sure, as I don’t have the full code, but you may want to have a look at this to get a better understanding of how we handle IMU data (at least for the logic): Data forwarder - Edge Impulse Documentation
Thanks for the reply. I’m a bit confused about passing the data to the features array, so I tried copying the raw features of a sample collected in Edge Impulse into the features array, following that article.
#include "edge-impulse-sdk/dsp/numpy_types.h"
#include "edge-impulse-sdk/porting/ei_classifier_porting.h"
#include "edge-impulse-sdk/classifier/ei_classifier_types.h"
static char features[FEATURES_SIZE];

int get_feature_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, features + offset, length * sizeof(float));
    return 0;
}
EI_IMPULSE_ERROR run_classifier(signal_t *, ei_impulse_result_t *, bool);
void process_input(uint16_t length)
{
    static const int features[] = {-12, 22, -23, -11, 23, -24, -10, 23, -24, .....};

    signal_t signal;
    signal.total_length = EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE;
    signal.get_data = &get_feature_data;

    ei_impulse_result_t result;

    // Calculate the length of the buffer
    size_t buf_len = sizeof(features) / sizeof(features[0]);

    // Make sure that the length of the buffer matches the expected input length
    if (buf_len != EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE) {
        ei_printf("ERROR: The size of the input buffer is not correct.\r\n");
        ei_printf("Expected %d items, but got %d\r\n",
                  EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE,
                  (int)buf_len);
        return;   // process_input() is void, so no value is returned here
    }

    EI_IMPULSE_ERROR res = run_classifier(&signal, &result, true);

    if (result.classification[0].value > 1)
    {
        nrf_gpio_pin_set(LED2);
        nrf_gpio_pin_clear(LED1);
        NRF_LOG_INFO("Left-Right");
    }
    else if (result.classification[1].value > 0.9)
    {
        nrf_gpio_pin_set(LED1);
        nrf_gpio_pin_clear(LED2);
        NRF_LOG_INFO("Rotate");
    }
    else
    {
        NRF_LOG_INFO("Nothing");
        NRF_LOG_INFO("data received: %d", buf_len);
    }
}
This is the integration I’ve made in main.c. I call process_input() from the function where I fetch the sensor data, but I’m still getting the following log.
I put the raw feature data of a Rotate sample into the features array, but it still reports Left-Right as high. How can I solve this problem? Am I feeding the data into the features array correctly?
I can see that the features buffer is declared twice:
static char features[FEATURES_SIZE];
and
static const int features[] = {...}
I suspect that the get_feature_data function uses the global variable and not the one scoped at your function level. Can you remove the duplicate declaration and keep a single features buffer?
In the 1st code listing you are updating the features array via features[2] = sensor_values. I am not sure exactly what this is doing, but at least the features array is being updated.
In the 2nd code listing, features is never updated.
When the line containing run_classifier() executes, it calls get_feature_data(), which copies features into a buffer that the Edge Impulse inferencer uses to make predictions.
A perhaps easier-to-understand code listing is nano_ble33_sense_accelerometer.ino. To view this file, deploy an Arduino library and browse to the examples folder. The INO file won’t run directly on your hardware, but it may give you guidance on how to structure the code.
Now the features array is not constant as before; if I move the device it seems to predict something, but not as intended. I have a few questions:
Is the features array being filled correctly? If so, why are the prediction, DSP, inference, and anomaly times all 0 ms?
The gestures Left-Right and Rotate that I trained are just momentary movements; most of the time the device will be doing some other movement. In that case, do I have to add the anomaly block? Say I’m getting
Left-right: 0.15
Rotate: 0.84
will I get
Left-right: 0.00
Rotate: 0.00
if some other movement is made?
I want the device to predict the gesture instantly when it is performed. How do I reduce the prediction time? Do I have to reduce the interval, or should I use the run_classifier_continuous() function?
Did you also use this to print the values over serial with the data forwarder?
If so, can you make sure the movements you’re trying to classify are similar to the way you collected them?
If not, I’d suggest collecting the data the same way as when you used the data forwarder.