How to include/expose the system label "uncertain" in exported Arduino library?

Hello community,

I’m doing a simple audio classification test on Arduino Nano 33 BLE sense.
When I do model testing in the Edge Impulse Studio, some results come back labeled uncertain, which I think is very good as it provides clarity.
[Screenshot: model testing results in Studio, 2024-03-29]

After exporting the Arduino library, I want to be able to display the uncertain label alongside the predefined labels.

In the <lib>/src/model-parameters/model_variables.h I could see the below:

#include <stdint.h>
#include "model_metadata.h"
#include "tflite-model/tflite_learn_7_compiled.h"
#include "edge-impulse-sdk/classifier/ei_model_types.h"
#include "edge-impulse-sdk/classifier/inferencing_engines/engines.h"

const char* ei_classifier_inferencing_categories[] = { "myLabel1", "myLabel2", "myLabel3", "myLabel4"};


  1. Is it possible to enable/expose the uncertain label?
  2. If yes, how? Is there something I can do in the header files, or do I have to change something in Studio to enable exporting uncertain in the first place?

Hi @dattasaurabh82

Not quite sure if you can, but you can try adding it to the model variables and see. Start by reviewing our complete Nano 33 BLE firmware to get an idea of the structure. Let us know how you get on: firmware-arduino-nano-33-ble-sense/src/model-parameters/model_variables.h at master · edgeimpulse/firmware-arduino-nano-33-ble-sense · GitHub




Okay I did a bit of digging …

What did I do?

  1. Edit <lib>/src/model-parameters/model_variables.h and add the extra uncertain category to it:
const char* ei_classifier_inferencing_categories[] = { "ambient", "discomfort", "hungry", "sick", "tired", "uncertain" };
  2. Edit <lib>/src/model-parameters/model_metadata.h and increase EI_CLASSIFIER_LABEL_COUNT from 5 to 6, since we added the extra uncertain category to ei_classifier_inferencing_categories[] in model_variables.h.

The number of labels did increment, as confirmed by these debug print statements in my main sketch:

  ei_printf("Inferencing settings:\n");
  ei_printf("\tInterval: %.2f ms.\n", (float)EI_CLASSIFIER_INTERVAL_MS);
  ei_printf("\tFrame size: %d\n", EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE);
  ei_printf("\tSample length: %d ms.\n", EI_CLASSIFIER_RAW_SAMPLE_COUNT / 16);
  ei_printf("\tNo. of classes: %d\n", sizeof(ei_classifier_inferencing_categories) / sizeof(ei_classifier_inferencing_categories[0]));
  ei_printf("\tClassifier label count: %d\n", EI_CLASSIFIER_LABEL_COUNT);
  ei_printf("sizeof(ei_classifier_inferencing_categories): %d\n", sizeof(ei_classifier_inferencing_categories));
  ei_printf("sizeof(ei_classifier_inferencing_categories[0]): %d\n", sizeof(ei_classifier_inferencing_categories[0]));

Resulting serial console output:

Inferencing settings:
	Interval: 0.06 ms.
	Frame size: 32000
	Sample length: 2000 ms.
	No. of classes: 6
	Classifier label count: 6
sizeof(ei_classifier_inferencing_categories): 24
sizeof(ei_classifier_inferencing_categories[0]): 4

Please pay attention to

  1. No. of classes:
  2. Classifier label count: (called from model_metadata.h)

But when running classification, one of the labels is (null) and not uncertain.

The part, from my main sketch, that prints the classification results:

for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
    ei_printf("    %s: %.5f\n", result.classification[ix].label, result.classification[ix].value);
}

And that results in:

    ambient: 0.00000
    discomfort: 0.01172
    hungry: 0.98438
    sick: 0.00000
    tired: 0.00391
    (null): 0.00000

As you can see, uncertain is not picked up from ei_classifier_inferencing_categories[].

I'm not sure, but I believe ei_impulse_result_t needs to be told that uncertain exists. However, I don't know whether that is even ported from Edge Impulse in the first place, or where to look for it.

Any guidance here, or am I shooting in the dark? Anyone, @rjames?

(Thanks in advance)


This is not going to work in the way you are doing it. That is because the model has only been trained on the labels you provided, which didn’t include the ‘uncertain’ class. So as a result you’ll be reading incorrect memory.
If you want the model to return/classify ‘uncertain’ you’ll have to introduce an ‘uncertain’ class (or any similarly named one) to represent that fallback class when no other is detected. Provide training and test data with such a class and then retrain your model.

Another approach, which does not involve retraining, is to infer the ‘uncertain’ class yourself. This is how Studio determines/infers a classification as ‘uncertain’: iterate over all classification results and, if for example no result is >= 0.55, report the entire classification as ‘uncertain’.

// Raul


Got it.
Thank you @rjames for the clarification.
I will go with the 2nd approach, as that was kind of what I was thinking of as a fallback, and it's good to know that that’s how Studio handles it too.
