Arduino Output Based on Classification

Hello everyone, I’m stuck in post-deployment. Once the model is deployed onto an Arduino, what can I add to the code so it can react to the classification? So in the case of the gesture recognition example, what would you add so that once it identifies “wave” the Arduino’s LED turns on?

I know how broad a question this is, but I haven’t seen much documentation on the subject. Some example code would be excellent! I’m sure this would help others who are new to the subject.

Thank you for all of the help,
Mason


Found a solution, but I’d love to hear of others if anyone knows any. I’ll be referencing this site: https://www.digikey.com/en/maker/projects/how-to-use-embedded-machine-learning-to-do-speech-recognition-on-arduino/1d5dd38c05d9494180d5e5b7b657804d

Sample code is below. I put it at the end of the inference loop, just before the predictions are printed. In my opinion it can go anywhere in that loop before the delay. In this case it will print “Target” to the serial monitor if the target label is classified with at least 70% confidence, and “Another Value” otherwise. Using normal Arduino coding methods this could instead be something like an analogWrite().

All you need to do after this point is change the “x” in result.classification[x] to the classification index. To find it, open the original zip file and open the “src/model-parameters/model_metadata.h” file. Scroll down and you should see a string array named “ei_classifier_inferencing_categories”. This array shows you the exact order of the labels. Count the position your label is at (starting at 0) and use that as x.
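For illustration, the array in model_metadata.h might look something like this (the labels below are hypothetical, taken from a gesture project; yours will differ):

    // src/model-parameters/model_metadata.h (excerpt, hypothetical labels)
    const char* ei_classifier_inferencing_categories[EI_CLASSIFIER_LABEL_COUNT] = { "idle", "snake", "updown", "wave" };
    // "wave" is at position 3 (counting from 0), so you would use result.classification[3]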

Sample code:
    // Serial.write if a specific label value is above a threshold
    if (result.classification[x].value > 0.7) {
        Serial.write("Target");
    } else {
        Serial.write("Another Value");
    }
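For Mason’s original LED question, here is a minimal variation of the same snippet, assuming the board’s built-in LED and that x is the index of your “wave” label:

    // Assumes pinMode(LED_BUILTIN, OUTPUT) was called in setup()
    if (result.classification[x].value > 0.7) {
        digitalWrite(LED_BUILTIN, HIGH); // "wave" detected with >70% confidence: LED on
    } else {
        digitalWrite(LED_BUILTIN, LOW);  // otherwise keep the LED off
    }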

There’s an amazing example by @Rocksetta that prints info based on the detected classification -

So, you can use snippets such as:

    if (strcmp(result.classification[ix].label, "space") == 0 && result.classification[5].value < 0.2) { // only if "erase" is less than 20%
        // Serial.println("Adding a space: " + String(result.classification[ix].value * 100) + "%, ix: " + ix);
        Serial.println("Adding a space: " + String(result.classification[ix].value * 100) + "%, ");
    }

You can compare his example with the Arduino library export to see what’s changed, and use snippets like this to trigger actions based on the classification.

Thanks! Let me know if this helps.


Hi Mason. That was also the biggest problem for my students last semester. I have been trying to think of ways to simplify post-deployment and keep the code as simple and “Arduino/TensorFlow”-looking as possible, but I keep getting distracted by other pressing issues and projects (my day job, for one :) ). I will keep thinking about it and hopefully get down to some code experimenting soon.

@masonak In addition to the answers by @dhruvsheth and @Rocksetta, the example sketches are your best option (under File > Examples).

Each of these sketches has a place where the results are printed, e.g.:

    if (++print_results >= (EI_CLASSIFIER_SLICES_PER_MODEL_WINDOW)) {
        // print the predictions
        ei_printf("Predictions ");
        ei_printf("(DSP: %d ms., Classification: %d ms., Anomaly: %d ms.)",
            result.timing.dsp, result.timing.classification, result.timing.anomaly);
        ei_printf(": \n");

Here you can easily check whether a certain class is detected and, for example, toggle an LED, drive a servo, or something else:

    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        if (strcmp(result.classification[ix].label, "on") == 0 && result.classification[ix].value >= 0.7) {
            digitalWrite(...);
        }
    }
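Driving a servo works the same way. A rough sketch, assuming a servo attached to pin 9 and a label literally named "on" (both are placeholders for your own setup):

    #include <Servo.h>
    Servo myServo; // with myServo.attach(9) called once in setup()

    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        if (strcmp(result.classification[ix].label, "on") == 0 && result.classification[ix].value >= 0.7) {
            myServo.write(90); // swing the servo to 90 degrees when "on" is detected
        }
    }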

For custom sensors there are examples here: https://docs.edgeimpulse.com/docs/cli-data-forwarder
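On the Arduino side, the data forwarder simply expects one line of comma-separated sensor values per sample over serial. A minimal sketch (the pins and sample rate below are assumptions, not requirements):

    // Minimal sketch for the Edge Impulse data forwarder:
    // stream comma-separated readings at a fixed interval.
    void setup() {
        Serial.begin(115200);
    }

    void loop() {
        Serial.print(analogRead(A0));   // hypothetical sensor channel 1
        Serial.print(",");
        Serial.println(analogRead(A1)); // hypothetical sensor channel 2
        delay(10);                      // roughly 100 Hz sampling
    }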