ESP32 with Static Buffer

I didn’t want to hijack anyone else’s thread, so I started my own to get help.

I built a new circuit on my breadboard with an IMU, an ESP32, and an LCD. I’m getting data from the accelerometer, but I have not yet fed it into the buffer. For now the buffer is filled with the raw features I copied from my live classification.

At this point, all I get from the serial monitor in the IDE is shown below.

Edge Impulse standalone inferencing (Arduino)
run_classifier returned: 0
Predictions (DSP: 17 ms., Classification: 0 ms., Anomaly: 1 ms.):
[0.99609, 0.00000, -0.204]

Is the static buffer example meant for continuous motion recognition? Should the output look like the documentation’s example, shown below?

Edge Impulse standalone inferencing (Arduino)
Running neural network…
Predictions (time: 0 ms.):
idle: 0.015319
snake: 0.000444
updown: 0.006182
wave: 0.978056
Anomaly score (time: 0 ms.): 0.133557
run_classifier_returned: 0
[0.01532, 0.00044, 0.00618, 0.97806, 0.134]

Also, let’s say I want to switch to something totally different, such as speech or images. I would deploy my impulse as an Arduino library. Could I simply copy over some of the files from the static buffer example to get it to work?

Looking over the documentation, I’m not clear on how to use the libraries and framework to get a custom impulse working on the ESP32.

Would the “Input to the run_classifier function” documentation that you have tell me how to run my custom impulse? I just found it and have yet to read it.

Thanks for all your help.

Hi @electronictechgreg, change false /* debug */ to true /* debug */ in the run_classifier call to print information on the classes (including the one you expect). The static buffer example can be used for any impulse: vibration, audio, images, etc. Just add the new Arduino library and select the example from the Arduino examples menu, and it should load the new one automatically.
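For reference, the line in question sits in the static_buffer example’s loop(); with debug enabled it looks roughly like this (variable names follow the stock example):

// invoke the impulse; the third argument turns on per-class debug output
EI_IMPULSE_ERROR res = run_classifier(&features_signal, &result, true /* debug */);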

Great, I will try as you suggested tonight.

Thank you very much.

I just noticed. Did I post this under blog? Didn’t mean to do that. Is there a way to change that?

No worries I moved the post :wink:

Great thank you.

Quick question.

If I wanted to make my own impulse, is there a way to merge it with the static buffer example? Starting from scratch, I wouldn’t know what to call. I did find two great examples on Circuit Digest that I will study tonight.
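For reference, the calls you need are all in the static_buffer example itself. Stripped down, it looks roughly like the sketch below (not the exact generated code; the include is a placeholder for whatever your exported library is called, and the feature values are whatever you paste in):

#include <your_project_inferencing.h>   // placeholder: the header name comes from your exported library

// Raw features copied from the 'Live classification' page go here
static const float features[] = {
    // <paste the comma-separated raw features here>
};

// Callback the SDK uses to pull slices of data out of the buffer
int raw_feature_get_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, features + offset, length * sizeof(float));
    return 0;
}

void setup() {
    Serial.begin(115200);
}

void loop() {
    // Wrap the buffer in a signal_t the classifier can read from
    signal_t features_signal;
    features_signal.total_length = sizeof(features) / sizeof(features[0]);
    features_signal.get_data = &raw_feature_get_data;

    // Run the impulse (DSP + classifier + anomaly detection) over the buffer
    ei_impulse_result_t result = { 0 };
    EI_IMPULSE_ERROR res = run_classifier(&features_signal, &result, false /* debug */);
    ei_printf("run_classifier returned: %d\n", res);

    // Print one score per class, plus the anomaly score if the impulse has one
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        ei_printf("    %s: %.5f\n", result.classification[ix].label, result.classification[ix].value);
    }
#if EI_CLASSIFIER_HAS_ANOMALY == 1
    ei_printf("    anomaly score: %.3f\n", result.anomaly);
#endif

    delay(1000);
}

The key point is that run_classifier doesn’t care where the floats came from, only that the buffer length matches EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE for your impulse.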

After doing some reading I figured out how everything works together. It was simpler than I thought.

Thank you both for your help.

I noticed you made changes to fmax for Arduino in the max.h file. I still get compilation errors for fmin and noticed that “defined(ARDUINO_ARCH_ESP32)” is missing from the min.h file.

C:\Users\Greg\Documents\Arduino\libraries\ei-fan-anomoly-arduino-1.0.1\src/edge-impulse-sdk/tensorflow/lite/kernels/internal/min.h:29:10: error: ‘fmin’ is not a member of ‘std’
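For anyone hitting this before the patch lands, the workaround being described is to mirror in min.h the ESP32 guard that max.h already has, so the ESP32 build falls back to std::min instead of std::fmin. Roughly like this (sketched from the stock TFLite header; the exact macro names may differ in your SDK version):

// edge-impulse-sdk/tensorflow/lite/kernels/internal/min.h (sketch of the guard change)
#if defined(TF_LITE_USE_GLOBAL_MIN) || defined(__ZEPHYR__) || defined(ARDUINO_ARCH_ESP32)
inline float TfLiteMin(const float& x, const float& y) { return std::min(x, y); }
#else
template <class T>
inline T TfLiteMin(const T& x, const T& y) { return std::fmin(x, y); }
#endif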

Hi @electronicgreg, thanks for noticing that! Will make sure this is picked up in the next patch.

Thank you. Below is a weird error I received on my second project that I did not receive on my first project. I just commented out the three lines of code since I didn’t need them in my project.

C:/Users/Greg/Documents/Arduino/libraries/ei-fan-anomoly-arduino-1.0.1/src/edge-impulse-sdk/classifier/ei_run_classifier.h:882:10: error: ‘EI_CLASSIFIER_TFLITE_INPUT_QUANTIZED’ was not declared in this scope

I commented these lines of code out and it compiled fine afterwards.

// Check if we have a quantized NN, and one DSP block which operates on images...
if (!EI_CLASSIFIER_TFLITE_INPUT_QUANTIZED) {
    return EI_IMPULSE_ONLY_SUPPORTED_FOR_IMAGES;
}

@electronictechgreg Thanks for reporting that as well. It looks like a regression in our exported code for models that only have an anomaly detection block and no neural network, and I just realized we don’t have any integration tests for that. We’ll make sure this is fixed in the next patch release as well, and that we add it to the tests!

You’re right, my second project only has an anomaly detection block and no neural network. Everything is working fantastically otherwise.

All of this is now fixed and deployed. Just re-export your project and you’ll have the latest fixes.

Great, thank you so much.

Can we keep talking about the static buffer? I feel that it is the bottleneck to completely understanding Edge Impulse. I would like to see code showing how people take the static buffer and use it with a microphone, motion sensor, camera, and other sensors. Anyone care to share their code or point to a GitHub repo with examples? I don’t need lots of explanation; I can do a lot with working code.

@Rocksetta, I used the following websites to get a better idea of how to use the static buffer.

http://www.splinter.com.au/2020/04/16/esp32-baby-monitor/


@Rocksetta, to get an idea of the static buffer format see https://docs.edgeimpulse.com/docs/running-your-impulse-locally-1#input-to-the-run_classifier-function

And you can always get one exact frame of data that you can feed into the static buffer from here:

(screenshot: the Raw features box with its copy icon)

Click the copy icon.
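And tying this back to @Rocksetta’s question: once the format is clear, swapping the pasted values for live sensor data is just a matter of filling the same buffer yourself. Here is a rough sketch for a 3-axis accelerometer (read_accelerometer() is a hypothetical stand-in for your IMU driver; the EI_CLASSIFIER_* constants come from model_metadata.h in the exported library):

// Buffer sized exactly for one inference window
static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

// Hypothetical stand-in for your IMU driver (MPU6050, LSM6DS3, ...); replace with real reads
void read_accelerometer(float *x, float *y, float *z) {
    *x = 0.0f; *y = 0.0f; *z = 0.0f;   // stub values
}

void sample_window() {
    // Samples are interleaved per axis, exactly like the copied raw features:
    // ax0, ay0, az0, ax1, ay1, az1, ...
    for (size_t i = 0; i < EI_CLASSIFIER_RAW_SAMPLE_COUNT; i++) {
        float x, y, z;
        read_accelerometer(&x, &y, &z);
        features[i * 3 + 0] = x;
        features[i * 3 + 1] = y;
        features[i * 3 + 2] = z;
        delay((uint32_t)EI_CLASSIFIER_INTERVAL_MS);   // crude pacing at the impulse's sample rate
    }
    // Then wrap 'features' in a signal_t and call run_classifier(), as in the sketch earlier in the thread
}

The same idea applies to a microphone or camera: the buffer is always one flat array of floats laid out the same way as the raw features shown in the studio.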
