The purpose of this project is to make it easier to experiment with TinyML.
At the moment we support ESP32 and RP2040. We have an STM32 port, but there are some issues with it.
On ESP32 we have hello-world, micro-speech and person-detection running. For RP2040 we have hello-world and micro-speech. STM32 did run hello-world, but it's not working at the moment because the binary is too large.
To start with we are implementing all of the TFLM reference examples, but I want to add other examples in the future, and encouraging users of Edge Impulse to run them on our firmware seems like a natural way to get new examples.
I'm researching how we can wrap the Edge Impulse SDK, but at the moment what does work is running TensorFlow Lite for Microcontrollers models in MicroPython.
Similar to OpenMV, you can drop the model.tflite file onto the flash and then run the model on device. On ESP32 you can use PSRAM for larger models, like the ~300 KB person-detection example.
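To make the drag-and-drop flow concrete, here is a pseudocode sketch of what running a model dropped onto the flash could look like from the MicroPython side. The module name and every method on it are hypothetical stand-ins, not the firmware's actual interface:

```python
# Pseudocode sketch -- "tflm" and its API are hypothetical, supplied here
# only to illustrate the flow. The real firmware's module will differ.
import tflm  # hypothetical native module baked into the firmware

model_bytes = open("model.tflite", "rb").read()  # file dropped onto flash
interp = tflm.interpreter(model_bytes, arena_size=8 * 1024)

interp.set_input(samples)   # copy raw input into the input tensor
interp.invoke()             # run inference on device
scores = interp.get_output()
```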
I wonder if the Edge Impulse documentation could be updated to list us as a runtime for TFLM models.
Hi @michael.o - thanks for the initiative. My biggest issue with this is that models (especially non-vision models) are more than just the TensorFlow Lite model: there's preprocessing code, DSP code, non-neural-network models, and post-processing of the results. What we bundle with models deployed through Edge Impulse is all of that, behind a single interface (ei_run_classifier): data in => conclusions out. For simple image models the drag-and-drop approach with the tflite model will work, but it falls apart for object-detection models (coming to OpenMV soon as well).
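The "data in => conclusions out" point can be illustrated with a toy pipeline. This is not the Edge Impulse SDK — every function here is invented for the sketch — but it shows why shipping only the .tflite file loses two of the three stages a deployed model actually needs:

```python
import math

def preprocess(samples):
    """Stand-in DSP block: remove the DC offset from the raw samples."""
    mean = sum(samples) / len(samples)
    return [s - mean for s in samples]

def neural_network(features):
    """Stand-in for the .tflite model: score the features by RMS energy."""
    energy = math.sqrt(sum(f * f for f in features) / len(features))
    return {"noise": 1.0 - min(energy, 1.0), "signal": min(energy, 1.0)}

def postprocess(scores):
    """Stand-in post-processing: pick the top-scoring label."""
    return max(scores, key=scores.get)

def run_classifier(samples):
    """Single entry point: raw data in => conclusion out."""
    return postprocess(neural_network(preprocess(samples)))

print(run_classifier([1.0, -1.0, 1.0, -1.0]))  # -> signal
```

Dropping only `neural_network` onto a device would produce garbage unless the runtime reimplements `preprocess` and `postprocess` exactly as they were at training time.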
So in my opinion a better way forward would be:

1. Wrap our normal inferencing SDK (i.e. the ei_run_classifier / ei_run_classifier_continuous functions).
2. Tell users to export a C++ library from Edge Impulse and drop it into your project (note that the SDK, model-parameters, and tflite-model folders are tightly coupled; you need to update all of them), then rebuild and flash.
3. Use that from MicroPython.
I know this is more annoying as you lose the drag & drop, but it's the same pattern we're now going to follow with OpenMV to better support the non-Plus H7 with more models, and you get all the DSP / normalization / post-processing for free. If that's interesting, I'd be happy to add your project as a community port.
Note that we’re working on RP2040 and ESP32 official support (@AIWintermuteAI is doing this at the moment).
I will look at wrapping the Edge Impulse SDK in MicroPython. Do the DSP blocks come with the exported C++ for each project, or is there a way to bake them all (or a selected subset) into the firmware?
MicroPython has a way to compile C (and I think C++) code into .mpy files that can be loaded by an existing firmware at runtime. I was thinking of trying this so that the C++ parts of the export would be compiled into a native module, which could still be dropped onto the device.
I looked at this approach at the start, before I realized that an ESP32 with 4 MB of flash was enough to fit all of the TFLM operators.
There is a fixed function table that these modules can access, but since I would be building the firmware anyway, I think it should be possible to expose the right methods.
AIWintermuteAI will be programming the ESP32 with compiled code, not MicroPython.
Using any other “feature provider” won’t work, because a model trained in Edge Impulse is trained against features generated by our versions of common DSP preprocessing blocks (like the MFE block shown in the GitHub issue you linked). @michael.o have you had any luck compiling our SDK into an .mpy file?
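A small sketch of why the exact DSP implementation matters. These toy frame-energy features (not Edge Impulse's actual MFE block) show that running the same audio through two differently parameterized extractors doesn't even produce the same feature shape, so a model trained against one can't consume the other:

```python
import math

def frame_energies(samples, frame_len):
    """Toy feature extractor: mean energy per non-overlapping frame."""
    frames = [samples[i:i + frame_len] for i in range(0, len(samples), frame_len)]
    return [sum(s * s for s in f) / len(f) for f in frames]

# 64 samples of a 440 Hz tone at 8 kHz (illustrative input)
audio = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(64)]

features_a = frame_energies(audio, 16)  # "training-time" frame length
features_b = frame_energies(audio, 32)  # a different runtime's frame length

print(len(features_a), len(features_b))  # -> 4 2, not even the same shape
```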
ESP-IDF is just Espressif’s official IoT Development Framework; the code you compile with the help of ESP-IDF is still C/C++ code…
The edge-impulse-sdk can be compiled with ESP-IDF; it just requires a little bit of fiddling with the CMakeLists.
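As a rough sketch of the kind of CMakeLists fiddling meant here, assuming the SDK export has been copied into an ESP-IDF component directory (the paths and glob patterns below are illustrative, not Edge Impulse's official build files):

```cmake
# components/edge-impulse/CMakeLists.txt -- illustrative sketch
FILE(GLOB_RECURSE EI_SOURCES
     "edge-impulse-sdk/*.cpp"
     "edge-impulse-sdk/*.cc"
     "edge-impulse-sdk/*.c"
     "tflite-model/*.cpp")

idf_component_register(
    SRCS ${EI_SOURCES}
    INCLUDE_DIRS "." "edge-impulse-sdk" "model-parameters" "tflite-model")
```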
Once we finish integrating the ESP32 as an officially supported board, the code for the firmware/standalone inferencing example will be made public.
Not sure if that will be helpful for your task - from what I understand, your goal is to either integrate the whole edge-impulse-sdk into the MicroPython firmware, or at the very least the DSP part of it (to make the firmware occupy fewer resources).
We’re still targeting mid-April for the ESP32 firmware/integration launch.
It will use ESP-NN optimized kernels.
The error has to do with CMSIS: it is not really supposed to be used at all on non-ARM platforms. You can just delete the CMSIS folder from the edge-impulse-sdk when compiling for ESP32.
I'm also very interested in deploying Edge Impulse on the ESP32-S3. I'm designing a device capable of analyzing sound samples (from a mic) to monitor the health of an electric motor (for predictive maintenance). Would be happy to beta-test.
There are no DSP optimizations included in the firmware, so the ESP32-S3 should have the same performance as the ESP32. Perhaps you didn't set the clock to 240 MHz? Check the parameters in sdkconfig for optimized clock settings.
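For reference, these are the sdkconfig lines to check for the CPU clock on a classic ESP32 under IDF 4.x; option names vary between IDF versions and targets (the S3 uses differently named options), so treat this as a pointer rather than an exact recipe:

```ini
# sdkconfig (ESP32, IDF 4.x -- option names differ on other IDF versions/targets)
CONFIG_ESP32_DEFAULT_CPU_FREQ_240=y
CONFIG_ESP32_DEFAULT_CPU_FREQ_MHZ=240
```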