Several Edge Impulse Applications on one Target

Hi Edge Impulse Team,

I created several models with Edge Impulse and I intend to deploy them on the same target (an embedded Linux one). The automatically generated C++ libraries have considerable overlap. What are the common parts, and is it possible to wrap them, for example, in a common .so and link against them?

Regards

Lukas

Hi @Lukas, for embedded Linux you can get .eim files (which encapsulate everything about the model, with full hardware acceleration) through the Linux CLI, the Studio, or by building them yourself (see https://docs.edgeimpulse.com/docs/edge-impulse-for-linux). You can then use multiple of these in any application (see the Python, Node and Go examples at the bottom).

For deep-embedded systems we’re currently still working on getting multiple models running on the same device, by making the SDK aware of multiple potential models.

Hi @janjongboom, thanks for your reply. I had already seen that, but is there a way to use those .eim files from C++ (sorry, I didn’t find anything)? Say I have an .eim file and I want to write a C++ application that uses it.

@Lukas ah good point. So sort of. We don’t have an example for it, but can probably put something together. How it works is:

  1. You launch the .eim file (it’s an executable) via: ./name-of-model.eim /tmp/model.sock
  2. You connect from your application to the socket.
  3. You send { "id": 1, "hello": 1 } (JSON stringified) over the socket and get model info back.
  4. Then to classify you send { "id": 2, "classify": [ 1, 2, 3, 4 ] } (JSON stringified again).
  5. You get the result back.

That’s it. E.g. here’s how we do it in Python: https://github.com/edgeimpulse/linux-sdk-python/blob/master/edge_impulse_linux/runner.py
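
If you want to do this from C++ directly, a rough sketch of the same flow over a POSIX Unix-domain socket could look like the code below. This is untested and deliberately minimal: no error handling, dummy feature values, /tmp/model.sock is just an example path, and send_and_print is only an illustrative helper; check the Python runner linked above for the exact message framing.

    // eim-client.cpp -- minimal sketch of the flow above (untested, no error handling)
    // Assumes the model was started first with: ./name-of-model.eim /tmp/model.sock
    #include <cstdio>
    #include <cstring>
    #include <string>
    #include <sys/socket.h>
    #include <sys/un.h>
    #include <unistd.h>

    // Write one JSON-stringified message and print whatever the model sends back.
    // See the Python runner linked above for the exact message framing.
    static void send_and_print(int fd, const std::string &msg) {
        write(fd, msg.c_str(), msg.size());

        char buf[8192];
        ssize_t n = read(fd, buf, sizeof(buf) - 1);
        if (n > 0) {
            buf[n] = '\0';
            printf("response: %s\n", buf);
        }
    }

    int main() {
        // steps 1-2: connect to the Unix socket the .eim process is listening on
        int fd = socket(AF_UNIX, SOCK_STREAM, 0);
        struct sockaddr_un addr;
        memset(&addr, 0, sizeof(addr));
        addr.sun_family = AF_UNIX;
        strncpy(addr.sun_path, "/tmp/model.sock", sizeof(addr.sun_path) - 1);
        if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) != 0) {
            printf("could not connect to /tmp/model.sock\n");
            return 1;
        }

        // step 3: handshake, ask for model info
        send_and_print(fd, "{ \"id\": 1, \"hello\": 1 }");

        // steps 4-5: classify with raw features (dummy values here), get the result back
        send_and_print(fd, "{ \"id\": 2, \"classify\": [ 1, 2, 3, 4 ] }");

        close(fd);
        return 0;
    }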

I’ll try to write up a quick example this morning; it would be useful for many others.

Here’s a very quick (and dirty 🙂) demo: https://github.com/edgeimpulse/example-standalone-inferencing-linux/tree/classify-eim - build with:

rm -f source/*.o && \
    APP_CLASSIFY_EIM=1 make -j8 && \
    ./build/classify-eim ~/repos/models/jan-vs-niet-jan.eim features.txt
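
Since every .eim is a self-contained process that takes its socket path as an argument, running several models from one application should just be a matter of launching each model on its own socket (for example ./model-a.eim /tmp/model-a.sock and ./model-b.eim /tmp/model-b.sock, names chosen purely for illustration) and opening one client connection per socket, following the same pattern as the sketch above.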

Any update on this? I’m keen to use multiple models on an ESP32 project.

Hey, this is still a WIP: the first PRs to separate things out have landed, but it’s not there yet. Unfortunately, no ETA.

Could you please tell me whether there is any update on running multiple models on one device? I want to run two different models on one MCU target.

Hello @ujjwalrathod00700,

Unfortunately, we don’t have any updates yet.

Best,

Louis
