Question about Edge Impulse for Linux (.eim)

Question/Issue:

  1. Hi guys, I have a question about the .eim model. I have read this doc but still don’t understand it. Can you please explain in the simplest terms what an .eim file is?
  2. Is it the same as other model formats, such as .tflite or .lite? Can we see what is inside the file?
  3. If I have trained a model on my machine and saved it in .h5 or .tflite format, can it be converted to .eim? How do we convert it?
  4. I’m working on a sound classification project and running an .eim model with the Linux SDK, and it’s amazing: it can classify sounds in real time, very quickly.
    My questions are:
    What affects the inference time? Is it the window size, the frame size, or the FFT size?
    Do we record every detected signal?

regards,

Hello @dexvils,

  1. The .eim files are Linux executables that package everything you need to run your model (.eim stands for Edge Impulse model). They are compiled for your target CPU architecture.
  2. No, it is not exactly the same, as it is already compiled. You cannot see directly what is inside, but you can see the source code, see below:
  3. You can see the source code, modify it, and build the .eim yourself: GitHub - edgeimpulse/example-standalone-inferencing-linux: Builds and runs an exported impulse locally (Linux)
  4. Thanks! Many things can affect the inference speed: the window size and FFT size have an impact, but so does your NN architecture. We try to output the inference time (and RAM/ROM usage) in both the DSP blocks and the learning block so you can get an idea of the impact as you go. You can also find the on-device performance on the deployment page.
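A rough way to see why the window size, frame stride, and FFT size matter: they set how many FFTs the DSP block runs per window and how big each FFT is. A back-of-envelope sketch (an approximation for intuition, not the actual Edge Impulse DSP code):

```python
import math

def dsp_workload(window_ms, frame_length_ms, frame_stride_ms, fft_length):
    """Rough operation count for one window of a spectrogram/MFE-style block."""
    # One FFT per frame; frames slide across the window by the stride.
    n_frames = 1 + (window_ms - frame_length_ms) // frame_stride_ms
    ops_per_fft = fft_length * math.log2(fft_length)  # FFT is O(N log N)
    return int(n_frames * ops_per_fft)

# Halving the frame stride roughly doubles the frame count, hence DSP time:
print(dsp_workload(1000, 32, 16, 256))  # 61 frames  -> 124928
print(dsp_workload(1000, 32, 8, 256))   # 122 frames -> 249856
```

The NN part of the inference then adds its own cost on top, which is why the learning block reports its timing separately.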

Best,

Louis


Hi @louis, thanks for the response.
I have tried running with the Linux SDK and with edge-impulse-linux-runner from the terminal on a Raspberry Pi, and the inference times are very different, by a factor of two or more. Here are the snapshots:
from the Linux SDK: (screenshot)

from linux-runner: (screenshot)

Why is the time so different, max 6 ms vs 37 ms? Since I need to use the Linux SDK, can you help me?

regards,

@dexvils There should be no difference in speed; you can build your own EIM model from the downloaded SDK, as @louis mentioned in the link above.

Either the EIM model you downloaded was compiled with the EON Compiler while the SDK you downloaded does not include EON Compiler support, or you have added some code in between that takes some execution time.

I would be really interested in seeing how you are calculating your inference time with the SDK.
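For reference, a common way to time a single inference is to wrap only the classify call with a monotonic clock, with a few warm-up runs first, so that audio capture and one-off setup are excluded. A minimal sketch; the `classify` argument here is a stand-in for the SDK call (for example `runner.classify(...)` in the Linux Python SDK), and the dummy at the bottom is only for illustration:

```python
import time

def time_inference(classify, features, warmup=3, runs=20):
    """Median wall-clock time of classify(features), in milliseconds."""
    for _ in range(warmup):
        classify(features)          # discard first calls (one-off setup cost)
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()    # monotonic, high-resolution clock
        classify(features)
        samples.append((time.perf_counter() - t0) * 1000.0)
    samples.sort()
    return samples[len(samples) // 2]  # median is robust to scheduler spikes

# Illustration with a dummy classifier standing in for the real model:
dummy_ms = time_inference(lambda f: sum(f), list(range(1000)))
print('median: %.3f ms' % dummy_ms)
```

Comparing a median measured this way against the runner's reported number tells you whether the gap is in the model itself or in the code around it.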

Hi Louis, how can I use my output .eim file on my aarch64 embedded system? Here is my system information and error log. Thanks!

[root@RK356X:/]# uname -a
Linux RK356X 4.19.193 #1 SMP Fri Oct 28 11:05:59 CST 2022 aarch64 GNU/Linux

Hi dexvils, how can I run an .eim on my aarch64 embedded system? I have tried a few times, as below. Do I need to cross-compile?

[root@RK356X:/]# uname -a
Linux RK356X 4.19.193 #1 SMP Fri Oct 28 11:05:59 CST 2022 aarch64 GNU/Linux

@y1165949971 EIM models are not meant to be executed on their own.
You need to install the edge-impulse-linux-cli on your aarch64 device.

sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps

npm config set user root && sudo npm install edge-impulse-linux -g --unsafe-perm

Then run it with:

edge-impulse-linux-runner --model-file yourfile.eim

Thanks for the reply. Sorry, maybe I need to install Ubuntu? But I want to package our Edge Impulse model together with other C++ demos. Do I need to use the C++ SDK and cross-compile? How could I run it as a simple binary file?

Okay, I see. What kind of embedded hardware are you working on? Do you have a package manager?

Hi Omar, thanks for the reply.
I don’t have a package manager right now; all my packages are built by cross-compiling. Could you please give me some advice on how to run the Edge Impulse demo easily on my embedded hardware? Here is my embedded hardware:

[root@RK356X:/]# uname -a
Linux RK356X 4.19.193 #1 SMP Fri Oct 28 11:05:59 CST 2022 aarch64 GNU/Linux

This is a Linux device, so it should be supported out of the box as long as the sensor connection is detected.
Either you use the CLI tooling, as described above, or you choose to use the SDK. If you decide to go with the SDK, you need to compile the source code on the device, so you will need to install the compiler toolchain locally with your package manager. If that is impossible, you will need to cross-compile on a host machine and then transfer the binary to the Rockchip target.

If you need more info, have a look at our docs for the Raspberry Pi; they should apply similarly to your board.