How to use an Edge Impulse model in a Python program

DESCRIPTION:

Hello everyone, I am quite new to ML and Edge Impulse. I am trying to build a project that classifies different kinds of vibration signals using a Raspberry Pi 3B. For this, I collected vibration data with my sensor, trained a model in Edge Impulse, and downloaded it as a C++ library. Now I want to use that library inside my Python program, the one that is collecting the data.

EXPLANATION:

The Python program that I am using works as follows:

When a button is pressed, the transmitter starts transmitting vibrations and the receiver starts receiving them, saving the response as a WAV file.

From here, what I want to do is classify the vibration response saved in the WAV file. That is, once the WAV file is saved, I want to load it back into my program and classify it using the trained Edge Impulse model.
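The loading step I have in mind is roughly this (just a sketch; vibration.wav stands in for whatever file name my program actually writes):

```python
import numpy as np
from scipy.io import wavfile

# Load the vibration response that the receiver just saved.
# "vibration.wav" is a placeholder file name.
sample_rate, signal = wavfile.read("vibration.wav")

# Work with the samples as a float numpy array from here on.
signal = signal.astype(np.float32)

print("Sample rate:", sample_rate, "Samples:", signal.shape)
```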

Another reason for using the Edge Impulse model inside the Python program rather than in the terminal is that I am using the program to show the steps that are going on (data collection, file saving, model prediction, prediction result) on an LCD, which I have programmed in Python.

PROBLEM:

Being new to this, I have no idea how to use an impulse inside a Python program. I have seen how to use it in the terminal, but that will (most probably) not work in my case. Kindly guide me through this.

P.S:

The method may seem inefficient, but I want to go about it this way. Any help is appreciated.

I asked a similar question earlier this year and, thanks to @shawn_edgeimpulse, I was able to get it working. See this post. So in short, I downloaded a .tflite model and plugged it into the code as the post describes.
Not sure if this fits your use case completely, but perhaps it is of some help.
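In case it helps, the core of it is just the standard TFLite interpreter flow, roughly like this (a minimal sketch; model.tflite is a placeholder for the file you download, and it assumes a float32, non-quantized model):

```python
import numpy as np
import tensorflow as tf  # on a Pi you could also use tflite_runtime instead

# Load the .tflite model exported from Edge Impulse ("model.tflite" is a placeholder)
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# The input shape tells you how many features the model expects per inference
print("Expected input shape:", input_details[0]["shape"])

# Dummy input just to show the call sequence; replace with your real feature array
features = np.zeros(input_details[0]["shape"], dtype=np.float32)
interpreter.set_tensor(input_details[0]["index"], features)
interpreter.invoke()
print("Output:", interpreter.get_tensor(output_details[0]["index"]))
```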


Thanks for your response!

I am an absolute noob at this, so some of the things are difficult to grasp, but I think I have got something to begin with, thanks to your post.

Will update you on the progress

I was also a noob with using .tflite-models in Python earlier this year, but eventually got a grasp of it.

I’ve written this tutorial where I’m using an EEG headset and Python to play a simple Pong game; the Python code is here. Feel free to take a look at it and copy & paste whatever you want.

By the way, interesting use case you have!

I think I am finally there, just a little confusion.

In your EEG data example, you trained the model on raw features. In contrast, I trained my model on processed features obtained using MFCC.

So my question is: when calling the set_tensor() function during inferencing, should I pass the raw data array collected from my sensor as its value argument, or should I:

  1. First generate MFCCs from my raw data array.
  2. Extract the processed features from them.
  3. Convert the processed features to a numpy array.
  4. Pass that array as the value argument of set_tensor().

If the latter is true, kindly guide me through steps 1 and 2.

This is something I haven’t done myself, and thus can’t answer, but perhaps e.g. @louis or someone else from the team can chime in.

Hello @Valeed,

What kind of machine are you thinking of using to run your main program?

I can see several other approaches:

  • Use the Linux Python SDK directly if you have something like a Jetson Nano, Raspberry Pi, or an Ubuntu x86 machine
  • Compile and build example-standalone-inferencing-cmake with your model, encapsulate the run (e.g. os.system("./app features.txt")) in a Python script, grab the results of the inferencing and do something with them; see the sketch after this list.
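Roughly, the second option could look like this (only a sketch; ./app and features.txt are the example names from above, and the exact output format depends on the example you build):

```python
import subprocess

def run_impulse(features_file="features.txt"):
    # Run the compiled standalone inferencing binary and capture its output
    # instead of using os.system(), so the results can be parsed in Python.
    result = subprocess.run(
        ["./app", features_file],
        capture_output=True,
        text=True,
        check=True,
    )
    # The binary prints the classification results to stdout;
    # parse them here in whatever way fits your program.
    return result.stdout

if __name__ == "__main__":
    print(run_impulse())
```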

Best,

Louis

@louis

Thanks for your response.

I am using a Raspberry Pi 3B for my project.
I wanted to go with the approach I mentioned, as it seems relatively simpler to me than the other approaches, given my lack of knowledge and experience in this field.

I think that I have found the solution based on:

When we download the edge impulse library, it will contain not only our model, but the code necessary to perform feature extraction as well.

– Introduction to Embedded Machine Learning Course by Shawn Hymel, Coursera

also:

The edge impulse library expects 375 raw values for this particular project we’re working on.

– Introduction to Embedded Machine Learning Course by Shawn Hymel, Coursera

From the above statements I infer that I can use the raw data array as the value argument of the set_tensor() function while inferencing, even though the model was trained on processed MFCC features in Edge Impulse.

Am I right?

Unfortunately, at the time of inferencing we have to pass processed features into the set_tensor function.

I need guidance on how to generate the processed features (the features extracted via MFCC) from the raw features (the data collected by the sensor) using Python code, instead of copying the processed features from Edge Impulse Studio.


@ThomasVikstrom , @louis

Still unable to find a solution.
Kindly guide me, or refer me to someone who you think can.

@Valeed Maybe this will help you: GitHub - astorfi/speechpy: SpeechPy - A Library for Speech Processing and Recognition: http://speechpy.readthedocs.io/en/latest/
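I haven't verified this against the Studio output myself, but a starting point could look something like the sketch below. The MFCC parameter values are only placeholders and would need to match the settings of the MFCC block in your Edge Impulse project, and model.tflite / vibration.wav are example file names:

```python
import numpy as np
import speechpy
import tensorflow as tf
from scipy.io import wavfile

# 1. Load the raw signal saved by the receiver ("vibration.wav" is a placeholder)
sample_rate, signal = wavfile.read("vibration.wav")
signal = signal.astype(np.float32)

# 2. Generate MFCCs from the raw data. These parameter values are examples only;
#    they must be copied from the MFCC block settings of your project.
mfcc = speechpy.feature.mfcc(
    signal,
    sampling_frequency=sample_rate,
    frame_length=0.02,
    frame_stride=0.02,
    num_cepstral=13,
    num_filters=32,
    fft_length=256,
)

# 3. Flatten the MFCC matrix into a 1-D numpy feature array
features = mfcc.flatten().astype(np.float32)

# 4. Pass the processed features to set_tensor() and run inference
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

interpreter.set_tensor(
    input_details[0]["index"],
    features.reshape(input_details[0]["shape"]),
)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))
```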
@louis
I am also looking for a feature extraction algorithm in pure Python (not .eim files and the Linux SDK for Python), because I want to integrate the Edge Impulse feature extraction algorithm, or the entire C++ SDK, into MicroPython, e.g. dynamically loading the feature extraction and TF operations based on the deployment metadata JSON.
I noticed that the C++ SDK of Edge Impulse includes speechpy. By modifying several parameters, the MFCC features generated by speechpy come out very close to those generated by the Edge Impulse platform.
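For reference, the kind of mapping I'm experimenting with looks roughly like this (the values are just examples; each argument corresponds to one of the MFCC block settings in the Studio, and the pre-emphasis coefficient is an assumption on my side):

```python
import speechpy

def extract_features(signal, sample_rate):
    # Pre-emphasis step; the coefficient should mirror the
    # "pre-emphasis coefficient" setting of the MFCC block.
    signal = speechpy.processing.preemphasis(signal, cof=0.98)

    # MFCC computation; every value below must be copied from the
    # corresponding setting of the project's MFCC block.
    mfcc = speechpy.feature.mfcc(
        signal,
        sampling_frequency=sample_rate,
        frame_length=0.02,   # "frame length"
        frame_stride=0.02,   # "frame stride"
        num_cepstral=13,     # "number of coefficients"
        num_filters=32,      # "filter number"
        fft_length=256,      # "FFT length"
        low_frequency=0,     # "low frequency"
    )
    return mfcc.flatten()
```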
My core concern is whether Edge Impulse could open-source the entire feature extraction pipeline in pure Python, so we don't need to integrate it ourselves.

Hello @duzhipeng,

Sure, you can find the Python implementation here:

Best,

Louis


Thanks. I will try it