Convert .eim to .tflite

I am working with FOMO in Edge Impulse. I successfully deployed my model in the .eim format on a Raspberry Pi. Now, for other purposes, I would like to convert the .eim file to .tflite format on my Raspberry Pi. If anyone knows how to do this, please help. I searched in Edge Impulse to see if I could export it directly as a .tflite file, but I couldn’t find the TensorFlow Lite option.

Also, if anyone knows how to use the output from the .eim file without converting it to TensorFlow Lite (.tflite), I would be extremely grateful.

The TensorFlow Lite model can be downloaded from the EI Studio Dashboard page of your project.


The dashboard of my project does not provide the option to export the model in the TensorFlow Lite format.

I am not sure what is going on. My FOMO project shows:


Thank you. I have now found this information on my project's homepage. However, I have another question, because I'm a little bit confused. I initially thought that to download the model I needed to go to the Deployment page and choose the desired format for my model there; in my case, the .tflite format is not available as an option. So, my question is: do the suggestions in the screenshot indicate that I can download the entire model? I apologize if my question seems stupid, as I am new to working on this type of project.

The tflite file by itself is just the computation graph. To actually connect it to the rest of your product you need to do things such as 1) wrapping it in an engine to run it, and 2) handling the pre/post-processing. These, plus a bunch more, are the extra pieces you get with a deployment.
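To make those two pieces concrete, here is a minimal sketch in Python. The preprocessing (nearest-neighbour resize plus scaling to [0, 1] float32) and the 96×96 input size are assumptions — check your own impulse's image block settings; the "engine" is the stock TFLite interpreter from `tflite-runtime`:

```python
import numpy as np

def preprocess(img_rgb, width, height):
    """Nearest-neighbour resize + scale to [0, 1] float32.
    This only approximates Edge Impulse's default image block —
    verify against your own impulse's preprocessing settings."""
    h, w, _ = img_rgb.shape
    ys = np.arange(height) * h // height   # row indices to sample
    xs = np.arange(width) * w // width     # column indices to sample
    resized = img_rgb[ys][:, xs]
    # Add the batch dimension the model expects: (1, H, W, C)
    return (resized.astype(np.float32) / 255.0)[np.newaxis, ...]

def run(model_path, img_rgb, width=96, height=96):
    # The "engine": the stock TFLite interpreter (pip install tflite-runtime).
    from tflite_runtime.interpreter import Interpreter
    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], preprocess(img_rgb, width, height))
    interpreter.invoke()
    # For FOMO this is a per-cell heatmap; post-processing (thresholding,
    # turning cells into centroids/boxes) is still up to you.
    return interpreter.get_tensor(out["index"])
```

Post-processing for FOMO (thresholding the output grid and merging neighbouring cells into detections) is the part the full deployment handles for you, so budget time for that if you go the raw-tflite route.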

mat


Thank you. Is the .lite file similar to the .tflite file?

The .lite file you can download is a tflite file.

mat


How do I run mymodel.eim in a Python script and print the output?

This code will run a FOMO EIM.

In general start with the Edge Impulse Linux SDK for Python.
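For reference, a minimal sketch along the lines of the SDK's image example, assuming `pip install edge_impulse_linux opencv-python` and an image-based (FOMO) project; `mymodel.eim` and `test.jpg` are placeholder names:

```python
# Sketch: run a FOMO .eim with the Edge Impulse Linux Python SDK and
# print each detection. Needs a real .eim file on a supported device.

def extract_boxes(result):
    """Flatten a FOMO result dict into (label, confidence, x, y, w, h) tuples."""
    boxes = result.get("result", {}).get("bounding_boxes", [])
    return [(b["label"], b["value"], b["x"], b["y"], b["width"], b["height"])
            for b in boxes]

def run_eim(model_path, image_path):
    import cv2
    from edge_impulse_linux.image import ImageImpulseRunner

    with ImageImpulseRunner(model_path) as runner:
        runner.init()  # loads the model and its metadata
        img = cv2.imread(image_path)
        img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)  # SDK expects RGB
        features, _cropped = runner.get_features_from_image(img)
        result = runner.classify(features)
        return extract_boxes(result)

# Usage (placeholder paths):
#   for label, conf, x, y, w, h in run_eim("mymodel.eim", "test.jpg"):
#       print(f"{label} ({conf:.2f}) at x={x} y={y} w={w}x{h}")
```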


Hi Mat,

I’m in sort of a similar situation (started tinkering around with edge impulse for a project last week), and wanted to deploy my model in the .tflite format. I was wondering if you’d be able to expand on what extra things we might need to do to use the tflite file as opposed to perhaps a C++ library deployment? I understand we’ll need to handle the pre/post processing on our own and wrap it in an engine, anything else?

Thanks so much,
Namay

Hi @namayjindal

If you export to C++ and choose TensorFlow Lite / Unoptimized float32, that will get you the tflite model.

Best

Eoin