Unable to load fine-tuned Edge Impulse TF lite model with Python

Hi, I fine-tuned an object detection model with Edge Impulse, but since I use the Seeed Studio XIAO ESP32S3 Sense board, deploying the model directly to the board is not supported, so I am thinking about running the model on my laptop with Python instead. I tried to load the exported .lite model in my Python script, but got this error: zsh: segmentation fault  python livestream_video.py. Does anyone know how to solve this issue? Below is the code I used to load the model. Thank you!

import tflite_runtime.interpreter as tflite

# Load the TensorFlow Lite model
interpreter = tflite.Interpreter(model_path='ei-fabric-object-detection-tensorflow-lite-int8-quantized-model.lite')


Hi @Tongyan,

While the ESP32S3 is not an officially supported board, you can still deploy the impulse as an Arduino library or a C++ library. See this example for how to use the C++ library in ESP-IDF.

Without seeing your full code, it’s difficult to determine why the segfault occurred. Can you offer any specifics? Do other models run correctly? Not sure if it helps, but here is my code that runs a TFLite object detection model (trained in Google MediaPipe): https://github.com/ShawnHymel/google-coral-micro-object-detection/blob/master/notebooks/tflite-runtime-test-object-detection.ipynb
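For reference, here is a minimal sketch of the usual tflite_runtime inference pattern for an int8 quantized detection model. This is not taken from your code or from the notebook above; the function names and the frame-shaped input are hypothetical placeholders. The two details that most often bite people are forgetting interpreter.allocate_tensors() before invoke(), and feeding float data to a model that expects quantized int8 input:

```python
import numpy as np

def quantize_input(x, scale, zero_point):
    """Map float32 data into the int8 range a quantized model expects:
    q = round(x / scale) + zero_point, clipped to [-128, 127]."""
    q = np.round(x / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def run_detection(model_path, frame):
    """Hypothetical helper: load a quantized .tflite model and run one frame."""
    import tflite_runtime.interpreter as tflite
    interpreter = tflite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()  # must be called before invoke()
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    # Quantized models carry (scale, zero_point) in their input details
    scale, zero_point = input_details[0]["quantization"]
    data = quantize_input(frame.astype(np.float32), scale, zero_point)
    interpreter.set_tensor(input_details[0]["index"], data[np.newaxis, ...])
    interpreter.invoke()
    return [interpreter.get_tensor(d["index"]) for d in output_details]
```

If your model really does take float32 input (the float32 export rather than the int8 one), the quantization step is skipped and the raw float array is passed instead.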

Hi Shawn,

Thank you for your response! Sure, here are the details; I hope they are useful for debugging. I downloaded the trained model from the dashboard on the Edge Impulse website as shown in the figure below. Specifically, I downloaded both the TensorFlow Lite (float32) and (int8 quantized) versions, but the suffix of the downloaded files is .lite rather than .tflite, which is one detail I noticed; I am not sure if this caused the problem.
The code I used to import the model is as simple as possible, shown below, to help me narrow down the problem, but I have no clue yet. I also checked the version of my tflite_runtime package: it is tflite-runtime 2.5.0. Let me know if anything else would be helpful for debugging! Thank you!

import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path='/Users/tywang/Downloads/ei-tufting_cv-object-detection-tensorflow-lite-int8-quantized-model.lite')


input_details = interpreter.get_input_details()

output_details = interpreter.get_output_details()
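One extra check I could run (a sketch, not something from my script above) is whether the downloaded .lite file is actually a TFLite FlatBuffer at all. Valid TFLite models carry the file identifier "TFL3" at byte offset 4, so if that identifier is missing, the file is some other format and the segfault would be happening before any real model parsing:

```python
def looks_like_tflite_bytes(header: bytes) -> bool:
    """TFLite FlatBuffers carry the file identifier b'TFL3' at bytes 4..8."""
    return len(header) >= 8 and header[4:8] == b"TFL3"

def looks_like_tflite(path: str) -> bool:
    """Read the first 8 bytes of a file and check the TFLite identifier."""
    with open(path, "rb") as f:
        return looks_like_tflite_bytes(f.read(8))
```

If this returns False for the .lite file, renaming it to .tflite would not help, since the contents themselves are not a raw TFLite model.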