Hi, I fine-tuned an object detection model with Edge Impulse, but since I use the Seeed Studio XIAO ESP32S3 Sense board, deploying the model to the board is not supported, so I am thinking of running the model on my laptop with Python instead. I tried to load the exported .lite model in my Python script, but got this error: zsh: segmentation fault  python livestream_video.py. Does anyone know how to solve this issue? Below is the code I used to load the model. Thank you!
import tflite_runtime.interpreter as tflite
# Load the TensorFlow Lite model
interpreter = tflite.Interpreter(model_path='ei-fabric-object-detection-tensorflow-lite-int8-quantized-model.lite')
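Since a segfault kills the Python process outright (it is not a catchable exception, so try/except cannot help), one thing I have been doing to probe it is constructing the interpreter in a child process and inspecting the exit code. This is just a debugging sketch of my own; try_load is a helper name I made up, and a negative return code means the child died from a signal (-11 is SIGSEGV on Linux/macOS):

```python
import subprocess
import sys

def try_load(model_path):
    """Construct the TFLite interpreter in a child process so a crash
    kills the child instead of the main script. Returns the child's
    exit code: 0 on success, nonzero on failure, negative if the child
    was killed by a signal (e.g. -11 for SIGSEGV)."""
    snippet = (
        "import tflite_runtime.interpreter as tflite\n"
        f"interp = tflite.Interpreter(model_path={model_path!r})\n"
        "interp.allocate_tensors()\n"
    )
    return subprocess.run([sys.executable, "-c", snippet]).returncode

print(try_load("ei-fabric-object-detection-tensorflow-lite-int8-quantized-model.lite"))
```

This way my main script keeps running and I can at least see which signal killed the load.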
While the ESP32S3 is not an officially supported board, you can still deploy the impulse as an Arduino library or C++ library. See this example for how to use the C++ library in ESP-IDF.
Thank you for your response! Sure, here are the details; I hope they help with debugging. I downloaded the trained model from the Dashboard on the Edge Impulse website, as shown in the figure below. Specifically, I downloaded both the TensorFlow Lite (float32) and (int8 quantized) versions. One detail I noticed is that the suffix of the downloaded file is .lite rather than .tflite; I'm not sure whether this causes the problem.
The code I used to import the model is just the snippet above, to help me narrow down the problem, but I have no clue yet. I also checked the version of the tflite_runtime package I have: it is tflite-runtime 2.5.0. Let me know if anything else would be helpful for debugging! Thank you!
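On the .lite vs .tflite question: the extension itself should not matter, since the interpreter only reads the bytes. A quick sanity check I can run (assuming the export is a standard TensorFlow Lite FlatBuffer) is to look for the ASCII identifier "TFL3" at byte offset 4, which is where FlatBuffers store the file identifier; looks_like_tflite is just a helper name for this sketch:

```python
def looks_like_tflite(path):
    """Return True if the file carries the TFLite FlatBuffer identifier.
    Valid .tflite/.lite models have the ASCII bytes b'TFL3' at offset 4."""
    with open(path, "rb") as f:
        header = f.read(8)
    return len(header) == 8 and header[4:8] == b"TFL3"
```

If this returns False for my exported file, the problem is the file itself rather than the extension or the interpreter version.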