Locally deployed model's bounding boxes (Python) don't match Model Testing results

hi Edge Impulse!

I’ve trained an object detection model (FOMO MobileNetV2 0.35). I downloaded it to my x86 Linux machine with
edge-impulse-linux-runner --download modelfile.eim, and I can process still images with python classify-image.py modelfile.eim image.jpg.

Given exactly the same image file, classify-image.py’s bounding boxes differ from the studio’s Model Testing results: the coordinates don’t match, and sometimes the number of bounding boxes doesn’t match either.
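
One thing I’ve started checking is whether the two sets of boxes are even in the same coordinate space: as I understand it, the runner reports bounding boxes in the model’s input resolution, so if the studio draws them on the original image the raw numbers wouldn’t line up. Here’s the rescaling helper I’ve been using to compare (scale_bbox is just my own function, not part of the SDK, and it assumes a plain resize with no cropping):

```python
def scale_bbox(bb, model_w, model_h, img_w, img_h):
    """Map a bounding box from model-input coordinates (e.g. 600x600)
    back to the original image's coordinate space.
    Assumes the image was resized without cropping."""
    sx = img_w / model_w
    sy = img_h / model_h
    return {
        **bb,  # keep label/value fields untouched
        'x': round(bb['x'] * sx),
        'y': round(bb['y'] * sy),
        'width': round(bb['width'] * sx),
        'height': round(bb['height'] * sy),
    }

# Example: a box from a 600x600 FOMO model mapped onto a 1200x900 source image
box = {'label': 'screw', 'value': 0.93, 'x': 150, 'y': 300, 'width': 60, 'height': 60}
print(scale_bbox(box, 600, 600, 1200, 900))
```

Even after accounting for that, the coordinates still don’t agree, which is why I suspect the quantization difference below.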

classify-image.py writes out the processed image, so I can tell that the preprocessing is working (grayscale, 600x600). I’ve also seen the related forum posts about int8 vs. float32 discrepancies.

In the studio, I found the Deployment > C++ Library > Optimizations dialog. I selected Unoptimized (float32), disabled the EON Compiler, and rebuilt – but it’s not clear to me how to propagate these changes to the downloaded modelfile.eim.

So my questions are:

  1. Why don’t the results match?

  2. Is the Python SDK a feasible route for running object detection on image files, or should I stick with C++? I don’t need to process a camera feed; I thought testing with image files would be easier.

  3. Is there any way with the Python SDK to determine whether the model is using int8 or float32?
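
For what it’s worth, on question 3 the closest I’ve gotten is dumping everything runner.init() returns and looking through it. I haven’t found a documented field that names the quantization, so the key names below are my guesses from poking at the model_info dict, and summarize_model is just my own helper:

```python
def summarize_model(info):
    # Pull a few fields out of the model_info dict that runner.init()
    # returns. These key names are my guesses from inspecting the dict,
    # not documented API.
    params = info.get('model_parameters', {})
    return {
        'input_size': (params.get('image_input_width'),
                       params.get('image_input_height')),
        'channels': params.get('image_channel_count'),
        'labels': params.get('labels'),
    }

# How I use it against the downloaded .eim (needs the edge_impulse_linux package):
# import json
# from edge_impulse_linux.image import ImageImpulseRunner
# with ImageImpulseRunner('modelfile.eim') as runner:
#     info = runner.init()
#     print(json.dumps(info, indent=2))  # scan for anything quantization-related
#     print(summarize_model(info))
```

If there’s an official field in model_info (or elsewhere) that reports int8 vs. float32, that would answer this directly.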


Project ID: