FOMO model and heatmap to centroid

Question/Issue: Exported a trained FOMO .h5 model for use in my own script. Is it functioning correctly, and how do I convert the heatmap output to centroids?

Project ID: 213240

Context/Use case: Hi! I have trained a FOMO object detection model to detect the absence of fuses on PCBs. I want to deploy this model on a Xilinx KRIA FPGA and quantise and compile the model using the Xilinx Vitis AI libraries, so I need the model in .h5 format. Natively, EI does not support exporting this format, but by saving the model to .h5 in expert mode and printing the binary file contents to the console, I can copy and save it to a local file.

The input of the model is a scaled-down grayscale image of 320x320, and the output is a 40x40x3 heatmap. I can run the model by importing the .h5 file in my own script, but I cannot figure out how to do the heatmap-to-centroid conversion on the heatmap outputs. Additionally, I have the feeling that there is a lot of noise on the output heatmap, so it is difficult to detect blobs, even though the live classification results online are very good!

Can you support me with the post-processing of the heatmap? Do you expect the exported .h5 model to behave exactly as it does in EI, or am I missing a step that is not public information?

Image of the model working in the online classifier:

Image of the raw image, scaled and grayed, heatmap output:


@matkelcey, you may be able to give more details on the post-processing.

yeah, we do some post processing to threshold detections and then fuse [1] adjacent ones to effectively generate rectangular bounding boxes when there are adjacent detections.

you can see the exact code for this by generating a standalone lib (even without training) and looking for ei_fill_result_struct.h; it has fill_result_struct_f32_fomo, which describes the iteration over the grid to fuse adjacent detections.

having said that, that heatmap you show here does look weird, and not just the fact that ( i think? ) it’s been transposed… it’s more than just fusing adjacent… was “red” => “fuse” and “green” => “no_fuse” ?


( [1] “fuse” adjacent detections; no pun intended with a project detecting fuses. )
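
For illustration, here is a rough Python sketch of this kind of post-processing: my own reconstruction, not Edge Impulse's actual code. The channel layout (channel 0 as background) and the bounding-box-centre convention are assumptions; check fill_result_struct_f32_fomo for the real logic.

```python
# Sketch of FOMO-style post-processing (assumed, not EI's exact code):
# threshold each non-background channel of the heatmap, fuse 4-adjacent
# cells into one blob, and report the centre of the blob's bounding box.
def heatmap_to_detections(heatmap, threshold=0.5):
    """heatmap: nested list [H][W][C]; channel 0 assumed to be background."""
    H, W, C = len(heatmap), len(heatmap[0]), len(heatmap[0][0])
    detections = []
    for c in range(1, C):  # skip the assumed background channel
        seen = [[False] * W for _ in range(H)]
        for y in range(H):
            for x in range(W):
                if heatmap[y][x][c] >= threshold and not seen[y][x]:
                    # flood fill to fuse adjacent above-threshold cells
                    stack, cells = [(y, x)], []
                    seen[y][x] = True
                    while stack:
                        cy, cx = stack.pop()
                        cells.append((cy, cx))
                        for ny, nx in ((cy-1, cx), (cy+1, cx),
                                       (cy, cx-1), (cy, cx+1)):
                            if (0 <= ny < H and 0 <= nx < W
                                    and heatmap[ny][nx][c] >= threshold
                                    and not seen[ny][nx]):
                                seen[ny][nx] = True
                                stack.append((ny, nx))
                    xs = [p[1] for p in cells]
                    ys = [p[0] for p in cells]
                    detections.append({
                        'label': c,
                        # bounding-box centre, in grid-cell units
                        'x': (min(xs) + max(xs) + 1) / 2,
                        'y': (min(ys) + max(ys) + 1) / 2,
                        'score': max(heatmap[p[0]][p[1]][c] for p in cells),
                    })
    return detections
```

Multiply the grid coordinates by the cell size (input width divided by grid width, here 320/40 = 8) to get pixel coordinates.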


Hi @matkelcey

Thanks for the information, I'll definitely have a look at the code you mentioned and try to implement the thresholding and fusing in my own script!

And indeed, I also have the feeling that the exported model does not behave the same as the one running online… I don't think it is transposed, because when a totally different image is shown, it can be recognised in the heatmap and follows the same directions when the image moves.

Would it make sense for you to replicate my way of exporting the model and use it, OR compare the model used in EI with the structure and values of the exported .h5 model? The export code that I used is added at the bottom of the expert mode training code:

model.save("model.h5")
with open('./model.h5', 'br') as f:
    print(f.read())

And the code used to write the binary h5 model values back to an h5 file:

graydata = b'\x89HDF\r\n\x1a...'  # copied from EI training output console
with open('./greymodel.h5', 'bw') as f:
    f.write(graydata)
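
As a sanity check on the written file (using the './greymodel.h5' path from above), one can verify the HDF5 signature with a stdlib-only snippet; the helper name is my own:

```python
# A valid HDF5 file starts with the 8-byte signature \x89HDF\r\n\x1a\n,
# so a quick check that the copied bytes were written intact:
HDF5_MAGIC = b'\x89HDF\r\n\x1a\n'

def looks_like_hdf5(path):
    # Read only the first 8 bytes and compare to the HDF5 signature
    with open(path, 'rb') as f:
        return f.read(8) == HDF5_MAGIC
```

If this returns False, the bytes were probably mangled while copying from the console (escape sequences are easy to corrupt).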

Best regards

I got the threshold detection and fusing working in Python, but the output is rubbish, probably because the heatmap is not right… Is it possible to check whether the exported model behaves correctly, as mentioned in my previous post? Let me know what information you need.

Additional question: you do threshold detection and fuse adjacent outputs to create bounding boxes, but the output of FOMO is actually a single point with a label and confidence, to be as lightweight as possible. How do you then convert to a point? Do you take the centroid of that bounding box?

Thanks in advance!

BR Jonas


I fixed the problem: I forgot to normalize the image colour values (from the range 0-255 to 0-1).
Now the model, threshold detection, and fusing all work perfectly!
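
For anyone hitting the same issue, here is a minimal sketch of the preprocessing fix. The array shapes match the 320x320 grayscale input described above, but the variable names and the commented-out model-loading call are illustrative, not my exact script:

```python
import numpy as np

# Hypothetical example input: a uint8 grayscale image in the range 0-255
image = np.random.randint(0, 256, size=(320, 320), dtype=np.uint8)

# The fix: normalise pixel values to 0-1 floats before inference
x = image.astype(np.float32) / 255.0

# Shape the input as the model expects: (batch, height, width, channels)
x = x.reshape(1, 320, 320, 1)

# model = tf.keras.models.load_model('greymodel.h5')
# heatmap = model.predict(x)   # (1, 40, 40, 3) output
```

Feeding raw 0-255 values into a model trained on 0-1 inputs is what produced the noisy, unusable heatmap.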

Best regards


Hello @JLannoo,

I’m glad you got it working.
If you wish (and have time) to provide a guide on how to do that, I am sure other people in the community would be interested. I'd also be happy to reference your guide.