Memory allocation tracking leak?

Hi, when looking at the output of EIDSP_TRACK_ALLOCATIONS in the inferencing SDK deployed in my project, I noticed that it seems to be reporting a memory leak. I haven't seen my program crash, so I don't think it's an actual leak, but after every call to run_classifier there are 236 bytes that remain unaccounted for: ei_memory_in_use increases by that amount on each call to run_classifier.
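
For reference, this is roughly how I'm measuring it (a trimmed-down sketch: my real signal setup is omitted, the input buffer is just a placeholder, and I'm assuming the ei_memory_in_use counter is visible once EIDSP_TRACK_ALLOCATIONS is defined for the build):

```cpp
#include <cstring>
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

// Placeholder input; in the real project this comes from the sensor.
static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

static int get_feature_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, features + offset, length * sizeof(float));
    return 0;
}

void log_memory_per_inference(void) {
    signal_t signal;
    signal.total_length = EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE;
    signal.get_data = &get_feature_data;

    ei_impulse_result_t result = { 0 };

    for (int i = 0; i < 5; i++) {
        // counter exposed by the SDK when EIDSP_TRACK_ALLOCATIONS is set
        size_t before = ei_memory_in_use;
        run_classifier(&signal, &result, false);
        // every iteration the delta comes out as 236 bytes
        ei_printf("in use: %lu bytes (delta %lu)\n",
                  (unsigned long)ei_memory_in_use,
                  (unsigned long)(ei_memory_in_use - before));
    }
}
```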

My project ID is 20721.

Hi @tennies, good call. This is actually because we allocate a 64-item matrix to do peak finding, but then resize it to the number of peaks found (e.g. 5). When freeing, this is not tracked correctly, making it look like (64 - 5) * 4 = 236 bytes are left dangling. The bytes are freed though, and we'll push a fix to the SDK next week!
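
For the curious, here's a simplified, hypothetical sketch of how that kind of bookkeeping mismatch produces a phantom leak (this is not the SDK code, just an illustration): the allocation is recorded at the full 64 items, but the free is recorded at the resized size, so the counter drifts by (64 - 5) * 4 = 236 bytes even though free() releases the whole buffer.

```cpp
// Hypothetical allocation tracker, only to illustrate the bookkeeping bug.
#include <cstdio>
#include <cstdlib>

static size_t memory_in_use = 0;

static float *tracked_alloc(size_t items) {
    memory_in_use += items * sizeof(float);   // record the full allocation
    return (float *)malloc(items * sizeof(float));
}

static void tracked_free(float *ptr, size_t items) {
    memory_in_use -= items * sizeof(float);   // record what the caller *thinks* the size is
    free(ptr);
}

int main(void) {
    const size_t allocated_items = 64;        // peak-finding buffer
    const size_t peaks_found = 5;             // logical size after the "resize"

    float *buf = tracked_alloc(allocated_items);
    // The matrix is logically resized to the number of peaks found, but the
    // tracker is never told about the original 64 items...
    tracked_free(buf, peaks_found);           // ...so only 5 * 4 bytes are subtracted

    // Prints 236: (64 - 5) * 4 bytes look leaked, although free() released everything.
    printf("memory_in_use = %zu\n", memory_in_use);
    return 0;
}
```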

Got it, thanks for looking at this!
