Changing decision threshold to maximise recall of rare events


I am trying to train a model on Edge Impulse to run on an AudioMoth acoustic sensor. Initially I want to train a binary model that classifies the presence or absence of rare gunshot events, so that the device records gunshots and discards non-gunshot sounds. Because these events are rare, I want to maximise recall at the cost of more false positives, since the aim is to analyse the recorded data more thoroughly afterwards. When training models in Python, I have achieved this by selecting the model with the best recall for gunshots and then choosing a decision threshold for the presence score that maximises the classification of gunshot events (I chose the threshold with the maximum F2 score).
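For reference, the threshold-selection step described above can be sketched as follows. This is a minimal, self-contained version (the scores and labels are made up for illustration, and F2 is computed by hand rather than via scikit-learn):

```python
import numpy as np

def f2_score(y_true, y_pred):
    """F-beta with beta=2, which weights recall four times as heavily as precision."""
    tp = int(np.sum((y_pred == 1) & (y_true == 1)))
    fp = int(np.sum((y_pred == 1) & (y_true == 0)))
    fn = int(np.sum((y_pred == 0) & (y_true == 1)))
    denom = 5 * tp + 4 * fn + fp  # (1 + b^2)*TP + b^2*FN + FP with b = 2
    return 5 * tp / denom if denom else 0.0

def best_f2_threshold(y_true, presence_scores):
    """Sweep candidate thresholds and return the (threshold, f2) pair with the best F2."""
    thresholds = np.linspace(0.05, 0.95, 19)
    return max(
        ((t, f2_score(y_true, (presence_scores >= t).astype(int))) for t in thresholds),
        key=lambda pair: pair[1],
    )

# Toy held-out set (1 = gunshot present) with hypothetical model scores:
y_true = np.array([0, 0, 0, 1, 1, 0, 1, 0])
scores = np.array([0.10, 0.40, 0.20, 0.35, 0.90, 0.60, 0.55, 0.05])
threshold, f2 = best_f2_threshold(y_true, scores)
```

Because F2 weights recall heavily, the selected threshold tends to sit low enough that all (or nearly all) gunshots clear it, at the cost of extra false positives, which is exactly the trade-off wanted here.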

I cannot see a way to do this in Edge Impulse. When I test the model there is an option to set the decision threshold, but when I do this I end up with a lot of classifications that contain both categories and are consequently not classified correctly. What I want is to focus only on the ‘presence’ score: anything above the threshold should be classified as present, rather than producing lots of ‘unknowns’ or lots of ‘1 gunshot + 1 background’ results. In reality every sound contains background, so I don’t care about that at all - I just want to know whether a gunshot appears.
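The single-score decision rule described here is simple to express in code. In this sketch the label name and threshold value are illustrative, not taken from any Edge Impulse output format:

```python
THRESHOLD = 0.3  # illustrative value; in practice tuned on held-out data (e.g. via F2)

def is_gunshot(scores, threshold=THRESHOLD):
    """Flag a window as 'present' based on the gunshot score alone,
    ignoring the background score entirely."""
    return scores.get("gunshot", 0.0) >= threshold

# Every real clip contains some background; that score is simply never consulted.
print(is_gunshot({"gunshot": 0.55, "background": 0.45}))  # True
print(is_gunshot({"gunshot": 0.10, "background": 0.90}))  # False
```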

Thank you for any advice you can offer on this!


Hi Lydia,

A good approach to optimizing your model would be to use our EON Tuner. Once it has completed, you can sort the candidate models by recall and select the best one for your application.

You can also lower the decision threshold in Model Testing, but this will increase detections of the other class as well. A single test sample can receive multiple labels if it is longer than your window size; this is expected, especially since a gunshot is a short event, so the subsequent windows will be classified as noise/unknown. Once you deploy the code on the AudioMoth, you can simply check whether the prediction is a gunshot; the other predictions won’t matter.
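The on-device check suggested above could be post-processed along these lines. This is a sketch only: the per-window score dictionaries and label names are hypothetical stand-ins for whatever the deployed EI classifier actually returns:

```python
GUNSHOT_THRESHOLD = 0.3  # illustrative value; tune on held-out data

def should_record(window_results, threshold=GUNSHOT_THRESHOLD):
    """Given per-window score dicts for one sample, trigger recording if
    any window's gunshot score clears the threshold; all other labels
    (background/noise/unknown) are simply ignored."""
    return any(w.get("gunshot", 0.0) >= threshold for w in window_results)

# Example: a 3 s clip split into 1 s windows - only the first window
# contains the gunshot, and the rest come back as background, as expected.
windows = [
    {"gunshot": 0.72, "background": 0.28},
    {"gunshot": 0.05, "background": 0.95},
    {"gunshot": 0.02, "background": 0.98},
]
print(should_record(windows))  # True
```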


Does changing the decision threshold in Model Testing also change the threshold in the deployed model?

@jmak14 For Edge Impulse audio models, the Set Confidence Thresholds option on the Studio Model Testing page does not affect the deployed model. To see which constants get deployed, check the file model_metadata.h in the exported EI library.