I noticed that in the Deployment tab of the latest version of EIS, the confusion matrices for the quantized and unoptimized models of the impulse are no longer available. While the confusion matrix of the unoptimized model can be retrieved in the Model testing tab, how can I get the one for the quantized model?
I don’t recall that we ever showed the confusion matrix on the Deployment tab.
You can see the confusion matrix for classification projects on the Classifier tab. This includes both the quantized (int8) and float32 versions of your model.
If I’m not mistaken, older versions of EIS showed, within the Deployment tab, the confusion matrices for both models (i.e., quantized and float32) evaluated on the test set.
Concerning your suggestion, the confusion matrices displayed on the Classifier tab refer to the models’ performance on the validation set, don’t they?
On the other hand, the confusion matrix on the Model testing tab does refer to the test set, but only for the float32 model, doesn’t it?
What if I want to assess the quantized model’s performance on the test set via the corresponding confusion matrix? Is that possible in EIS?
We have an open issue for adding quantized models to Model testing. The main reason we don’t support this right now is that the int8 kernels in the Python TensorFlow Lite runtime (which we use there) are unoptimized and incredibly slow, so model testing would also be very slow.
That being said, there are some active discussions internally to fix this.
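In the meantime, a possible workaround is to evaluate the quantized model outside of Studio: if you export the quantized .tflite file from your project, you can run each test-set sample through it (e.g. with `tf.lite.Interpreter`), take the argmax of the output as the predicted class, and tally the confusion matrix yourself. A minimal sketch of the tallying step, using placeholder label arrays where the real predictions would go:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, num_classes):
    """Tally a num_classes x num_classes confusion matrix.
    Rows are true labels, columns are predicted labels."""
    cm = np.zeros((num_classes, num_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

# Placeholder data: in practice y_pred would come from running each
# test-set sample through the downloaded quantized .tflite model
# and taking the argmax of its output tensor.
y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 1, 1, 1, 2, 0])

cm = confusion_matrix(y_true, y_pred, num_classes=3)
print(cm)
# Diagonal entries are correct predictions; off-diagonal are confusions.
```

This is just a sketch of the bookkeeping, not an official EIS workflow; the slow int8 kernels mentioned above still apply to local inference, but for a modest test set that is usually tolerable.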
I fully understand your point. However, I’m really looking forward to having this feature available in EIS again (I find it extremely handy and useful). Please let us know as soon as it becomes available once more.