I got a weird result after training my model. Normally I understand overfitting as the model being very good at predicting the training data but failing to generalize to unseen data (test data), because it fits the training data so closely. What do you call it when the model is better at the test data? Is this possible, or are these results likely from an error/mistake?
Additional information: the images are 160x160, and for this model I'm using transfer learning with MobileNetV2 96x96 0.35 (16 neurons and a 0.1 dropout rate). When I change the w*h in the neural architecture to 160x160, I do see a great improvement in accuracy on the training data set. The funny thing is that I have another model with 160x160 images that also uses the MobileNetV2 96x96, and it has great accuracy (project:216037).
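For reference, a minimal sketch of the kind of architecture described above, assuming a standard Keras setup: a frozen MobileNetV2 backbone at 96x96 with width multiplier 0.35, followed by a 16-neuron dense layer and 0.1 dropout. `NUM_CLASSES` is a placeholder, and `weights=None` is used here only to keep the sketch self-contained; in practice you would use the pretrained `"imagenet"` weights.

```python
import tensorflow as tf

NUM_CLASSES = 2  # hypothetical; set this to your number of labels

# Frozen MobileNetV2 backbone: 96x96 input, width multiplier (alpha) 0.35
base = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3),
    alpha=0.35,
    include_top=False,
    weights=None,  # in practice: weights="imagenet"
)
base.trainable = False  # freeze the pretrained layers for transfer learning

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(16, activation="relu"),  # the 16-neuron head
    tf.keras.layers.Dropout(0.1),                  # 0.1 dropout rate
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Note that with this backbone, 160x160 source images have to be resized down to 96x96 at the input, which may discard the fine detail (e.g. dust) you are trying to detect.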
This is related to the same problem you were facing in your last question; it's not a trivial task to detect these subtle differences.
The case you describe indicates that the test set might be easier for the model to predict than the training set, due to some systematic differences between the two. This could happen if the training set is more diverse or contains more challenging examples, while the test set is more representative of simpler cases.
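One quick sanity check for such systematic differences is to compare simple summary statistics between the splits; if mean brightness or contrast differ noticeably, the splits may come from different conditions (lighting, angle, part mix). A toy sketch with random stand-in arrays in place of real image data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-ins for real image arrays scaled to [0, 1]; here the "test" split
# is deliberately made slightly darker to illustrate the comparison.
train = rng.random((100, 160, 160, 3))
test = rng.random((30, 160, 160, 3)) * 0.9

# If these differ by much more than a few percent, investigate how the
# splits were collected before trusting the accuracy gap.
print("train mean/std:", train.mean(), train.std())
print("test  mean/std:", test.mean(), test.std())
```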
I suspect you will need thousands of well-curated images to train this on; as Louis suggested here, you may need to look into synthetic data generation.
Thanks for the feedback!
I would like to add that I made a new version of the project, and for some reason the percentage shown as the training accuracy is different from the one displayed on the confusion matrix. The value was 61.7%; I don't know what happened there.
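For what it's worth, the accuracy read off a confusion matrix is just the trace divided by the total count, and it is usually computed on a validation or test split rather than the training set, which can explain a mismatch with the reported training accuracy. A sketch with a hypothetical 2-class matrix:

```python
import numpy as np

# Hypothetical confusion matrix: rows = true class, columns = predicted class
cm = np.array([[50, 10],
               [13, 27]])

# Accuracy = correct predictions (diagonal) / all predictions
accuracy = np.trace(cm) / cm.sum()
print(round(accuracy * 100, 1))  # 77.0
```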
Additionally, I tried inference with the deployed version and made some useful observations. I believe this model is not actually looking at the white dust on the parts but at the geometry and orientation of the part. I will run a saliency map and Grad-CAM on a few images to check whether my observation is correct.
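For anyone curious about what Grad-CAM checks: the core step weights each channel of the last convolutional layer's activations by the spatially averaged gradient of the class score, sums the channels, and applies a ReLU, so the resulting heatmap highlights the regions driving the prediction. A toy numpy sketch of just that weighting step (random stand-ins replace the real activations and gradients, which would come from the model):

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, C = 5, 5, 8
activations = rng.random((H, W, C))         # stand-in for conv feature maps
gradients = rng.standard_normal((H, W, C))  # stand-in for d(class score)/d(A)

# One weight per channel: gradients averaged over the spatial dimensions
alpha = gradients.mean(axis=(0, 1))

# Weighted sum of channels, then ReLU, then normalize to [0, 1]
heatmap = np.maximum((activations * alpha).sum(axis=-1), 0.0)
heatmap /= heatmap.max() + 1e-8
print(heatmap.shape)  # (5, 5)
```

If the bright regions of the real heatmap track the part outline rather than the dusty areas, that would support the geometry/orientation hypothesis.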