How can we access the raw prediction values array using the eim or the C++ SDK? The runner.classify (Python) or run_classifier (C++) function only provides the label, timing, and scores. I want to access the raw values returned by the inferencing itself.
@naveen What do you mean by 'raw prediction values' here — e.g. unprocessed results for FOMO? One way to do it is to skip the run_classifier() function altogether and invoke the DSP / ML functions yourself.
E.g. for full TFLite you'll see float* out_data = interpreter->typed_output_tensor<float>(0); here, which gives you the output tensor.
Hi @janjongboom, thanks for the suggestions! I have solved the intermediate-layer access problem by modifying the model into a multi-output model, so the intermediate layer's outputs are now available at the top layer; I just need to read the top-layer outputs to get both. I will check whether interpreter->typed_output_tensor<float>(0) contains them. By the way, what should the flow be for calling run_inference? Should I directly pass the *fmatrix filled with the normalized, cropped RGB image data, since there are no other DSP blocks?