Retrieve raw prediction values

How can we access the raw prediction values array using the .eim model or the C++ SDK? The runner.classify (Python) or run_classifier (C++) function only provides the label, timing, and scores. I want to access the raw values returned by inferencing. For example:

```python
pred = model.predict([data])
```

@naveen What do you mean by 'raw prediction values' here? E.g. unprocessed results for FOMO? One way to do it is to skip the run_classifier() function altogether and invoke the DSP / ML functions yourself.
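To make the distinction concrete: a minimal NumPy sketch of raw model outputs (logits) versus the per-class scores a classifier wrapper typically reports. The values and label names here are hypothetical, and the assumption is a model whose final scores come from a softmax over the raw output tensor:

```python
import numpy as np

# Hypothetical raw output tensor (logits) from one inference run --
# the values before any label mapping or post-processing.
raw = np.array([1.2, -0.3, 3.1], dtype=np.float32)

# Assuming a softmax output layer, the per-class scores that a
# classifier wrapper reports are the softmaxed raw values:
scores = np.exp(raw - raw.max())
scores /= scores.sum()

labels = ["idle", "wave", "circle"]  # hypothetical label set
for label, s in zip(labels, scores):
    print(f"{label}: {s:.4f}")
```

Accessing `raw` directly (rather than `scores` and the winning `label`) is what "raw prediction values" means in this thread.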

For full TFLite you'll see `float* out_data = interpreter->typed_output_tensor<float>(0);` here, which gives you the output tensor.

That actually reminds me of your other question. If you have an EON-compiled model you can inspect the intermediate layers by hooking into the trained_model_invoke function in the model file: example-standalone-inferencing-linux/trained_model_compiled.cpp at d4b4dd61ca23d4d5e01e54286ed811ba2173cb13 · edgeimpulse/example-standalone-inferencing-linux · GitHub (there it runs to the end, but you could stop earlier and copy the intermediate results out) - not as straightforward as in TensorFlow / Python :slight_smile:
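The "stop earlier and copy the intermediate results out" idea can be illustrated with a tiny hand-rolled network (pure NumPy, hypothetical weights and shapes - this is not the Edge Impulse API, just the same pattern trained_model_invoke follows layer by layer):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4)).astype(np.float32)   # input features
W1 = rng.normal(size=(4, 8)).astype(np.float32)  # layer 1 weights
W2 = rng.normal(size=(8, 3)).astype(np.float32)  # layer 2 weights

# Layer 1: this is the point where the compiled invoke function would
# normally keep going; here we copy the intermediate tensor out first.
hidden = np.maximum(x @ W1, 0.0)   # ReLU activation
intermediate_copy = hidden.copy()  # "copy the intermediate results out"

# Layer 2: continue to the final output as the compiled model would.
logits = hidden @ W2

print(intermediate_copy.shape)  # (1, 8)
print(logits.shape)             # (1, 3)
```

In the compiled C++ model the same thing amounts to memcpy-ing a layer's scratch buffer before the next layer overwrites or consumes it.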


Hi @janjongboom, thanks for the suggestions! I have solved the intermediate-layer access problem by modifying the model into a multi-output model. Now the intermediate layer's outputs are available at the top layer, so I only need to read the top layer to get both. I will check whether interpreter->typed_output_tensor<float>(0) contains them. By the way, what should the flow be for calling run_inference? Should I directly pass the *fmatrix filled with the normalized RGB cropped image data, since there are no other signal processing (DSP) blocks?
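For context on the fmatrix question, here is a minimal NumPy sketch of what "normalized RGB cropped image data" flattened into a float feature buffer could look like. The crop size, the divide-by-255 normalization, and the row-major interleaved-RGB layout are all assumptions for illustration - check what scaling and layout your particular impulse's image block expects before filling the matrix:

```python
import numpy as np

# Hypothetical 3x3 RGB crop (uint8 pixel values 0..26).
crop = np.arange(3 * 3 * 3, dtype=np.uint8).reshape(3, 3, 3)

# Scale to [0, 1] floats and flatten to H*W*C -- the shape a flat
# feature matrix buffer holds (row-major, interleaved R, G, B).
features = (crop.astype(np.float32) / 255.0).ravel()

print(features.shape)  # (27,)
print(features.dtype)  # float32
```

A buffer like `features` is the kind of data a feature matrix would carry straight into inference when no DSP block sits in between.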