How to extract intermediate layer outputs?

I want to access hidden layers — is this possible when using the Edge Impulse implementation?
My device is an Arduino Nano 33 BLE Sense. I've trained a CNN model and exported a deployment zip package. By importing this zip package, I can only call the inferencing method (result.classification.value) and get the classification result.
But I don't know how to get the hidden layer values. My model has 2 Dense layers at the end, and I can only get the last Dense layer (which is also the inference result). How do I get the other Dense layer's values?

Hello @microa,

Good question. I usually do that using the tflite file, where I copy some extracted features when I need to test. Let me check with our embedded team if/where we can output intermediate layer values in our C++ inferencing SDK.


If you use the EON-compiled model, you can print the intermediate layer values by adding a small snippet to the top of tflite-model/tflite_learn_*_compiled.cpp, right after the #include statements.

You can look at the ei_printf code to see how to obtain the values if you need to use them for something, e.g. in the function TfLiteStatus tflite_learn_*_invoke().


Thanks! Maybe I should learn how to use tflite. If there's an SDK that can be called directly, please let me know.


Thanks. I didn't use EON, and it seems you are using tflite, so maybe I should figure out how to use tflite instead of the SDK.

For now, I revised my model to output the last 2 layers and set the loss function to focus only on the last layer, so the SDK outputs not only the last layer but also the layer before it.

That resolves my problem for now. However, if you want to get an arbitrary layer, this method won't help.
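A minimal Keras sketch of that trick (the layer sizes and names here are hypothetical, not the actual project model): declare both Dense layers as model outputs, but attach a loss only to the final one, so training is unchanged while the exported model exposes both tensors.

```python
import numpy as np
import tensorflow as tf

# Hypothetical small CNN, standing in for the actual Edge Impulse model.
inputs = tf.keras.Input(shape=(32, 32, 1))
x = tf.keras.layers.Conv2D(8, 3, activation="relu")(inputs)
x = tf.keras.layers.Flatten()(x)
hidden = tf.keras.layers.Dense(16, activation="relu", name="hidden_dense")(x)
probs = tf.keras.layers.Dense(3, activation="softmax", name="class_output")(hidden)

# Two outputs, but a None loss on the hidden output, so only the final
# classifier output drives training.
model = tf.keras.Model(inputs, [hidden, probs])
model.compile(optimizer="adam", loss=[None, "categorical_crossentropy"])

hidden_out, class_out = model.predict(np.zeros((1, 32, 32, 1), dtype=np.float32))
print(hidden_out.shape, class_out.shape)  # (1, 16) (1, 3)
```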

I have an urgent project to deal with, so let me leave it here. I'll explore this again and share results when I'm not as busy as now.

All in all, thanks very much to @louis and @AIWintermuteAI for the help.

Extracting hidden layer outputs directly from the Edge Impulse libraries isn't currently supported. You trained a Convolutional Neural Network (CNN) with Edge Impulse, and the deployment package only provides the final classification result.

There’s a workaround though! You can access these hidden layer values during development on your computer. Here’s the idea:

1- Extract the TFLite model file from your deployment package.
2- Use libraries like TensorFlow on your computer to load the model and write code to capture the outputs of the desired hidden layer.
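A sketch of step 2 using the TensorFlow Lite Interpreter. It builds a tiny stand-in model here only so the snippet is self-contained; in practice you would load the .tflite file extracted from the deployment zip via `model_path=` instead.

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model; replace this build-and-convert step by loading the
# .tflite file from your Edge Impulse deployment zip.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu", name="hidden_dense"),
    tf.keras.layers.Dense(3, activation="softmax", name="output_dense"),
])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# experimental_preserve_all_tensors keeps intermediate tensors readable after
# invoke(); by default the runtime may reuse their memory.
interpreter = tf.lite.Interpreter(
    model_content=tflite_bytes,  # or model_path="your_model.tflite"
    experimental_preserve_all_tensors=True,
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.float32))
interpreter.invoke()

# Every tensor in the graph is now inspectable, not just the model outputs.
for t in interpreter.get_tensor_details():
    print(t["index"], t["name"], tuple(t["shape"]))
# Read any hidden tensor by its index: interpreter.get_tensor(index)
```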

This won’t work on your Arduino Nano directly, but it helps analyze your model’s inner workings on your development machine.


Hi @hobertbranch,

Thanks for the workaround! Something else that might be worth trying is downloading the “TensorFlow SavedModel” file from the Dashboard in your project. You can then load that into TensorFlow (e.g. on your computer or in Colab) and load/inspect/modify layers as you wish. I do exactly that to create a GradCAM visualization demo here: computer-vision-with-embedded-machine-learning/2.3.1 - CNN Visualizations/ei_saliency_and_grad_cam.ipynb at master · ShawnHymel/computer-vision-with-embedded-machine-learning · GitHub
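A sketch of pulling a hidden layer out of a loaded Keras model. The build step below is a stand-in for `tf.keras.models.load_model(...)` on the downloaded SavedModel, and the layer name "hidden_dense" is hypothetical — call `model.summary()` to find the real layer names in your project.

```python
import numpy as np
import tensorflow as tf

# Stand-in for the SavedModel downloaded from the Edge Impulse Dashboard;
# in practice: model = tf.keras.models.load_model("path/to/saved_model")
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu", name="hidden_dense"),
    tf.keras.layers.Dense(3, activation="softmax", name="output_dense"),
])

# Wrap the loaded model so it returns a hidden layer instead of the final one;
# the wrapper shares the original model's weights.
feature_model = tf.keras.Model(
    inputs=model.inputs,
    outputs=model.get_layer("hidden_dense").output,
)
hidden = feature_model.predict(np.zeros((1, 4), dtype=np.float32))
print(hidden.shape)  # (1, 8): activations of the hidden Dense layer
```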