Calculating RAM Usage


I want to calculate the RAM usage of my models after their deployment on the hardware (Nano 33 BLE Sense). However, I'm not sure how to…
I am using the above method to trace the memory. However, I get 4 values…

Current Heap
Reserved Heap

Max Size stack
Reserved stack

Which of these values would actually relate to my current RAM consumption? For example, on Edge Impulse I'm getting an estimate of 16 KB… so which of these values would add up to something similar to that, or to the true usage?
I know the heap is somehow related to dynamically allocated values and the stack to locals… but I don't know much!

I am really looking forward to some answers…

Hello @rida,

Can you share your custom code where you print the memory info?



I’m not using any custom code… I’m just trying it on the audio classification example (not continuous).
I’m printing the memory info before and after inference starts… then before and after running the classifier. The readings are constant, around 43 KB, for all the models… and at every point in the code.
I’m assuming that’s just the heap assigned to the recorded raw samples…

The NN classifier frees all the heap once it classifies the sample. So I can't track it…

Only the TF arena size is around 13 KB according to the .cpp file of the trained model… (which is close to the 16 KB estimate).

The NN classifier frees all the heap once it classifies the sample. So I can't track it…

Indeed, you’ll need to check the memory from inside these functions. It depends on how you compile the SDK (EON compiler, quantized version, unoptimized version).

Search for this file in the SDK; it will give you a better understanding:

and then, for the non-continuous example, search for the function run_nn_inference.

You can also check the DSP if you’d like.



One other question… when code uploads to the Arduino and it says, for example,
"Global variables use 39% of the dynamic memory"… does that already include the tensor arena? Or is it only allocated when running the classifier?