I’m very new to the Edge AI world and still figuring out the basics. I trained an image classification model that I want to run on my Arduino Nano 33, but this board obviously has no built-in camera.
I wondered if there’s a way to send image data from my laptop to the Nano 33 BLE Sense. If yes, how do I go about this? Otherwise, is there a way to store a few images in header files on my microcontroller and run inference on these locally stored images (instead of trying to read sensor values)?
I would really appreciate it if you could help me out!
Hi! The idea is that I run the algorithm not by using an image sensor, but by sending a picture from my laptop to the Nano BLE (or having the picture on my Arduino locally). Is this possible?
@anon2617517
The Edge Impulse library that you can auto-generate usually does not expect an image directly, but instead uses a raw C array. Why not simply send the image as an array via UART to the Nano?
Thanks a lot for the help! I was not aware of the option to manually import the library and use the “static_buffer” example. I got my network working now.