Running inference on pre-made data

Hi All!

I’m very new to the Edge AI world, and am still figuring out the basics. I trained an image classification model for my Arduino Nano 33, but this board obviously has no built-in camera.

I wondered if there’s a way to send image data from my laptop to the Nano 33 BLE Sense. If yes, how do I go about this? Otherwise, is there a way to store a few images in header files on my microcontroller and run inference on these locally stored images (instead of reading sensor values)?

I would really appreciate it if you could help me out!

Hi @anon2617517! Luckily, our Nano 33 tutorial shows you how to connect the OV7675 image sensor: https://docs.edgeimpulse.com/docs/arduino-nano-33-ble-sense. When you select the Arduino library on the Deployment page and expand the .zip file, there will be an example sketch ready to go for this sensor. You must of course first train an image classification project; for that you can follow this guide: https://docs.edgeimpulse.com/docs/image-classification. Hope this helps!

Hi! The idea is that I run the model not with an image sensor, but by sending a picture from my laptop to the Nano BLE (or storing the picture locally on my Arduino). Is this possible?

@anon2617517
The Edge Impulse library that you can auto-generate usually does not expect an image directly, but rather a raw C array. Why not simply send the image as an array via UART to the Nano?
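For context, here is a rough sketch of what that raw C array looks like. The generated Arduino library ships a static_buffer example that consumes exactly this format, so storing an image locally amounts to pasting its raw features into the sketch. This assumes the library header is named `my_project_inferencing.h` (the actual name depends on your Edge Impulse project name), and the pixel values shown are placeholders:

```cpp
#include <my_project_inferencing.h>  // name depends on your project

// One image's raw features, pasted from the "Raw features" field of a
// sample in Edge Impulse Studio (for images: one packed value per pixel)
static const float features[] = {
    0x383c40, 0x393d41, 0x3a3e42  // ...rest of the pixel values
};

// Callback the classifier uses to pull slices out of the buffer
static int raw_feature_get_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, features + offset, length * sizeof(float));
    return 0;
}

void setup() {
    Serial.begin(115200);
}

void loop() {
    signal_t signal;
    signal.total_length = sizeof(features) / sizeof(features[0]);
    signal.get_data = &raw_feature_get_data;

    ei_impulse_result_t result = { 0 };
    if (run_classifier(&signal, &result, false) == EI_IMPULSE_OK) {
        for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
            Serial.print(result.classification[i].label);
            Serial.print(": ");
            Serial.println(result.classification[i].value, 5);
        }
    }
    delay(1000);  // classify the stored image once per second
}
```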


@anon2617517,

Yes, I would go with @Lukas's solution. You can start by implementing your UART receiver on top of the static inference ("static_buffer") Arduino example, something like the sketch below.
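A minimal sketch of that idea, assuming the laptop sends one frame as EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE comma-separated decimal values over the serial port (the framing, baud rate, and error handling are placeholders you will want to adapt):

```cpp
#include <my_project_inferencing.h>  // name depends on your project

// Buffer for one frame of raw features received over UART
static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

static int raw_feature_get_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, features + offset, length * sizeof(float));
    return 0;
}

void setup() {
    Serial.begin(115200);
}

void loop() {
    // Fill the buffer with comma-separated values sent by the laptop,
    // e.g. "3685440,3751233,..." -- one value per pixel
    for (size_t ix = 0; ix < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; ix++) {
        while (!Serial.available()) { /* wait for the next value */ }
        features[ix] = Serial.parseFloat();
    }

    signal_t signal;
    signal.total_length = EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE;
    signal.get_data = &raw_feature_get_data;

    ei_impulse_result_t result = { 0 };
    if (run_classifier(&signal, &result, false) == EI_IMPULSE_OK) {
        for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
            Serial.print(result.classification[i].label);
            Serial.print(": ");
            Serial.println(result.classification[i].value, 5);
        }
    }
}
```

On the laptop side, any serial tool or script that writes the pixel values of a resized image as comma-separated numbers will do.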

Regards,

Louis

Thanks a lot for the help! :smile: I was not aware of the option to manually import the library and use the “static_buffer” example. I’ve got my network working now.
