Yes, I got the program to compile without errors and was able to download the static_buffer example to the ESP32 CAM.
Thanks!!
Now I need to figure out how to run the model that I created!
I created my image object recognition application, which I exported to the Arduino for the ESP32CAM,
So how do I get my data into the application?
When I run the application, I get the following message in the Arduino IDE Serial Port Monitor:
Edge Impulse standalone inferencing (Arduino)
The size of your ‘features’ array is not correct. Expected 9216 items, but had 0
Thanks…
You need to fill the features[] array with your image RGB values.
I’d suggest proceeding with the following steps:
First, paste the raw features of a test image from your Edge Impulse project into the features array and check that the inference result matches what you see in Studio.
Once it works, you can proceed with filling the features array directly from your ESP32Cam (you should capture a 96x96 pixel image) and check the results for 1 image.
The final step is to fill the features array dynamically in loop() so your inference can run continuously.
Aurelien
@tcontrada After you’ve managed to get it working based on @aurel’s comments, the preferred way of getting data in is by implementing the signal.get_data callback:
int raw_feature_get_data(size_t offset, size_t length, float *out_ptr) {
// Here read `length` bytes from offset `offset` from the camera's *frame buffer*
// then copy into `out_ptr` in the format from Studio
// (1 pixel per value like 0xFF0000 for red)
return 0;
}
signal_t features_signal;
features_signal.total_length = EI_CLASSIFIER_INPUT_WIDTH * EI_CLASSIFIER_INPUT_HEIGHT;
features_signal.get_data = &raw_feature_get_data;
This way you don’t allocate another large buffer, but can stream straight from frame buffer => neural network.
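For reference, the signal is then handed to the classifier. A hedged sketch, assuming the standard entry points from the Arduino library exported from Studio (run_classifier, ei_impulse_result_t, ei_printf; the header name below is a placeholder that varies per project):

```cpp
#include <your_project_inferencing.h>  // placeholder: your exported library header

void classify_frame() {
    signal_t features_signal;
    features_signal.total_length = EI_CLASSIFIER_INPUT_WIDTH * EI_CLASSIFIER_INPUT_HEIGHT;
    features_signal.get_data = &raw_feature_get_data;

    // Run the impulse; `false` disables per-layer debug output.
    ei_impulse_result_t result = { 0 };
    EI_IMPULSE_ERROR err = run_classifier(&features_signal, &result, false);
    if (err != EI_IMPULSE_OK) {
        ei_printf("ERR: run_classifier returned %d\n", err);
        return;
    }

    // Print one score per label.
    for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
        ei_printf("%s: %.5f\n", result.classification[i].label,
                  result.classification[i].value);
    }
}
```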
Well, call me stupid, but I am not able to understand what you want me to do!
Can you be more specific?
Thanks…
In the Arduino sketch, you have the following array assignment:
static const float features[] = { ... }
It needs to be filled with the pixels from the image you want to classify. First step is just to make sure it works, by pasting an image example directly from your Edge Impulse project. Then you can use the ESP32Cam method to capture a frame (probably with esp_camera_fb_get() but I’m not familiar with this target).
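For the capture step, a rough sketch of grabbing a frame with the esp32-camera driver (untested here, and it assumes the camera has already been initialised with esp_camera_init(); field names per the esp32-camera headers):

```cpp
#include "esp_camera.h"

// Grab one frame from the (already initialised) camera.
// Returns the frame buffer, or NULL on failure.
camera_fb_t *capture_frame() {
    camera_fb_t *fb = esp_camera_fb_get();
    if (!fb) {
        return NULL;
    }
    // fb->buf / fb->len hold the pixel data, fb->width / fb->height
    // the dimensions, fb->format the pixel format (e.g. PIXFORMAT_RGB565).
    return fb;
}

// When you are done with the frame, hand the buffer back to the driver:
//   esp_camera_fb_return(fb);
```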
Aurelien
I filled the array with a random example from my project.
I get the following debug output:
ets Jun 8 2016 00:22:57
rst:0x1 (POWERON_RESET),boot:0x13 (SPI_FAST_FLASH_BOOT)
configsip: 0, SPIWP:0xee
clk_drv:0x00,q_drv:0x00,d_drv:0x00,cs0_drv:0x00,hd_drv:0x00,wp_drv:0x00
mode:DIO, clock div:1
load:0x3fff0018,len:4
load:0x3fff001c,len:1216
ho 0 tail 12 room 4
load:0x40078000,len:9720
ho 0 tail 12 room 4
load:0x40080400,len:6352
entry 0x400806b8
Edge Impulse Inferencing Demo
Edge Impulse standalone inferencing (Arduino)
Guru Meditation Error: Core 1 panic’ed (StoreProhibited). Exception was unhandled.
Core 1 register dump:
PC : 0x400ec2b2 PS : 0x00060730 A0 : 0x800ee1cd A1 : 0x3ffb1e10
A2 : 0x3ffc647c A3 : 0x00000040 A4 : 0x00045810 A5 : 0x00000001
A6 : 0x00000040 A7 : 0x400ec254 A8 : 0x00044f98 A9 : 0x00037230
A10 : 0x00000002 A11 : 0x00000002 A12 : 0x00000008 A13 : 0x3ffb1e64
A14 : 0x00000004 A15 : 0x00000002 SAR : 0x0000000a EXCCAUSE: 0x0000001d
EXCVADDR: 0x00045810 LBEG : 0x4000c2e0 LEND : 0x4000c2f6 LCOUNT : 0xffffffff
Backtrace: 0x400ec2b2:0x3ffb1e10 0x400ee1ca:0x3ffb1e30 0x400ec14d:0x3ffb1e90 0x400eb925:0x3ffb1ec0 0x400ebbfe:0x3ffb1f00 0x400ebe81:0x3ffb1f50 0x400e6931:0x3ffb1fb0 0x4008a53d:0x3ffb1fd0
It looks like the program tries to access some prohibited memory space but you’ll need to decode the backtrace to understand where the issue comes from. This plugin seems to do the work: https://github.com/me-no-dev/EspExceptionDecoder
Also what is the RAM and Flash capacity on your board?
Aurelien
I’ve ordered an ESP32-CAM board to try this out - hopefully that will work faster.
In the Huge APP partition configuration for the ESP32 CAM, max flash is 3,145,728 bytes (~3 MB) and max RAM is 327,680 bytes (~320 KB).
Is this enough to run the application? The CAM board has a micro SD card slot available as well…
Great Idea!!
When you load the ESP32 board files, there is a sample application, CameraWebServer; try that out first to make sure everything works.
Thanks!
Oh, you will also need a USB-to-serial adapter, like an FTDI board. Set the voltage switch to 5V.
See this link: https://randomnerdtutorials.com/program-upload-code-esp32-cam/
Yeah that should be fine - will check if I have the board in!
Hi, any update on the ESP32 CAM?
@tcontrada I have the board laying around but haven’t gotten to use it yet. In the meantime we had a few users that have used the ESP32 CAM, incl. here: ESP32WROOM or ESP8266EX Node MCU Support and I’ve written an example to create a signal from a RGB565 frame buffer (which I think the ESP32 CAM uses): https://github.com/edgeimpulse/example-signal-from-rgb565-frame-buffer
@janjongboom I just received an ESP32 cam. I’m trying to learn how the data is classified. Does the software support classification of live streaming data or is it just classifying single pictures/frames?
Frame-by-frame, I don’t really get how you would do live streaming of the data. The buffer should remain the same until classification is done, then you can read from the camera again.
Good point, I didn’t think of that. I lack that basic understanding on how data is read from the camera and how that data is structured which I’m currently researching.
I also read your GitHub example. So I need to convert the RGB565 to RGB888, then I can feed it into the static buffer?
@electronictechgreg Yes, just use the example repo to turn the RGB565 frame buffer into a signal_t structure. You then ask the classifier to classify the signal. Then you get predictions, and you can take the next photo!
I watched your video on Adding sight to your sensors. When I was talking about streaming what I meant was live classification like you did in the video. So that answered that question for me.
As for the ESP32 cam, apparently I can output in RGB888 format using the fmt2rgb888 function from the esp_camera.h library, then convert it to a signal_t structure and feed it into the classifier. I think I got it.
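That approach can be sketched roughly as follows (hedged, untested: fmt2rgb888 is declared in img_converters.h in the esp32-camera component, the helper name is mine, and the channel order of the converted buffer may need checking on your driver version):

```cpp
#include "esp_camera.h"
#include "img_converters.h"  // fmt2rgb888()

// Rough sketch: capture a frame, convert it to RGB888, then pack it
// into the features buffer (3 bytes per pixel after conversion).
// `rgb888` must be a caller-provided scratch buffer of n_pixels * 3 bytes.
bool capture_to_features(float *features, size_t n_pixels, uint8_t *rgb888) {
    camera_fb_t *fb = esp_camera_fb_get();
    if (!fb) return false;

    bool ok = fmt2rgb888(fb->buf, fb->len, fb->format, rgb888);
    esp_camera_fb_return(fb);  // frame no longer needed after conversion
    if (!ok) return false;

    for (size_t i = 0; i < n_pixels; i++) {
        uint8_t r = rgb888[3 * i + 0];  // check R/B order on your driver
        uint8_t g = rgb888[3 * i + 1];
        uint8_t b = rgb888[3 * i + 2];
        features[i] = (float)(((uint32_t)r << 16) | ((uint32_t)g << 8) | b);
    }
    return true;
}
```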