FOMO on ESP-EYE - Error at trained_model_invoke()

Question/Issue:
I have a FOMO project on an ESP-EYE. I have cloned the Edge Impulse firmware for ESP32 and replaced the trained model with my own C++ library exported from Edge Impulse Studio.
When running the program, it fails at trained_model_invoke(). I'm kind of at a loss as to why this is happening. I gather from the console output that it must be some kind of memory error, but I'm not sure how to proceed.
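
Since the crash looks memory-related, one thing I was considering is printing the free heap and the largest free block right before the impulse starts, to see whether the tensor arena allocation might be failing. This is only a sketch (the helper name is my own); it uses the standard ESP-IDF heap_caps API and could be called from app_main() before ei_start_impulse():

#include "esp_heap_caps.h"

// Sketch: print free internal RAM and PSRAM plus the largest contiguous block,
// to check whether the model's tensor arena could fail to allocate.
static void print_heap_stats(void)
{
    ei_printf("Internal heap: %u bytes free, largest block %u bytes\r\n",
              (unsigned)heap_caps_get_free_size(MALLOC_CAP_INTERNAL),
              (unsigned)heap_caps_get_largest_free_block(MALLOC_CAP_INTERNAL));
    ei_printf("PSRAM:         %u bytes free, largest block %u bytes\r\n",
              (unsigned)heap_caps_get_free_size(MALLOC_CAP_SPIRAM),
              (unsigned)heap_caps_get_largest_free_block(MALLOC_CAP_SPIRAM));
}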

The prebuilt binary from Edge Impulse Studio works just fine.

Hoping that there are some brighter minds in here willing to chip in.

Any help much appreciated!

Side note:
The model is for counting chickens in my henhouse :slight_smile:

main.cpp:

#include "driver/gpio.h"
#include "sdkconfig.h"

#include <stdio.h>

#include "ei_device_espressif_esp32.h"

#include "ei_at_handlers.h"
#include "ei_classifier_porting.h"
#include "ei_run_impulse.h"

#include "ei_analogsensor.h"
#include "ei_inertial_sensor.h"

EiDeviceInfo *EiDevInfo = dynamic_cast<EiDeviceInfo *>(EiDeviceESP32::get_device());

/* Private variables ------------------------------------------------------- */

/* Public functions -------------------------------------------------------- */

extern "C" int app_main()
{

    EiDeviceESP32 *dev = static_cast<EiDeviceESP32 *>(EiDeviceESP32::get_device());

    ei_printf(
        "Hello from Edge Impulse Device SDK.\r\n"
        "Compiled on %s %s\r\n",
        __DATE__,
        __TIME__);

    ei_start_impulse(false, false, false);
    ei_run_impulse();

    while (1)
    {
    }
}

Output from console:

Hello from Edge Impulse Device SDK.
Compiled on Oct 25 2022 10:02:51
Init camera OK!
Inferencing settings:
        Image resolution: 240x240
        Frame size: 57600
        No. of classes: 1
Starting inferencing in 2 seconds...
Taking photo...
Guru Meditation Error: Core  0 panic'ed (InstrFetchProhibited). Exception was unhandled.

Core  0 register dump:
PC      : 0x00000000  PS      : 0x00060330  A0      : 0x800e370c  A1      : 0x3ffbd370
A2      : 0x00000000  A3      : 0x3ffb6334  A4      : 0x3ffb6334  A5      : 0x3ffb750c  
A6      : 0x00000004  A7      : 0x00000001  A8      : 0x800ed40c  A9      : 0x3ffbd350
A10     : 0x3ffb750c  A11     : 0x3f4127e0  A12     : 0x3f4132d8  A13     : 0x00000009  
A14     : 0x40000000  A15     : 0x00000000  SAR     : 0x00000017  EXCCAUSE: 0x00000014
EXCVADDR: 0x00000000  LBEG    : 0x40087930  LEND    : 0x4008794c  LCOUNT  : 0x00000000  
0x40087930: memcpy at /builds/idf/crosstool-NG/.build/HOST-x86_64-w64-mingw32/xtensa-esp3

0x4008794c: memcpy at /builds/idf/crosstool-NG/.build/HOST-x86_64-w64-mingw32/xtensa-esp3



Backtrace:0xfffffffd:0x3ffbd3700x400e3709:0x3ffbd460 0x400dc6bb:0x3ffbd480 0x400dd0c0:0x32d8:0x3ffbd8b0 0x40115e33:0x3ffbd8d0
0x400e3709: trained_model_invoke() at C:/Users/mathi/firmware-espressif-esp32/tflite-mode

0x400dc6bb: _ZL20inference_tflite_runPK10ei_impulseyP12TfLiteTensorS3_S3_PhP19ei_impulse_dk/classifier/inferencing_engines/tflite_eon.h:128

0x400dd0c0: run_nn_inference_image_quantized(ei_impulse const*, ei::ei_signal_t*, ei_impuclassifier/inferencing_engines/tflite_eon.h:337

0x400dd2f5: process_impulse at C:/Users/mathi/firmware-espressif-esp32/edge-impulse-sdk/c
 (inlined by) run_classifier_image_quantized at C:/Users/mathi/firmware-espressif-esp32/e
 (inlined by) process_impulse at C:/Users/mathi/firmware-espressif-esp32/edge-impulse-sdk

0x400dd831: ei_run_impulse() at C:/Users/mathi/firmware-espressif-esp32/edge-impulse-sdk/
 (inlined by) ei_run_impulse() at C:/Users/mathi/firmware-espressif-esp32/edge-impulse/in

0x400ddb91: ei_start_impulse(bool, bool, bool) at C:/Users/mathi/firmware-espressif-esp32

0x400d72d8: app_main at C:/Users/mathi/firmware-espressif-esp32/main/main.cpp:68

0x40115e33: main_task at C:/Users/mathi/esp/esp-idf/components/freertos/port/port_common.





ELF file SHA256: 09570ae4ce3af48c

Regards
Mathias

Update:
For the heck of it, I retrained the model, exported the C++ library from Edge Impulse Studio once again, and copied the new library files into the project. After recompiling, everything works flawlessly.
I'm not quite sure what went wrong, but the problem is gone now.
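
In case anyone else runs into this: my best guess is that a stale or partial copy of the exported library was still being compiled in. As a sanity check I've since added a print of the model metadata at boot, so a mismatch between Studio and the firmware shows up right away. This is just a sketch using the macros I see in the exported model-parameters/model_metadata.h; the include goes at the top of main.cpp and the prints into app_main() right after the banner:

#include "model-parameters/model_metadata.h"

// Sketch: print which model actually got linked into the firmware,
// so a stale copy of the exported C++ library is easy to spot.
ei_printf("Model: %s (deploy version %d)\r\n",
          EI_CLASSIFIER_PROJECT_NAME,
          EI_CLASSIFIER_PROJECT_DEPLOY_VERSION);
ei_printf("Input: %dx%d, labels: %d\r\n",
          EI_CLASSIFIER_INPUT_WIDTH,
          EI_CLASSIFIER_INPUT_HEIGHT,
          EI_CLASSIFIER_LABEL_COUNT);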

Mathias