Question/Issue:
I’m trying to deploy my visual anomaly detection model as a Linux (AARCH64) build so that I can run it on my Raspberry Pi 5, but the deployment fails repeatedly. I’ve also tried running “edge-impulse-linux-runner” on the Pi and building/downloading the model that way, but I hit the same issue.
Here is the deployment error message I get from Edge Impulse:
"Creating job… OK (ID: 41458997)
Job scheduled at 09 Dec 2025 07:27:54
Job started at 09 Dec 2025 07:27:57
Writing templates…
Job scheduled at 09 Dec 2025 07:28:10
Job started at 09 Dec 2025 07:28:21
Exporting TensorFlow Lite model…
Exporting TensorFlow Lite model OK
Removing clutter…
Removing clutter OK
Copying output…
Copying output OK
Job scheduled at 09 Dec 2025 07:28:27
Job started at 09 Dec 2025 07:28:28
Building binary…
aarch64-linux-gnu-g++ -Wall -g -Wno-strict-aliasing -I. -Isource -Imodel-parameters -Itflite-model -Ithird_party/ -Iutils/ -Os -DNDEBUG -DINCBIN_SILENCE_BITCODE_WARNING -DSILENCE_EI_CLASSFIER_OBJECT_DETECTION_COUNT_WARNING=1 -g -DEI_CLASSIFIER_USE_FULL_TFLITE=1 -Iedge-impulse-sdk/tensorflow-lite -DDISABLEFLOAT16 -std=c++17 -c source/main.cpp -o source/main.o
In file included from ./tensorflow-lite/tensorflow/lite/core/subgraph.h:39,
from ./tensorflow-lite/tensorflow/lite/core/async/async_subgraph.h:27,
from ./tensorflow-lite/tensorflow/lite/core/async/async_signature_runner.h:24,
from ./tensorflow-lite/tensorflow/lite/core/interpreter.h:45,
from ./tensorflow-lite/tensorflow/lite/interpreter.h:21,
from ./edge-impulse-sdk/classifier/inferencing_engines/tflite_full.h:45,
from ./edge-impulse-sdk/classifier/ei_run_classifier.h:65,
from source/main.cpp:45:
./tensorflow-lite/tensorflow/lite/graph_info.h:125:1: warning: multi-line comment [-Wcomment]
// /------------
^
./tensorflow-lite/tensorflow/lite/graph_info.h:137:1: warning: multi-line comment [-Wcomment]
// /------------
^
In file included from ./edge-impulse-sdk/classifier/ei_run_classifier.h:95,
from source/main.cpp:45:
./model-parameters/model_variables.h:147:18: error: ‘undefined’ was not declared in this scope
.threshold = undefined,
^~~~~~~~~
./model-parameters/model_variables.h:147:18: note: suggested alternative: ‘unsigned’
.threshold = undefined,
^~~~~~~~~
unsigned
source/main.cpp: In function ‘void json_send_classification_response(int, uint64_t, uint64_t, uint64_t, EI_IMPULSE_ERROR, ei_impulse_result_t*, bool, char*, size_t)’:
source/main.cpp:381:23: warning: comparison of integer expressions of different signedness: ‘int’ and ‘size_t’ {aka ‘long unsigned int’} [-Wsign-compare]
if (bytes_written > resp_buffer_size) {
~~~^
Exporting TensorFlow Lite model OK
Removing clutter…
Removing clutter OK
Copying output…
Copying output OK
make: *** [Makefile:234: source/main.o] Error 1
Application exited with code 2
Creating deployment failed
Application exited with code 1
Job failed (see above)"
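Reading the log, the failure itself is the designated initializer at ./model-parameters/model_variables.h:147: the code generator has emitted the bare identifier undefined as the anomaly threshold, and since undefined is not a C++ keyword or any declared name, g++ rejects it and make fails on source/main.o (the -Wcomment and -Wsign-compare messages are only warnings). For reference, here is a rough sketch of what I assume that part of the generated header looks like; the struct name is my guess and the numeric value is just a placeholder, so this is not the real generated code:

```cpp
// Sketch of what I assume ./model-parameters/model_variables.h contains
// around line 147 (simplified, assumed struct name and layout; the real
// generated header is much larger).
struct ei_anomaly_config_t {
    float threshold;   // anomaly score threshold used at inference time
};

// The generated file currently has:
//
//     .threshold = undefined,
//
// 'undefined' is not a C++ keyword or any declared identifier, so g++ stops
// with "error: 'undefined' was not declared in this scope" and the make
// target for source/main.o fails, which is what kills the whole deployment.
//
// A compilable initializer needs a numeric value instead, for example:
const ei_anomaly_config_t ei_anomaly_config = {
    .threshold = 6.0f,   // placeholder value for illustration only
};
```

If that reading is correct, the visual anomaly detection block’s threshold seems to end up unset in the generated model_variables.h even though I’ve set values in Studio, but I may be missing something.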
Project ID:
745698
Context/Use case:
Deploying the model directly from Edge Impulse Studio (Linux AARCH64 build), or indirectly via edge-impulse-linux-runner on the Raspberry Pi 5.
Steps Taken:
- Retried the deployment through Edge Impulse Studio
- Confirmed that all parameters and thresholds have a value attached (see the helper sketch after this list)
- Attempted indirect deployment via edge-impulse-linux-runner on the Raspberry Pi
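To make the threshold check above reproducible, I also put together a tiny local helper (my own code, only a sketch, not part of the Edge Impulse SDK or CLI) that scans model-parameters/model_variables.h in a downloaded C++ export for the literal token "undefined", so I can quickly tell whether a regenerated deployment still contains the bad initializer:

```cpp
// Local helper: scan the generated header for the literal token "undefined"
// before invoking make. Exits with 1 if any hit is found, 0 if clean.
// (My own sanity-check code, not part of the Edge Impulse export.)
#include <fstream>
#include <iostream>
#include <string>

int main() {
    const std::string path = "model-parameters/model_variables.h";
    std::ifstream in(path);
    if (!in) {
        std::cerr << "Could not open " << path << "\n";
        return 2;
    }
    std::string line;
    int lineno = 0, hits = 0;
    while (std::getline(in, line)) {
        ++lineno;
        if (line.find("undefined") != std::string::npos) {
            std::cout << path << ":" << lineno << ": " << line << "\n";
            ++hits;
        }
    }
    std::cout << (hits ? "Found" : "No") << " 'undefined' initializers ("
              << hits << ")\n";
    return hits ? 1 : 0;
}
```

It returns non-zero when it finds a hit, so it is easy to gate the make step on it when rebuilding the export on the Pi.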
Expected Outcome:
The deployment succeeds and produces a .eim file I can run on the Raspberry Pi 5.
Actual Outcome:
The deployment fails during the “Building binary…” step with the compile error shown above; no .eim file is produced.
Reproducibility:
- [x] Always
Environment:
- Platform: [Raspberry Pi 5]
- Build Environment Details: [Edge Impulse Studio Linux (AARCH64) deployment; edge-impulse-linux-runner on the Raspberry Pi 5]
- OS Version: [Windows 11 Enterprise]
- Project Version: [1.0.0]
Custom Blocks / Impulse Configuration: [Visual anomaly detection with image data]
Logs/Attachments:
[Same build log as quoted above under Question/Issue.]
Additional Information:
None.