Deploying a model on the Teensy 4.0 using an accelerometer

Question/Issue:
In my project, I have a dataset consisting of IMU data (x, y, z) from an MPU6050 connected to a Teensy 4.0. Since Edge Impulse does not support the Teensy, I am using the C++ library deployment to run my model. I am trying to get that up and running, but I am quite confused about how to actually do it. I am following Shawn Hymel's video (link: https://www.youtube.com/watch?v=T6tKej6BTIs) on how to deploy using the C++ library, but I am confused because he mentions that I have to create user-defined functions that are not provided by Edge Impulse. This is also described here: https://docs.edgeimpulse.com/reference/inferencing-sdks/cpp-inferencing-sdk/api-ei_user_functions

Would someone be able to guide me on how to set up the IMU to read x, y, and z on the Teensy board using Edge Impulse? Here is my project for reference: PianoAir - Dashboard - Edge Impulse

Context/Use case:
I am trying to run the model I created.

Steps Taken:

  1. I tried to use the Nano 33 BLE deployment and configure it for the Teensy using https://mltools.arduino.cc. However, I could not figure out how to integrate that with the Teensy.

  2. I tried to use TensorFlow Lite directly and upload a model.h, following example code for running the MPU6050 found in this link: https://survivingwithan.medium.com/arduino-tinyml-gesture-recognition-with-tensorflow-lite-micro-using-mpu6050-f9d4a11f17b5
     However, I could not figure out how to use the data I gathered in Edge Impulse, so I am now trying again to use the C++ library.

  3. If those steps don't work out, I am planning to try step 2 again, but with my data formatted according to that example.

Expected Outcome:
I expect to have the model running on the Teensy board using the MPU6050 IMUs.

Actual Outcome:
I am stuck on integrating my Edge Impulse model on the Teensy.

Hi @Chanuth,

If you want to use the C++ library on an unsupported board, you will need to port the library code for that board (high-level overview of porting here: Porting Guide | Edge Impulse Documentation). Generally, that entails writing your own implementations for the functions listed here: https://docs.edgeimpulse.com/reference/inferencing-sdks/cpp-inferencing-sdk/api-ei_user_functions
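To give a concrete idea of what those implementations can look like on a Teensy, here is a minimal sketch of a porting layer built on the Arduino core. The function names follow the porting guide and the ei_user_functions reference linked above, but the exact list and signatures depend on the SDK version you export, so please treat this as an illustration rather than a drop-in file:

```cpp
// Sketch of a Teensy porting layer for the Edge Impulse C++ SDK.
// Assumes the exported library sits in the sketch folder and that the
// Arduino core (millis/micros/delay/Serial) is available.
#include <Arduino.h>
#include <stdarg.h>
#include <stdio.h>
#include <stdlib.h>
#include "edge-impulse-sdk/porting/ei_classifier_porting.h"

EI_IMPULSE_ERROR ei_sleep(int32_t time_ms) {
    delay(time_ms);                 // a blocking delay is fine for a first port
    return EI_IMPULSE_OK;
}

uint64_t ei_read_timer_ms() { return millis(); }
uint64_t ei_read_timer_us() { return micros(); }

void ei_printf(const char *format, ...) {
    char buf[256];
    va_list args;
    va_start(args, format);
    vsnprintf(buf, sizeof(buf), format, args);
    va_end(args);
    Serial.print(buf);              // route SDK logging to USB serial
}

void ei_printf_float(float f) { Serial.print(f, 6); }

// The Teensy 4.0 has plenty of RAM, so plain malloc/free is enough here.
void *ei_malloc(size_t size) { return malloc(size); }
void *ei_calloc(size_t nitems, size_t size) { return calloc(nitems, size); }
void ei_free(void *ptr) { free(ptr); }
```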

To understand how to use the C++ library, I highly recommend working through this guide: As a generic C++ library | Edge Impulse Documentation. Once you have an understanding of how it works (e.g., on a Linux system), you can then work on porting it to your particular board (the Teensy), which means writing implementations for the functions mentioned above: https://docs.edgeimpulse.com/reference/inferencing-sdks/cpp-inferencing-sdk/api-ei_user_functions
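Once those porting functions compile, the inference loop itself follows the generic C++ library guide. Below is a rough outline for a 3-axis accelerometer impulse; it assumes the Adafruit MPU6050 library for the sensor and the standard macros generated by the exported library (window size, sampling frequency, label count), so adjust the names and rates to match your PianoAir impulse design:

```cpp
// Rough outline: sample one window of x/y/z data from the MPU6050,
// then run it through the Edge Impulse classifier.
#include <Adafruit_MPU6050.h>
#include <Adafruit_Sensor.h>
#include <Wire.h>
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

static Adafruit_MPU6050 mpu;
// Generated by the export: window length in samples x 3 axes.
static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

void setup() {
    Serial.begin(115200);
    mpu.begin();                                      // default I2C address 0x68
}

void loop() {
    // Fill one window at the rate the impulse expects
    // (EI_CLASSIFIER_FREQUENCY, e.g. 100 Hz -> 10 ms between samples).
    for (size_t i = 0; i < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; i += 3) {
        sensors_event_t a, g, temp;
        mpu.getEvent(&a, &g, &temp);
        features[i + 0] = a.acceleration.x;
        features[i + 1] = a.acceleration.y;
        features[i + 2] = a.acceleration.z;
        delay(1000 / EI_CLASSIFIER_FREQUENCY);
    }

    // Wrap the buffer in a signal_t and run the impulse.
    signal_t signal;
    numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);

    ei_impulse_result_t result = { 0 };
    EI_IMPULSE_ERROR err = run_classifier(&signal, &result, false);
    if (err != EI_IMPULSE_OK) {
        ei_printf("run_classifier failed (%d)\n", err);
        return;
    }

    // Print the score for each class.
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        ei_printf("%s: %.3f\n", result.classification[ix].label,
                  result.classification[ix].value);
    }
}
```

The sampling loop above is deliberately simple (blocking reads, no timing correction); once it works end to end you can move sampling into a timer interrupt or look at run_classifier_continuous.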

Hope that helps!