Thingy52 for continuous motion recognition

Love your product! I played around with it in the past, but now I’m trying to use it for a specific use case: getting the Nordic Thingy52 to work with EI for continuous motion recognition, ideally for both data capture and inference. I specifically need the Thingy52 and not the Thingy53, because the nRF52 is going into the final product. I had a look at your nRF52 DK example, but it’s clunky: the board is massive and needs to be powered, and on top of that you need a separate shield for the IMU, whereas the Thingy52 has it all in one neat package. So my question is: is there a good way to capture data from the Thingy52 IMU and then also deploy to it with Zephyr? The latter I can probably do with relative ease, but capturing the data from the Thingy52 is what I’m somewhat confused about, so tips would be welcome!

I have only recently started with Zephyr, but so far I have managed (without EI) to get sensor data off the Thingy52 and (with Zephyr and the nRF Connect SDK) into a Bluetooth app on iOS (and, incidentally, the Apple Vision Pro).

Hi @mentar,

My preference for capturing data with devices that are not officially supported by Edge Impulse is to store the data to an SD card on the device, or pipe it over USB and write CSV files on my computer. From there, use our CSV Wizard to read in the data.
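To make the USB-to-CSV route concrete, here is a minimal host-side sketch of what that conversion could look like. It assumes hypothetical firmware that prints one comma-separated `accX,accY,accZ` line per sample at a fixed interval; the column names and the `interval_ms` value are my own choices, not anything mandated by the CSV Wizard (which lets you map columns interactively on import).

```python
import csv
import io

def serial_to_csv(lines, out, interval_ms=10):
    """Convert raw 'ax,ay,az' text lines (as printed by hypothetical
    Thingy52 firmware over USB serial) into a CSV with a timestamp
    column, which the Edge Impulse CSV Wizard can then ingest.

    lines: iterable of strings, e.g. read from a serial port
    out:   writable text file object
    """
    writer = csv.writer(out)
    writer.writerow(["timestamp", "accX", "accY", "accZ"])
    t = 0
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines from the serial stream
        ax, ay, az = (float(v) for v in line.split(","))
        writer.writerow([t, ax, ay, az])
        t += interval_ms  # synthesize timestamps at a fixed sample rate
    return out
```

In practice you would feed `lines` from something like pyserial instead of a list, but keeping the parser separate from the transport makes it easy to test offline.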

Alternatively, you could send raw data over USB serial and upload it directly to a project using the CLI uploader tool.
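If you go the uploader route, the CLI accepts files in Edge Impulse's data acquisition JSON format. The sketch below is a rough approximation of that format for illustration; the `device_name`/`device_type` strings are placeholders, I use the unsigned (`"alg": "none"`) variant, and you should check the data acquisition format documentation for the exact field and signing requirements before relying on it.

```python
import json

def make_ei_json(samples, interval_ms=10):
    """Wrap raw IMU samples in an approximation of the Edge Impulse
    data acquisition JSON format, for upload with the CLI uploader.

    samples: list of [accX, accY, accZ] rows at a fixed sample rate
    """
    return json.dumps({
        # Unsigned variant; signed uploads would use an HMAC here.
        "protected": {"ver": "v1", "alg": "none", "iat": 0},
        "signature": "",
        "payload": {
            "device_name": "thingy52-01",   # placeholder device name
            "device_type": "THINGY52",      # placeholder device type
            "interval_ms": interval_ms,     # 10 ms -> 100 Hz
            "sensors": [
                {"name": "accX", "units": "m/s2"},
                {"name": "accY", "units": "m/s2"},
                {"name": "accZ", "units": "m/s2"},
            ],
            "values": samples,
        },
    })
```

You would write the returned string to a `.json` file and pass it to the uploader with a label, which avoids the intermediate CSV step entirely.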

In both cases, you will need to write some custom firmware (e.g. with Zephyr) that runs on the Thingy52.

Oh, the CSV import is an interesting idea, thank you! I did some messing around with ble-serial, and it looks like it may work best for my use case.

I was actually thinking it may make more sense to stream the data to the iOS app for inference instead of doing it on the MCU (it will be easier to push out new models/deployments too). Looking at the docs, the only option for the iPhone is WebAssembly running in the browser. I doubt that will work with the Unity setup we have for our app. Is there a better option you’d recommend for doing spectral analysis, NN, and anomaly detection on the iPhone with EI?

Hi @mentar,

You can either run the impulse from your EI project (feature extraction, NN, anomaly detection) directly on the Thingy52 or on the phone. Your options for running it on the phone include WebAssembly (as you noted) or wrapping the C++ library for Swift (I’m not familiar with iOS development, but I would guess this is possible).