Thingy:53 disconnects after ML model upload

My brand new Thingy:53 is not working as expected. I'm using the nRF Edge Impulse app on my iPhone 11. After deploying my Edge Impulse model to the device, the Bluetooth connection is lost, so I cannot see the inferencing results; I get an “Error: Connection timed out” message in the app. Could someone help me with this? I appreciate it! Thanks in advance!


Hi @aabejaro

The Thingy:53 is an awesome board, but there were recent app and firmware updates, so you will need to refresh the firmware. Lots of new features and SDK support have been added recently.

Once you have refreshed the firmware, if the issue persists you may already have a project connected, so try the following:

  1. Reconnect the device: Try reconnecting your Thingy:53 by selecting your device name on the Devices tab and tapping "Reconnect".
  2. Check the power connection: Ensure that the power cable is properly plugged in.
  3. Avoid multitasking in the app: Do not use iPhone/Android app multitasking during data acquisition, firmware deployment, or inferencing tasks, as this can disrupt the BLE streaming connection.
  4. Reconnect and restart the Thingy:53: Disconnect the Thingy:53, switch it off and on again, then reconnect it to your computer using a USB-C cable. Run edge-impulse-daemon --clean and follow the prompts to connect it to the new Edge Impulse project (see the sketch after this list).
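
For reference, that last step usually looks something like the snippet below from a terminal. Treat it as a rough sketch rather than an exact transcript; the prompts can differ between CLI versions.

```
# With the Thingy:53 connected over USB-C and running the Edge Impulse firmware:
edge-impulse-daemon --clean    # forgets the cached login/project and asks you to authenticate again
# Then follow the prompts to sign in and pick the Edge Impulse project the Thingy:53 should join.
```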

Best

Eoin

Hi, @Eoin

I just tried all your suggestions, but I get the same results. It's a bit frustrating because there is no obvious reason why the Thingy:53 cannot reconnect after the ML model upload. Is this a board or firmware related problem? Where could I get more information about it? I have noticed other users facing the same issue…

Hi @aabejaro

If you are unable to flash a current or previous version of the firmware remotely, you may need to flash it via an external debug probe.
https://developer.nordicsemi.com/nRF_Connect_SDK/doc/latest/nrf/device_guides/working_with_nrf/nrf53/thingy53_gs.html#updating-through-external-debug-probe
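
If you go the debug-probe route, the flow is roughly the sketch below. It assumes a J-Link-capable probe (for example an nRF5340 DK) wired to the Thingy:53's SWD header and the nRF Connect SDK command-line tools installed; thingy53_fw.hex is just a placeholder for whichever image you are flashing, so follow the linked guide for the exact steps.

```
# Flash a firmware image you built yourself with the nRF Connect SDK:
west flash --build-dir build
# Or program a pre-built application hex with nrfjprog (placeholder file name):
nrfjprog --program thingy53_fw.hex --chiperase --verify --reset
```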

@vojislav are you aware of any issue with model updates?

Best

Eoin

Hi @aabejaro

Can you please share your project ID?
Also, if you can, please share your Thingy:53 board version; you can probably see it on the box it came in, or inside the case if you open the device.

Thank you.

best regards,
Vojislav

Hi @aabejaro

I was able to reproduce the issue. The fix is in review and will land in production soon.

Best regards,
Vojislav

Nice! I'll wait for the new release to test it. Thanks, @vojislav!

Hi @aabejaro ,

The new release is out. Please try it out to confirm that everything is OK.
Since you are not able to connect to the device and update the binary over BLE, the best way is to put the device into bootloader mode (Here is how to do that) and flash it over a USB connection from your computer.
After this flash you will be able to use the normal BLE workflow as before.
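
If you prefer doing the USB upload from the command line, one option is the mcumgr CLI, assuming the device is sitting in MCUboot serial recovery mode. This is only a sketch; the serial port and the signed update file name are placeholders for whatever your setup uses.

```
# Upload the signed application image over USB serial recovery (placeholder port and file names):
mcumgr --conntype serial --connstring "dev=/dev/ttyACM0,baud=115200" image upload app_update.bin
# Reset the board so it boots into the new image:
mcumgr --conntype serial --connstring "dev=/dev/ttyACM0,baud=115200" reset
```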

Thanks for your patience.

Best regards,
Vojislav

Thank you, @vojislav! It is now running OK and I'm able to connect to my Thingy:53 after the model upload, BUT the Edge Impulse app doesn't show the inferencing results :frowning:

OK, for that you need to share your project ID with me. The gestures model I tested with shows the inferencing results in the mobile application.

Hi @vojislav, thanks for all your support!!! My project ID is 93036.

Hello @vojislav, any updates? The problem appears in ALL my projects; it is impossible to see inference results in the Edge Impulse app.

Hi @aabejaro
I just added myself to your project for testing (you will see my Thingy:53 in your project's device list).

It seems that your model is now too big to fit into the Thingy:53's memory, and that is why you are getting no results.
This is what I get if I run the model using our CLI tools:
ERR: Could not allocate audio buffer (size 64000), this could be due to the window length of your model
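
If you want to reproduce this kind of check yourself, something like the command below should work with the full Edge Impulse firmware flashed and the board connected over USB; the exact output will of course differ per project.

```
# Stream live inferencing output (including errors like the one above) to the terminal:
edge-impulse-run-impulse
```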

Keep in mind that this does not mean your model cannot run on the nRF5340 SoC; it is just that the memory available for the model with the Thingy:53 firmware is a bit more limited because of all the other features.

For testing your model I can recommend the standalone inferencing example for Zephyr: GitHub - edgeimpulse/example-standalone-inferencing-zephyr
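
Roughly, the workflow with that example looks like the sketch below. It assumes you already have a Zephyr/nRF Connect SDK environment with west set up; the board target name is an assumption based on the standard Zephyr name for the Thingy:53 application core, so adjust it to your SDK version.

```
# Clone the standalone inferencing example:
git clone https://github.com/edgeimpulse/example-standalone-inferencing-zephyr
cd example-standalone-inferencing-zephyr
# Export your project from Edge Impulse Studio as a C++ library and unpack it into this folder,
# then build for the Thingy:53 application core and flash it:
west build -b thingy53_nrf5340_cpuapp
west flash
```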

And maybe @Eoin can give you some ideas on what to do to make your model smaller.

best,
Vojislav


Hello @vojislav, thanks for your fast response and for highlighting the problem. I will continue my tests with my Thingy:53 board and keep you updated!