Build TinyML models using your mobile phone

We can now build tiny machine learning (TinyML) models that interpret sensor data in real time, from detecting lions roaring to tracking sheep workouts. But to get started building these models you need a development board, a cross-compilation toolchain, and knowledge of embedded development. Can't we do better? We already carry a very capable device with multiple high-quality sensors in our pockets, and it's the perfect place to build your first TinyML model.

This is a companion discussion topic for the original entry at

I have created a new model to run on my mobile phone. During the live classification test, I hit an issue: the sampling frequency the model expects is off based on its calculation. All of the training data was recorded at the 48 kHz setting. The error message is attached. Please help check the root cause. I'd also like to adjust the classification confidence threshold to evaluate performance against the live input data. Please let me know how to configure it. Thanks.
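As a general note on the confidence-threshold part of the question: independent of any specific tooling, a threshold can be applied in post-processing on the classifier's output scores. Here is a minimal illustrative sketch in Python; the labels, scores, and function name are hypothetical, not part of the Edge Impulse API:

```python
# Illustrative sketch: apply a confidence threshold to classifier output.
# The labels, scores, and function name here are hypothetical examples,
# not Edge Impulse API calls.

def classify_with_threshold(scores, threshold=0.8):
    """Return the top label if its score meets the threshold, else 'uncertain'."""
    label = max(scores, key=scores.get)  # label with the highest score
    if scores[label] >= threshold:
        return label
    return "uncertain"

# Hypothetical per-class probabilities from one inference window
predictions = {"lion": 0.91, "sheep": 0.06, "noise": 0.03}
print(classify_with_threshold(predictions))        # -> "lion"
print(classify_with_threshold(predictions, 0.95))  # -> "uncertain"
```

Raising the threshold trades more "uncertain" windows for fewer false positives, which is a reasonable way to probe live performance.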

Hi @pvinhha, yeah, this is an oversight from a patch that we released last week. We're correcting it in today's release.

@pvinhha @paulphilip This has now been released.

@janjongboom I had expected both the SDK and the web interface to be updated. Thanks for the notification.

Hi Paul, the live classification with my phone is working now. Thanks.