Video: Sheep activity tracker demo

During The Things Conference in Amsterdam this week we showed a demo of a TinyML sheep activity tracker, with Johan Stokking (The Things Industries CTO) as our special guest in the role of the sheep. The model was trained on 3-axis accelerometer data in Edge Impulse and runs on an 80 MHz Cortex-M4F MCU with 128 KB of RAM, sending the inferencing results over LoRaWAN. Hope you find it an entertaining use case of embedded machine learning! The full talk will be available soon from The Things Network.


I have just watched this and the Continuous Motion Learning video. This is very cool… and it even looks like I may be able to do it! So the board is on order and we shall see if we can try it on a real sheep.


Thanks @janjongboom, I was in that demo at TTC last week, it was one of the most entertaining demos at a conference I’ve ever seen.

I want to give this a try myself. Did you use a B-L475E-IOT01A board with a LoRa shield attached, or something else? It's the idea of running machine learning on a microcontroller board and sending the results to TTN that I'm interested in, as well as learning how Edge Impulse works, of course.


Hi @paul, thanks a lot! Yes, this was built with a B-L475E-IOT01A, an SX1276 LoRa shield and a battery pack, all held together by a tennis-elbow brace!

Training was done by connecting the development board over WiFi and streaming accelerometer data in. For inferencing we exported the model to C++ and combined it with the mbed-os-example-lorawan example app. Note that we use a sliding window for inferencing as well (take 4 s of data, slice it into ~40 overlapping 2 s windows, run inferencing on each) and send the summary over LoRaWAN.


Cool, well I’ve got one of those boards arriving from Farnell tomorrow and I already have a Dragino shield, so hopefully I can make a start on getting something similar working myself.


Yeah, that should work I think. From a quick look at the schematic for that shield, it should be compatible with the SX1276MB1xAS shield that I used. Make sure to set the right pins in mbed_app.json.

The defaults for the K64F should already be correct.
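For reference, the radio pin-mapping entries in mbed-os-example-lorawan's mbed_app.json look roughly like the sketch below. The key names and pin values are from memory of the example app and may differ between versions, so treat them as an assumption and check them against your own checkout and the Dragino schematic:

```json
{
    "config": {
        "lora-radio":    { "value": "SX1276" },
        "lora-spi-mosi": { "value": "D11" },
        "lora-spi-miso": { "value": "D12" },
        "lora-spi-sclk": { "value": "D13" },
        "lora-cs":       { "value": "D10" },
        "lora-reset":    { "value": "A0" },
        "lora-dio0":     { "value": "D2" },
        "lora-dio1":     { "value": "D3" },
        "lora-dio2":     { "value": "D4" },
        "lora-dio3":     { "value": "D5" }
    }
}
```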


Very fun and cool demo! I want to try a similar application using an nRF52832 board.
Can I do something similar using the Edge Impulse platform?

@janjongboom can you share details of this example? Are you going to do a tutorial similar to the continuous movement one?
Thanks in advance and congratulations.

Hi @miguelangelcasanova, the core principles are exactly the same as in the continuous movement application, with the same window settings and the same blocks; only the sensor placement is different (on the arm). Sampling was done over WiFi, and the inferencing results were sent over LoRaWAN by a custom application with a LoRa shield. It's probably not that hard to make this run on the nRF52832 (it even supports Mbed, so the inferencing part should be very easy to compile for the board).
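One nice property of sending only the inferencing result is how tiny the LoRaWAN payload gets. A minimal sketch of what such an encoding could look like follows; the two-byte layout (label index plus scaled confidence) is my own assumption for illustration, since the demo's actual payload format wasn't published:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical payload encoding: one byte for the winning label's index,
// one byte for its confidence scaled to 0-255. Two bytes total, versus
// hundreds of raw accelerometer samples per window.
std::vector<uint8_t> encode_result(uint8_t label_index, float confidence) {
    return { label_index,
             static_cast<uint8_t>(confidence * 255.0f + 0.5f) };
}
```

A decoder on the TTN side would simply reverse this: the first byte maps back into the label list used during training, and the second byte divided by 255 recovers the confidence.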


Thanks for your rapid response!

No problem! I’m excited to see what you’ll build.

(Also, I have a strong love for the nRF5x boards! Using the nRF51-DK + Mbed five years ago made me realize that embedded development didn't have to be so painful (coming from ASF + SAMD21). That led to me working for Arm, and thus eventually to founding Edge Impulse :) )
