FOMO object detection on BeagleBone Black

Hi. I want to use a BeagleBone Black as a FOMO object-detection inferencing server, taking input from ESP32-CAM IP streams. My question is: is the hardware capable of this, and if so, which supported board should I train the model on in the Edge Impulse Studio?

The Beagle is more than capable of running FOMO.

You can then deploy:

  • a Linux ready-to-go firmware, or
  • a C++ library, or
  • a WebAssembly (WASM) package

To deploy the Linux ready-to-go firmware in the Edge Impulse Studio, go to:

  • the Dashboard page,
    • scroll to the bottom of the page
    • check the Show Linux Options checkbox
    • click the Save Experiments button
  • the Data Acquisition page
    • import data
  • design, train and test your model
  • the Deployment page
    • select the Linux firmware you want to build
    • click the Build button

Use the Linux Runner to run the built model. The underlying Python code is open source. See the examples here.
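For orientation, here is a minimal sketch of driving a downloaded .eim with the Python SDK (`edge_impulse_linux` package) on a single image; the model and image paths are placeholders, and the official classify-image example in the SDK repo is the authoritative version:

```python
# Minimal sketch: run a downloaded .eim model on one image with the
# Edge Impulse Linux Python SDK. Paths below are placeholders.
import cv2
from edge_impulse_linux.image import ImageImpulseRunner

MODEL_PATH = "modelfile.eim"   # the .eim built on the Deployment page
IMAGE_PATH = "test.jpg"        # any test image

with ImageImpulseRunner(MODEL_PATH) as runner:
    model_info = runner.init()
    print("Loaded model:", model_info["project"]["name"])

    img = cv2.imread(IMAGE_PATH)                 # OpenCV loads BGR
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)   # the SDK expects RGB

    # Crop/resize to the model's input size and extract features
    features, cropped = runner.get_features_from_image(img)
    res = runner.classify(features)

    # FOMO models return bounding boxes (centroids) rather than classifications
    for bb in res["result"].get("bounding_boxes", []):
        print("%s (%.2f): x=%d y=%d w=%d h=%d" %
              (bb["label"], bb["value"], bb["x"], bb["y"], bb["width"], bb["height"]))
```

This assumes the Python SDK and OpenCV are installed on the Beagle (e.g. via pip).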

Hi. I’ve done all the steps you suggested, including installing the .eim file from my project in the Studio, the Python SDK, and the Edge Impulse CLI. But now I am stuck. I want to use ESP32-CAMs as IP cameras (web servers) and stream their output to my Beagle, where the object detection would take place. Could you guide me on how to use this .eim to do object detection on my Beagle with the ESP32-CAM streams?

Or, if this isn’t possible, maybe we could use the C++ library?

I assumed you already had the ESP32-CAM transferring images to the Beagle. How to do this is outside the scope of the Edge Impulse platform. Also, for embedded devices I believe it is best to capture training data with the embedded device's own sensors rather than with samples captured by other devices, although other images can be mixed in to help prevent the model from overfitting.

Nevertheless, my two cents of advice:

If you made an EIM, I'll assume you used a generic image dataset that was already available.

Now you want to feed the EIM running on the Beagle with images captured by an ESP32-CAM. To implement a solution, I see two options:

  • Save the image to the ESP32-CAM SD card

    • The Beagle then requests the file from the ESP32, and the data is transferred to the Beagle over GPIO or maybe USB.
    • The Beagle stores the bytes in a buffer.
    • Feed the buffer to the EIM via the Edge Impulse Linux Runner.
    • This code shows how to save the ESP framebuffer to a file.
  • Transfer data via a web server

    • Program the ESP32-CAM to serve images from a web server that the ESP32-CAM hosts. See this.
    • On the Beagle:
      • Program an HTTP client that reads from the ESP32-CAM web server.
      • Download the JPG.
      • Feed the JPG to the EIM via the Edge Impulse Linux Runner (a rough sketch follows after this list).
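To make the second option concrete, here is a rough sketch of the Beagle side, assuming the ESP32-CAM already serves single JPEG frames over HTTP (many ESP32-CAM web-server sketches expose a still-image endpoint such as /capture). The camera URL, model path, and polling interval are placeholders you would adapt to your setup:

```python
# Hypothetical sketch: poll an ESP32-CAM HTTP endpoint for JPEG frames and
# run the .eim on each frame. CAM_URL and MODEL_PATH are placeholders.
import time
import urllib.request

import cv2
import numpy as np
from edge_impulse_linux.image import ImageImpulseRunner

MODEL_PATH = "modelfile.eim"                 # the .eim on the Beagle
CAM_URL = "http://192.168.1.50/capture"      # still-image endpoint served by the ESP32-CAM

with ImageImpulseRunner(MODEL_PATH) as runner:
    runner.init()
    while True:
        # Download one JPEG frame from the ESP32-CAM web server
        jpg = urllib.request.urlopen(CAM_URL, timeout=5).read()

        # Decode the JPEG bytes; OpenCV gives BGR, the SDK expects RGB
        frame = cv2.imdecode(np.frombuffer(jpg, dtype=np.uint8), cv2.IMREAD_COLOR)
        if frame is None:
            continue
        frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

        # Crop/resize to the model input, extract features, and classify
        features, cropped = runner.get_features_from_image(frame)
        res = runner.classify(features)

        for bb in res["result"].get("bounding_boxes", []):
            print("%s (%.2f) at x=%d y=%d w=%d h=%d" %
                  (bb["label"], bb["value"], bb["x"], bb["y"], bb["width"], bb["height"]))

        time.sleep(0.2)  # crude pacing; adjust to your frame rate
```

If you prefer the SD-card route instead, the same `get_features_from_image`/`classify` calls apply once the bytes are on the Beagle; only the transport changes.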

@zappy383 This might help you; it shows how to communicate between an Uno and an ESP.