Camera feed on Pi desktop?

Hi, is there any way to display the camera feed locally on the Raspberry Pi desktop? We would like to use the trained model on our Raspberry Pi without an internet connection.

Thanks in advance

Hello @HTWG_Elite,

When you run this command:

edge-impulse-linux-runner --clean

Edge Impulse Linux runner v1.2.6

? **What is your user name or e-mail address (edgeimpulse.com)?** louis-demo

? **What is your password?** *[hidden]*

? **From which project do you want to load the model?** Louis (Demo) / Person Detection
[RUN] Already have model /Users/luisomoreau/.ei-linux-runner/models/40479/v19/model.eim not downloading...
[RUN] Starting the image classifier for Louis (Demo) / Person Detection (v19)
[RUN] Parameters image size 320x320 px (3 channels) classes [ 'person' ]

? **Select a camera** FaceTime HD Camera (Built-in)
[RUN] Using camera FaceTime HD Camera (Built-in) starting...
[RUN] Connected to camera

Want to see a feed of the camera and live classification in your browser? Go to http://192.168.1.196:4912

boundingBoxes 45ms. []
boundingBoxes 29ms. [{"height":160,"label":"person","value":0.8110049366950989,"width":229,"x":65,"y":151}]
boundingBoxes 28ms. [{"height":160,"label":"person","value":0.789042592048645,"width":231,"x":73,"y":149}]
boundingBoxes 26ms. [{"height":161,"label":"person","value":0.8051104545593262,"width":228,"x":75,"y":149}]

...

There is one line in the output indicating a local URL where you can view the camera feed and the live classification results.
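Since that page is served by the runner on the Pi itself, you should also be able to open it in a browser on the Pi's own desktop, for example (chromium-browser is the default browser on Raspberry Pi OS, and the port is taken from the log above):

chromium-browser http://localhost:4912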

Regards,

Louis

@HTWG_Elite The Python examples also show how to display the camera feed locally using OpenCV.
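For reference, here is a trimmed sketch based on the image classification example in the Linux Python SDK; the model path and camera index are placeholders for your own setup:

# show_camera.py -- a sketch based on the classify.py example from the
# Edge Impulse Linux Python SDK: run a local .eim model and show the camera
# feed plus detections in an OpenCV window on the Pi's desktop.
# MODEL_PATH and CAMERA_ID are placeholders for your own setup.
import cv2
from edge_impulse_linux.image import ImageImpulseRunner

MODEL_PATH = "modelfile.eim"  # e.g. the file fetched with --download
CAMERA_ID = 0                 # usually /dev/video0 on the Pi

with ImageImpulseRunner(MODEL_PATH) as runner:
    runner.init()
    # classifier() grabs frames from the camera and yields (result, frame) pairs
    for res, frame in runner.classifier(CAMERA_ID):
        # draw the bounding boxes for object detection models
        for bb in res['result'].get('bounding_boxes', []):
            cv2.rectangle(frame, (bb['x'], bb['y']),
                          (bb['x'] + bb['width'], bb['y'] + bb['height']),
                          (0, 255, 0), 2)
            cv2.putText(frame, '%s %.2f' % (bb['label'], bb['value']),
                        (bb['x'], bb['y'] - 5),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
        # the SDK hands back RGB frames, convert to BGR for OpenCV's display
        cv2.imshow('Edge Impulse', cv2.cvtColor(frame, cv2.COLOR_RGB2BGR))
        if cv2.waitKey(1) == ord('q'):
            break

cv2.destroyAllWindows()

This opens an OpenCV window directly on the Pi's desktop, so no browser or internet connection is needed once the .eim file is on the device.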

Thanks for the fast reply. The way @Louis describes how to start the object detection works fine, but is it also somehow possible to start the model.eim manually when the Pi isn’t connected to Edge Impulse? Once started, the model works without an internet connection, but I can’t start it without one…

:v:

Hey @HTWG_Elite,

If you run the runner with the --help flag, you should see the following output:

edge-impulse-linux-runner --help
Usage: edge-impulse-linux-runner [options]

Edge Impulse Linux runner 1.2.6

Options:
  -V, --version        output the version number
  --model-file <file>  Specify model file, if not provided the model will be fetched from Edge Impulse
  --api-key <key>      API key to authenticate with Edge Impulse (overrides current credentials)
  --download <file>    Just download the model and store it on the file system
  --clean              Clear credentials
  --silent             Run in silent mode, don't prompt for credentials
  --quantized          Download int8 quantized neural networks, rather than the float32 neural networks. These might run faster on some architectures, but
                       have reduced accuracy.
  --enable-camera      Always enable the camera. This flag needs to be used to get data from the microphone on some USB webcams.
  --dev                List development servers, alternatively you can use the EI_HOST environmental variable to specify the Edge Impulse instance.
  --verbose            Enable debug logs
  -h, --help           output usage information

First download the model using the --download argument, then pass it to the runner with the --model-file argument:

edge-impulse-linux-runner --model-file my-custom-model.eim
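If you have not downloaded the model yet, that first step would look something like this (the file name is just an example):

edge-impulse-linux-runner --download my-custom-model.eim

The download itself still needs a connection to Edge Impulse, but once the .eim file is on the Pi the runner starts it completely offline with --model-file.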

Regards,

Louis

Thanks for your help, it worked :+1: