Running Edge Impulse on the Kria KV260

I am following the Linux_x86_64 post, which runs the Edge Impulse model on Ubuntu. My question is: if I install Ubuntu on the Kria KV260, will the same instructions work on it?

Hello @timothy.malche,

I am not familiar with the Kria KV260, but I just had a look at the Zynq® UltraScale+™ MPSoC datasheet:

This family of products integrates a feature-rich 64-bit quad-core or dual-core Arm® Cortex®-A53 and dual-core Arm Cortex-R5F based processing system (PS) and Xilinx programmable logic (PL) UltraScale architecture in a single device.

I can confirm that it will not work with the x86_64 installation method; the SoC architecture is different (Arm, not x86_64).

Regards,

Louis

OK @louis, so is there another way of running EI on the kit?

Hello @timothy.malche,

Sure. As long as you can compile C++, you'll be able to run your impulse on any board. It won't be hardware-optimized, but it will work. You can have a look at this documentation: https://docs.edgeimpulse.com/docs/deploy-your-model-as-a-c-library
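
For reference, the exported library boils down to filling in an `ei::signal_t` and calling `run_classifier()`. Here is a minimal sketch along the lines of the example-standalone-inferencing repo; the contents of the features array are a placeholder you paste from Studio, and exact header paths may differ per export:

```cpp
#include <cstdio>
#include <cstring>
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

// Placeholder: paste the raw feature values exported from Studio here.
static const float features[] = {
    0.0f, 0.0f /* ... full raw feature list ... */
};

// The SDK pulls feature data through this callback in chunks.
static int get_feature_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, features + offset, length * sizeof(float));
    return EIDSP_OK;
}

int main() {
    ei::signal_t signal;
    signal.total_length = sizeof(features) / sizeof(features[0]);
    signal.get_data = &get_feature_data;

    ei_impulse_result_t result = { 0 };
    EI_IMPULSE_ERROR err = run_classifier(&signal, &result, false);
    if (err != EI_IMPULSE_OK) {
        printf("run_classifier failed (%d)\n", (int)err);
        return 1;
    }
    for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
        printf("%s: %.5f\n", result.classification[i].label,
               result.classification[i].value);
    }
    return 0;
}
```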

Regards,

Louis

Hi Louis, I have a related question, as I'm trying to build an Edge Impulse object detection model for the Kria KV260 as well. I exported the C++ model and I'm following the tutorial documentation. It says to copy the raw features from one of the images and "paste the list of raw feature values into the input_buf array." However, this raw feature list is huge (~1 MB in a .txt file)! Is there a better way to do this? That approach would only cover a static image; I assume the raw features would be read in real time during live classification? Thanks!

Hi @jlutzwpi,

Copying the raw features into input_buf is a super simple example that does not scale well to larger samples. The best way to do this is probably to use the callback function (e.g. signal.get_data) to page in parts of your sample at a time. You can have the sample reside in flash memory (e.g. as a static buffer in a .h file) or read it from a file.

There’s an example here that reads in a sample from a file. However, it uses a std::vector to store the sample, which may or may not work for your case.
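
To illustrate, here's a rough sketch of that approach. It assumes the raw features were saved as comma-separated floats in a file; the `features.txt` path and the parsing are assumptions, so adjust them to your export format:

```cpp
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <fstream>
#include <string>
#include <vector>
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

// The whole sample lives in RAM here, but the callback still pages it
// into the classifier in chunks, so nothing gets pasted into the code.
static std::vector<float> sample;

static int get_sample_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, sample.data() + offset, length * sizeof(float));
    return EIDSP_OK;
}

int main() {
    // Hypothetical file of comma-separated raw feature values.
    std::ifstream in("features.txt");
    std::string tok;
    while (std::getline(in, tok, ',')) {
        sample.push_back(std::strtof(tok.c_str(), nullptr));
    }

    ei::signal_t signal;
    signal.total_length = sample.size();
    signal.get_data = &get_sample_data;

    ei_impulse_result_t result = { 0 };
    if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
        return 1;
    }
    // For an object detection model the detections come back in
    // result.bounding_boxes rather than result.classification.
    for (size_t i = 0; i < result.bounding_boxes_count; i++) {
        const auto &bb = result.bounding_boxes[i];
        if (bb.value == 0) continue; // empty slot
        printf("%s (%.2f) at x=%u y=%u w=%u h=%u\n", bb.label, bb.value,
               bb.x, bb.y, bb.width, bb.height);
    }
    return 0;
}
```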

Very helpful, thank you!

Hi @louis, @jenny, as per the instructions in the tutorial I can access the Edge Impulse model on the KV260, but when I select and run the object detection model, it gives the message "no camera detected", although I have tried both the Raspberry Pi camera module and the IAS camera module. Is there anything I am missing?

Are you referencing the correct port? (It should be /dev/video0 or another number.) I've had luck with both a USB camera on video0 and the MIPI camera.

Hi @jlutzwpi, how do I reference the correct port? I have the MIPI camera. I have even tried on a Raspberry Pi Model B, but the same problem occurs there: it does not find the camera, although I have enabled it in the config.

Which executable are you using? The example demos I've used ask for the camera port as an input parameter to the executable (i.e. 0 for the camera allocated to /dev/video0).
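
If you have v4l-utils installed, `v4l2-ctl --list-devices` will show what the kernel has registered. Failing that, here is a quick sketch that probes the first few `/dev/video*` nodes directly via V4L2 to see which ones actually respond (Linux-only; the limit of 10 nodes is arbitrary):

```cpp
#include <cstdio>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main() {
    for (int i = 0; i < 10; i++) {
        char path[32];
        snprintf(path, sizeof(path), "/dev/video%d", i);
        int fd = open(path, O_RDWR);
        if (fd < 0) continue; // node not present
        v4l2_capability cap = {};
        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) == 0) {
            printf("%s: %s (driver %s)\n", path,
                   (const char *)cap.card, (const char *)cap.driver);
        }
        close(fd);
    }
    return 0;
}
```

Whichever node reports your camera is the number to reference.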

I am not using any code, just running the command edge-impulse-linux, and it does not detect the MIPI camera. How can I resolve this?