Help needed with running an inference on the Arduino Nano RP2040
Hey @aurel and @MMarcial, thanks for answering my previous question. You were right, I needed to update the libraries. However, I find myself with a new challenge. I followed the video Deploy an ML Model to Any Target with the Edge Impulse C++ Library by @ShawnHymel to understand how to deploy the sound classification model I already built in Edge Impulse. I should note that I am not deploying to just any target device but to my new Arduino Nano RP2040 board. After adding the zipped folder downloaded from Edge Impulse to the Arduino IDE, I uploaded the code to the board and ran the edge-impulse-run-impulse --raw command to run the model locally. It only runs the first function, as shown in the picture below. I am unable to get inference from the microphone, which is what I would like to do. I know you are super busy, but I would appreciate your help. Thank you. Here is a screenshot of my terminal's output.
The screenshot above was taken after running the nano_ble33_sense_microphone sketch. Following this example, Arduino library - Edge Impulse Documentation, and using the static_buffer sketch, I get the following output when I run the edge-impulse-run-impulse command. The output matches the classification results from Studio. Thank you.
However, the cursor is in a waiting state, so I do not know whether I should go ahead and define the time period necessary to listen for the sound input. Secondly, when I run edge-impulse-run-impulse --continuous, this is what I get. For some reason it fails to recognize the built Edge Impulse binary files. Thank you.
Are you trying to use the Arduino Nano 33 BLE Sense sketch on the Arduino Nano RP2040?
They don’t share the same configuration, so running the Arduino Nano 33 BLE sketches on the Arduino Nano RP2040 won’t work.
Also, to answer your question about edge-impulse-run-impulse --raw: it just opens a serial console.
The program expects AT commands: Serial protocol - Edge Impulse API
You can use AT+RUNIMPULSE or AT+RUNIMPULSEDEBUG.
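For reference, here is one way to send those AT commands by hand from a Linux terminal. This is a sketch of the workflow, not official documentation: the device path /dev/ttyACM0 and the 115200 baud rate are assumptions you should verify for your own setup.

```shell
# Open a raw serial console to the board (exit screen with Ctrl-A then K).
# The device path and baud rate below are assumptions - check yours with
# `ls /dev/ttyACM*` and your firmware's documentation.
screen /dev/ttyACM0 115200

# Then type the AT commands at the prompt, e.g.:
#   AT+RUNIMPULSE        run a single inference
#   AT+RUNIMPULSEDEBUG   run an inference with debug output
```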
I noticed that our Serial Protocol doc page needs a slight update. We’ll be working on that ASAP.
Thank you Louis, you are right on this one. I was trying to use the Arduino Nano 33 BLE Sense sketch on the Arduino Nano RP2040. At the back of my mind I knew I was wrong, but I was fooled by a few working examples I had gotten, so I decided to continue testing the waters. When do you think the Edge Impulse team will release the sketches for the Arduino Nano RP2040? I know they have already worked on the firmware binary files, which I used to connect the board to Edge Impulse Studio. Can one connect the board using a WiFi connection? In the meantime, do you recommend I use the C++ library under Deployment in Edge Impulse Studio, or the Arduino library for Arm-based Arduino development boards? The examples above are from the Arduino library. I need your help deploying my sound classification model on my Arduino Nano RP2040. Finally, what you said above is true: the edge-impulse-run-impulse --raw command just opens a serial console. I tested it by opening the serial monitor in the Arduino IDE, and I get the same result as in the Linux terminal. Unfortunately, I am unable to run the commands you sent me. See the screenshot below.
So from which target device am I running the static_buffer sketch? Is there a hidden mechanism that Edge Impulse uses to connect to the Arduino IDE and run the static_buffer sketch? From the terminal I can see that I am connected to the serial terminal via /dev/ttyACM0. Do I need to specify the baud rate, or use screen? Here is a screenshot of what I am talking about. Thank you. Micheal
If you compiled the static_buffer sketch and selected the Arduino Nano RP2040 in the Arduino IDE, then, as there are no sensors involved, it is just a standalone C++ compilation. That is why it works.
Note that you can implement your own sensor sampling strategy to feed the buffer.
And I’ve been discussing with our embedded team today the possibility of supporting the Arduino Nano RP2040 through the Arduino sketch examples. I don’t have an answer yet, but it’s on their list of topics to discuss and prioritize (or reject or postpone).
Thank you Louis. I have implemented my own sampling strategy by copying the raw features from Edge Impulse Studio into the static_buffer sketch, and so far the results are similar to those in Edge Impulse Studio. So why does the terminal say it is connected to the /dev/ttyACM0 port? Is this triggered by default? See the screenshot above. Lastly, please kindly prioritize this: I am a volunteer mentor in the Arm Engage Challenge, currently working with over 200 upcoming professional engineers (first cohort) from over 7 African countries. We are looking at microcontrollers, TinyML, IoT, and definitely Edge Impulse for building our models and finally deploying them on the boards. The boards that Arm has donated so far are Arduino Nano RP2040s. I have covered the basic questions from the learners, as you can see from these articles. I need help completing this project so that others can do the same, and we can then quickly build on top of these foundations. So on my behalf and on behalf of everyone, kindly consider our case. What are your thoughts on deployment using the C++ library in the meantime? I can spare a few hours tomorrow to investigate this further so that I can also properly document it.
If you have several serial ports connected, it will ask you to choose one. For example, here I have several Arduino boards connected to my laptop:
luisomoreau@Louis-C02FD1LEMD6T ~ % edge-impulse-run-impulse
Edge Impulse impulse runner v1.16.3
? Which device do you want to connect to? (Use arrow keys)
/dev/tty.usbmodem101NTJJK02442 (LG Electronics Inc.)
❯ /dev/tty.usbmodem142201 (Arduino)
That’s great! I can see several familiar names among the mentors too (@oduor_c, @pete, Robert John).
Sure, either that or implement your own sensor sampling strategy and provide an example in a GitHub repository. That’s usually what I do when I teach IoT, embedded programming, or custom embedded ML programs.
Thank you for taking the time to reply to my previous post. I have a couple of things. 1. Kindly explain more about how to implement my own sensor sampling strategy. I am wondering if I have to stop using the Edge Impulse software?