Wasm standalone inference targeting wasmtime runtime

Hi

I’m trying to integrate Edge Impulse WebAssembly models into a wasmtime runtime (Elixir, via wasmex).

So far I only get:

wasmtime run ~/Downloads/tutorial_-continuous-motion-recognition-wasm-v59/browser/edge-impulse-standalone.wasm
Error: failed to run main module /Users/adrianibanez/Downloads/tutorial_-continuous-motion-recognition-wasm-v59/browser/edge-impulse-standalone.wasm

Caused by:
0: failed to instantiate "/Users/adrianibanez/Downloads/tutorial_-continuous-motion-recognition-wasm-v59/browser/edge-impulse-standalone.wasm"
1: unknown import: env::abort has not been defined
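For reference, a missing `env::abort` import (typically Emscripten/AssemblyScript glue) can be stubbed out on the host side before instantiation. A minimal sketch with wasmtime's Rust API follows; the four-`i32` `abort` signature is an assumption, so check the module's actual imports first:

```rust
// Sketch: stub the Emscripten-style `env::abort` import so the module
// instantiates under wasmtime (crate: wasmtime). The abort signature used
// here is an assumption; the import listing printed below shows the truth.
use wasmtime::{Engine, Linker, Module, Store};

fn main() -> anyhow::Result<()> {
    let engine = Engine::default();
    let module = Module::from_file(&engine, "edge-impulse-standalone.wasm")?;

    // Print the imports the module actually expects.
    for import in module.imports() {
        println!("{}::{}", import.module(), import.name());
    }

    let mut linker = Linker::new(&engine);
    // Stub `env::abort`; adjust the parameter list to match the listing above.
    linker.func_wrap(
        "env",
        "abort",
        |_msg: i32, _file: i32, _line: i32, _col: i32| {
            panic!("wasm module called abort()");
        },
    )?;

    let mut store = Store::new(&engine, ());
    let _instance = linker.instantiate(&mut store, &module)?;
    Ok(())
}
```

The same idea should carry over to wasmex, which lets you pass host-defined imports at instantiation time.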

I found this repo but I’m not really sure how to use it.

Could I get a hint in the right direction?
Has anyone used wasmtime to run inference? Or other runtimes on iOS / Android?

In the meantime I made some progress, but the wasm interface looks quite low-level :woozy_face: Is the output of all WebAssembly deployments structured in the very same way? E.g. could I do one integration loop and always rely on that interface? Or are the wasm + JavaScript files generated dynamically according to the Edge Impulse project setup?

Is there any way there could be a higher-level abstraction on the wasm file with less emphasis on the JavaScript layer? I have no plans / desire to enable JavaScript in the wasmtime/wasmex/Elixir runtime. The target platforms are mobiles and smaller “embedded” (Nerves project) devices.

Could compiling the C++ source code to wasm be a more feasible approach to achieve compatibility with wasmtime? Or rather, does that make sense at all? I haven’t thought too much about how to deploy Edge Impulse models onto mobile targets (iOS / Android), but assumed that wasm could be a good strategy with good multi-platform characteristics.

I saw some information about GPU-accelerated Android Edge Impulse inference. Does that apply to iOS too? I will have a look at the available mobile app.

Any recommendations / expertise would be very helpful.


Have you considered providing a Rust version of the model deployments? The ecosystem regarding wasm support seems quite good, as far as I understand.
I’m specifically asking because I already use Rust (via Rustler) to interface with Elixir and provide multi-platform support for things like BLE sensors.
Having Rust code would be quite interesting in all those applications, e.g. as wasm integrated in Elixir, or as a native library on different platforms.


Having looked at the options for compiling to a wasm target with C++, I would like to suggest a feature for building a WASI wasm deployment deliverable based on GitHub - WebAssembly/wasi-sdk: WASI-enabled WebAssembly C/C++ toolchain. They provide Docker builder images.

It would also be nice to have a function exposing the model_parameters, so that downstream wasm runtimes like wasmex, or end clients, have an entrypoint for selecting / implementing encoders without much friction.
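To make the idea concrete, here is a sketch of what such a guest-side export could look like with a plain C ABI. Everything here is hypothetical (the function names, the JSON fields, and the values are illustrations, not the Edge Impulse SDK's actual API); returning a (pointer, length) pair from a static buffer keeps the host-side call simple:

```rust
// Sketch (hypothetical API): expose model parameters as a JSON string
// through two C-ABI exports, so any wasm host can read them without
// extra glue. Field names and values below are made up for illustration.
static PARAMS_JSON: &str =
    r#"{"frequency":62.5,"input_axes":3,"window_size_ms":2000,"label_count":4}"#;

#[no_mangle]
pub extern "C" fn get_model_parameters_json_ptr() -> *const u8 {
    // Pointer into linear memory; valid for the lifetime of the instance.
    PARAMS_JSON.as_ptr()
}

#[no_mangle]
pub extern "C" fn get_model_parameters_json_len() -> usize {
    PARAMS_JSON.len()
}
```

A host would then read `len` bytes from the instance's memory at `ptr` and decode them as UTF-8 JSON.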

What is your opinion on GitHub - WebAssembly/wasi-nn: Neural Network proposal for WASI? Any plans to support that at some point?

Wasm Artifact: https://github.com/adiibanez/wasi_edge_impulse/actions/runs/13912366627/artifacts/2768927220

There are still a number of issues, but I can instantiate the wasm in wasmex and at least look up exports / imports. The run_ functions and get_model_parameters_json are exported, but there are also a lot of mangled functions. It would be nice to have a little less noise in terms of exports; I’m not sure if that is possible with wasm.

When I try to call get_model_parameters_json I get the following exception:

(MatchError) no match of right hand side value: {:error, "unknown import: wasi_snapshot_preview1::args_get has not been defined"}
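That error means the host never wired up the WASI preview1 imports (`args_get` and friends) before instantiation. With wasmtime's Rust API the fix looks roughly like the sketch below; note the `wasmtime-wasi` paths have moved between versions, so treat this as the general shape rather than exact code (wasmex also has a WASI option at instantiation time, as far as I know):

```rust
// Sketch: provide WASI preview1 imports (args_get, fd_write, ...) via
// wasmtime-wasi before instantiating. Module paths follow an older
// wasmtime-wasi release and may differ in current versions.
use wasmtime::{Engine, Linker, Module, Store};
use wasmtime_wasi::sync::WasiCtxBuilder;
use wasmtime_wasi::WasiCtx;

fn main() -> anyhow::Result<()> {
    let engine = Engine::default();
    let module = Module::from_file(&engine, "edge_impulse.wasm")?;

    // Register all wasi_snapshot_preview1 host functions on the linker.
    let mut linker: Linker<WasiCtx> = Linker::new(&engine);
    wasmtime_wasi::add_to_linker(&mut linker, |ctx| ctx)?;

    // A minimal WASI context; inherit_stdio lets the guest print.
    let wasi = WasiCtxBuilder::new().inherit_stdio().build();
    let mut store = Store::new(&engine, wasi);

    let _instance = linker.instantiate(&mut store, &module)?;
    Ok(())
}
```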

Maybe someone a bit less “rust-y” with Make / C++ could have a look? I’m also not too familiar with the WASI API. It took me quite a few LLM queries and head-scratchers to implement the proof of concept.

As far as I understand, the file should run on any of the wasm runtimes (wasmtime, WasmEdge, …). I have only tried wasmtime, via the wasmex Elixir binding.


Any chance I could get some feedback from the team regarding wasm WASI, wasi-nn, WASI components, …?

Are there any plans or at least the interest to support those APIs in any form at some point in the future?


Hi @adiibi

Oh wow very cool to see this from a community member! @louis @brianmcfadden @davidtischler_edgeim check out this repo!

You are correct, Android is the best way to go, and we have started to build some blogs and tutorials around this. This is the best place to start: On Android | Edge Impulse Documentation

Let me share your repo with the team. Presently we have been working on Android as a WASM alternative for mobile, but you raise a good point on Rust, and yes, we do have some Rust enthusiasts within our engineering team (@ferjm_ei looking at you :smiley: )

Best

Eoin

Nice work @adiibi , great job on putting that repo together!


Hi @Eoin, glad you liked it. It doesn’t have much usefulness in its current state, to be honest. At least the WASI CI pipeline parts might serve some purpose in saving someone’s time. I’m not sure if I’m the right person to integrate a C / C++ codebase like yours, but it served as a C++ WASI “proof of concept” and I had some hopes of catching someone’s attention at EI … so thank you very much :wink:

What are the reasons for Android as a wasm alternative?

I understand that e.g. the WASI component API might be too volatile in its current stage of evolution. But targeting multiple platforms with one single binary / API instead of two independent integration efforts seems like a good approach to me, especially if multiple models should play together or run concurrently. In my case, running the same binary on the backend and on multiple mobile platforms could be very valuable.

My own motivation for exploring wasm is as an alternative to lengthy and delicate static-lib builds (C++ and Rust libs). In my current implementation all of those libs get mushed into one Erlang OTP binary, and it is very easy to break that build pipeline.

Also, supporting Apple mobile targets would be imperative in my use case. EI doesn’t support any Apple targets, right? Or have you tried with the C++ library?

Some degree of official deployment-target support from EI for a “sane” wasm WASI API and / or Rust would be awesome! @ferjm_ei Ojalá! :wink:

Hi @Eoin @davidtischler_edgeim @ferjm_ei

As a “token” of my commitment to building end-user products based on EI and WASI, I put in a little more time and “wasm tooling pain” to actually build a WASI component. These are also first baby steps towards making it actually do something useful; I’m not sure yet, as I haven’t got the boilerplate code up and running to integrate it.

To be honest, I have no plans to maintain such an EI abstraction without “upstream” integration in EI Studio and / or the EI SDK.
Does wasm WASI resonate with you guys at all? Is this something that could be supported as a deployment target from your side?

build/edge_impulse.wit

package edge-impulse:ml;

interface model {
  get-model-name: func() -> string;
  get-input-shape: func() -> list<u32>;
  run-inference: func(input: list<float32>) -> tuple<list<float32>, float32>;
}

world edge-impulse {
  export model;
}
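For anyone curious what the guest side of this world could look like in Rust, here is a sketch using wit-bindgen's `generate!` macro. The generated module paths and the `export!` macro name vary between wit-bindgen versions, and all the return values below are placeholders, not real model output:

```rust
// Sketch: guest-side implementation of the `edge-impulse` WIT world via
// wit-bindgen (crate: wit-bindgen). Paths under `exports::...` are derived
// from the package/interface names and may differ between versions.
wit_bindgen::generate!({
    world: "edge-impulse",
    path: "build/edge_impulse.wit",
});

struct Model;

impl exports::edge_impulse::ml::model::Guest for Model {
    fn get_model_name() -> String {
        // Placeholder name; a real build would bake in the project name.
        "continuous-motion-recognition".to_string()
    }

    fn get_input_shape() -> Vec<u32> {
        // Placeholder shape: e.g. axes x samples per window.
        vec![3, 125]
    }

    fn run_inference(_input: Vec<f32>) -> (Vec<f32>, f32) {
        // Placeholder: this is where the call into the Edge Impulse
        // C++ SDK (or a reimplementation) would go.
        (vec![0.0; 4], 0.0)
    }
}

export!(Model);
```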