Feature Request: TensorFlow.js export

Could Edge Impulse please round out its TensorFlow support by adding dashboard export support for TensorFlow.js?

The export would ideally be a zipped model.json with its binary shard files, in float32 and preferably also with int8 quantization.

Presently Edge Impulse supports:

TensorFlow: Python

TensorFlow Lite: Java

TensorFlow Lite for Microcontrollers: C++

but there is no direct download for JavaScript; getting there currently means running a Python script to do the conversion.

Here are my attempts with the command-line converters, going from TFJS to C++ headers, with quantization:


tensorflowjs_converter --input_format=tfjs_layers_model --output_format=keras_saved_model ./model.json ./
# the tflite_convert commands below all read the saved_model.pb produced above
tflite_convert --keras_model_file ./ --output_file ./model.tflite
xxd -i model.tflite model.h
tflite_convert --saved_model_dir=./ --inference_type=QUANTIZED_UINT8 --inference_output_type=tf.uint8 --mean_value=128 --std_value=127 --output_file=./model_Q_UINT8.tflite
xxd -i model_Q_UINT8.tflite model_Q_UINT8.h
tflite_convert --saved_model_dir=./ --inference_type=tf.uint8 --inference_output_type=tf.uint8 --mean_value=128 --std_value=127 --output_file=./model_tf_UINT8.tflite
xxd -i model_tf_UINT8.tflite model_tf_UINT8.h
tflite_convert --saved_model_dir=./ --inference_type=tf.int8 --inference_output_type=tf.int8 --mean_value=128 --std_value=127 --output_file=./model_tf_INT8.tflite
xxd -i model_tf_INT8.tflite model_tf_INT8.h
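For reference, the `xxd -i` step in the commands above just dumps the binary model as a C array. A minimal plain-JavaScript sketch of that layout (`toCHeader` is a hypothetical helper for illustration, not part of any toolchain):

```javascript
// Sketch: emit a C header from a byte buffer, mimicking the
// `unsigned char name[] = {...}; unsigned int name_len = N;` layout of xxd -i.
function toCHeader(name, bytes) {
  const body = Array.from(bytes)
    .map((b) => '0x' + b.toString(16).padStart(2, '0'))
    .join(', ');
  return (
    `unsigned char ${name}[] = {\n  ${body}\n};\n` +
    `unsigned int ${name}_len = ${bytes.length};\n`
  );
}
```

In practice the real `xxd -i` wraps long lines, but the array contents and the `_len` variable are what the C++ side actually consumes.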


Many people begin programming in JavaScript, so this could be another entry point into Edge Impulse for web developers.

Please like this post, as I am having a friendly competition with @dansitu


Hi @Rocksetta, so the tflite model alone does not do much (except for images, and not even then, as you need to convert the color space), because many things depend on the preprocessing and DSP code. So I’d like to keep the WebAssembly output as the preferred way of deploying to the web.


Thanks for the reply @janjongboom. I guess I will move on to other projects.

I was working on ways to drum up support for this. :laughing:


@janjongboom @dansitu

I am not good at giving up.

On the off chance it would work, I converted a TensorFlow SavedModel to a TensorFlow.js graph model, and got a webcam feeding a canvas to do the analysis.

I think the prediction inputs are formatted differently from how Edge Impulse does it.
Any suggestions? I think this is close but not perfect:


const image = await tf.browser.fromPixels(document.getElementById('my224x224CanvasA')).toFloat().reshape([1, 224, 224, 3]);


For images this looks roughly correct! If you’re using a TensorFlow SavedModel exported from Edge Impulse it will expect an input between 0 and 1 for each pixel.


Thanks @dansitu

I did manage to run processed data from Edge Impulse as a 1D tensor reshaped to [-1, 224, 224, 3] and got the correct classification.
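For anyone following along, that reshape to [-1, 224, 224, 3] is just index arithmetic over the flat buffer. A plain-JavaScript sketch of the row-major, channels-last layout that tf.js assumes (`pixelAt` is my own hypothetical helper, not a tf.js API):

```javascript
// Read channel c of pixel (y, x) from a flat [height * width * channels]
// buffer, matching the layout of .reshape([-1, height, width, channels]).
function pixelAt(flat, y, x, c, width = 224, channels = 3) {
  return flat[(y * width + x) * channels + c];
}
```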

But I am still not getting the webcam canvas data correctly formatted.

My website about it is at


Your last statement is probably the issue: I have not normalized the RGB values from 0-255 down to 0-1.

I will check out how to do that.
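For the record, that normalization is just a divide by 255. A minimal plain-JavaScript sketch, no tf.js needed (in tf.js the equivalent is `.div(tf.scalar(255))`):

```javascript
// Scale 0-255 byte pixel values to floats in [0, 1].
function normalizePixels(bytes) {
  return Array.from(bytes, (v) => v / 255);
}
```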

Holy smoke, @dansitu it worked. Thanks for the hints.

     const myImageTensor = tf.browser.fromPixels(document.getElementById('my224x224Canvas')).toFloat().reshape([-1, 224, 224, 3]).div(tf.scalar(255)) ;

Excellent, I’m glad to hear that it worked!

Is the dashboard’s TensorFlow SavedModel export taken from before the EON compiler runs?

Could an EON compiler model even be saved as a TensorFlow graph model, which when converted to TensorFlow.js becomes a frozen model? Even better would be a TensorFlow.js layers model, whose individual layers can be retrained. I know Edge Impulse is presently not going that route, so the question is: could it even be done, or is the EON compiler model so different that it couldn’t be converted to a TensorFlow.js layers model at all?

Hey, so all EON does is take a tflite file and spit out C++ source code - the underlying model remains the same. EON thus doesn’t make sense for TensorFlow.js; you’d just take the TensorFlow model and convert it to something suitable for tf.js.

For WebAssembly we can use EON, as there we do tflite -> C++ -> WebAssembly, but it just saves space.

Thanks so much @janjongboom, I thought EON was doing something with the TFLite model. That is really cool.

I am glad I have both the WebAssembly output and the TensorFlow SavedModel output working in the web browser, in my demo menu here. That gives me much more flexibility, and now I know when to use each method:

I can use the TensorFlow SavedModel converted to TFJS when I want to bridge students from TensorFlow.js to Edge Impulse. (Model making in Edge Impulse is a joy compared to doing it in the browser.)

Here is a laughable way that I used to train a TFJS program in the browser.

I can use the WebAssembly method when my students just need the best speed and the smallest memory footprint to demonstrate their machine learning models, before trying to get them working on the Arduino Portenta.

Thanks for replying.