Error when uploading model with BYOM

Question/Issue:
I want to import my own model into Edge Impulse, but get the error:

Job started
Converting SavedModel…
Scheduling job in cluster…
Container image pulled!
Job started
INFO: No representative features passed in, won’t quantize this model

Extracting saved model…
Extracting saved model OK

--saved-model /tmp/saved_model does not exist
Application exited with code 1

Converting SavedModel failed, see above
Job failed (see above)

Could you help me understand why this is happening? I trained my model, saved it in the TensorFlow SavedModel format, and then compressed it into a zip file.

Project ID:
385872

Hello @marion_p369,

Sorry for the late answer.

You can have a look at the troubleshooting section in the BYOM documentation page: Bring your own model (BYOM) | Documentation

This is a known error; make sure to upload a .zip archive that contains, at minimum, a saved_model directory containing your saved_model.pb.
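For illustration, the minimal layout the importer expects can be sketched like this (the file names here are empty placeholders; in a real project these entries come from exporting with tf.saved_model.save()):

```python
import zipfile

# Build an archive with the minimum structure Edge Impulse expects:
# the saved_model/ directory itself must be inside the zip, with
# saved_model.pb directly under it.
with zipfile.ZipFile("model.zip", "w") as zf:
    zf.writestr("saved_model/saved_model.pb", b"")
    zf.writestr("saved_model/variables/variables.index", b"")

# Inspect the resulting entries before uploading
print(zipfile.ZipFile("model.zip").namelist())
```

A common mistake is zipping only the loose files (so saved_model.pb ends up at the top level of the archive) instead of zipping the saved_model directory itself.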

Best,

Louis

Hello @louis,

What is the problem when you get these logs:

Converting SavedModel...
Scheduling job in cluster...
Container image pulled!
Job started
INFO: No representative features passed in, won't quantize this model

Extracting saved model...
Extracting saved model OK

Converting to TensorFlow Lite...
WARN: Failed to convert to TensorFlow Lite: SavedModel file does not exist at: /tmp/extracted_sm/{saved_model.pbtxt|saved_model.pb}

Application exited with code 1

Converting SavedModel failed, see above
Job failed (see above)

Thank you for your help!

Hello @pj_minervaas,

This seems to be the same error as above; see this troubleshooting section.

What is the structure of your folder in your .zip archive?
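A quick way to answer that without unzipping is to list the archive's entries. Here is a minimal sketch using only Python's standard zipfile module (the helper name is mine, not part of any Edge Impulse tooling) that checks for the saved_model/saved_model.pb entry mentioned above:

```python
import zipfile

def check_byom_zip(path):
    """Return (ok, entries) for a BYOM upload archive.

    ok is True when the archive contains saved_model/saved_model.pb,
    the minimum the importer expects per the troubleshooting docs.
    """
    with zipfile.ZipFile(path) as zf:
        entries = zf.namelist()
    ok = "saved_model/saved_model.pb" in entries
    return ok, entries
```

If ok comes back False, printing the entries usually makes the problem obvious (e.g. saved_model.pb sitting at the archive root instead of inside saved_model/).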

Best,

Louis

Hi @louis,

It worked after I renamed the ZIP file to “saved_model”. But then I got more errors, which I think come from my TensorFlow model.

Creating job... OK (ID: 24689120)

Scheduling job in cluster...
Container image pulled!
Job started
Converting SavedModel...
Scheduling job in cluster...
Container image pulled!
Job started
INFO: No representative features passed in, won't quantize this model

Extracting saved model...
Extracting saved model OK

Converting to TensorFlow Lite...
WARN: Failed to convert to TensorFlow Lite: Op type not registered 'SimpleMLCreateModelResource' in binary running on job-project-532838-24689126-dqu3g-fbrp5. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
You may be trying to load on a different device from the computational device. Consider setting the `experimental_io_device` option in `tf.saved_model.LoadOptions` to the io_device such as '/job:localhost'.

Application exited with code 1

Converting SavedModel failed, see above
Job failed (see above)

Thank you for your fast support!

Best regards!

Hello @pj_minervaas,

Indeed, it seems that you’re using an op that is not supported.
I am not familiar with that op; I had a quick look, and it seems to come from TensorFlow Decision Forests (TF-DF), correct?
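As a quick sanity check that needs no TensorFlow install: op names are serialized as plain strings inside saved_model.pb, so a simple byte search can tell whether a model references the custom op from the log. A minimal sketch (the helper name is mine):

```python
from pathlib import Path

def references_tfdf_op(pb_path):
    """Return True if saved_model.pb mentions the custom op from the
    error log above (SimpleMLCreateModelResource), which the stock
    TensorFlow Lite converter does not know about."""
    data = Path(pb_path).read_bytes()
    return b"SimpleMLCreateModelResource" in data
```

If this returns True for your saved_model.pb, the conversion failure is coming from that custom op rather than from the archive layout.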

Best,

Louis

Exactly, the model is from TensorFlow Decision Forests. I am currently trying to figure out a way to deploy this model for C/C++, but I just learned that TFLite does not yet support this library.

So I am looking for another way to make it work.

Best regards!