Model using LSTM not deployable

The LSTM (UNIDIRECTIONAL_SEQUENCE_LSTM) operation is now supported by TFLite Micro. I am able to train a model using an LSTM layer, but deployment does not work with the EON Compiler. Without EON enabled, I can build and deploy to hardware, but the model fails to start at runtime. The EON build reports:

Compiling EON model...
Could not set up compiler
Didn't find op for builtin opcode 'UNIDIRECTIONAL_SEQUENCE_LSTM' version '1'. This model is not supported by EON Compiler of TensorFlow Lite Micro, but is in full TFLite (e.g. on Linux).

On the device with EON disabled, model allocation then fails at startup:

Failed to get registration from op code UNIDIRECTIONAL_SEQUENCE_LSTM
 
Failed starting model allocation.

I guess the deployment bundle is not using the latest TFLite Micro SDK. Please update the SDK, or let me know how I can solve this locally.
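
For what it’s worth, the error message notes the op is supported in full TFLite, so the exported .tflite file should run with the standard TFLite interpreter on a Linux host in the meantime. A quick check along these lines (the file name and random input are just placeholders):

```python
# Sketch: run the exported model with the *full* TFLite interpreter on a
# Linux host, which has a kernel for UNIDIRECTIONAL_SEQUENCE_LSTM.
# "model.tflite" is a placeholder for the file exported from the project.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Random data with the model's input shape, just to prove inference runs.
dummy = np.random.rand(*inp["shape"]).astype(inp["dtype"])
interpreter.set_tensor(inp["index"], dummy)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))
```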

Hi @naveen we’re currently working on bringing the latest TFLM kernels into Edge Impulse (@AIWintermuteAI is working on it). However, this takes a while, as we have thousands of models and 50+ boards across 20 different architectures that all need to be verified as still working (we have gotten very good at finding bugs in TF, CMSIS-NN and other vendor libraries :slight_smile: ). This will also enable subgraph support in the EON Compiler, which will be useful for RNNs in general.

If you can email me a TFLite file w/ example inputs I’ll take a look and see if we can backport the op into the current SDK (jan@edgeimpulse.com).

Hi @janjongboom,

I have added you as a collaborator to a test project (Project ID: 10419) with data and a bare-minimum model that uses a single LSTM layer. Please have a look.
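
The model is essentially a single Keras LSTM layer followed by a dense classifier, converted with the standard TFLite converter; roughly along these lines (the input shape and layer sizes here are placeholders, not the exact project values):

```python
# Sketch of a bare-minimum LSTM model. With the default (non-unrolled)
# Keras LSTM, the TFLite converter typically fuses the layer into the
# UNIDIRECTIONAL_SEQUENCE_LSTM builtin that triggers the error above.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(16, input_shape=(64, 3)),   # placeholder: 64 timesteps, 3 channels
    tf.keras.layers.Dense(4, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```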

Thanks,
Naveen

Hi @naveen, thanks. Interesting that there’s now a single fused op for LSTMs rather than having to deal with subgraphs, so that’s good. Given the complexity of this, I’ll wait for @AIWintermuteAI’s work to land.
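
For anyone else who wants to check which form a converted model ended up in, the TFLite model analyzer in recent TensorFlow releases prints the op list; something like this (the file name is a placeholder):

```python
# Sketch: inspect a converted model to see whether it contains the single
# fused UNIDIRECTIONAL_SEQUENCE_LSTM op or a control-flow subgraph
# (e.g. WHILE ops) from an unfused LSTM. "model.tflite" is a placeholder.
import tensorflow as tf

tf.lite.experimental.Analyzer.analyze(model_path="model.tflite")
```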
