SELU activation in TF Lite Micro

This request may belong more to the SIG Micro group, but here goes anyway.

Since BatchNorm is not supported by TF Lite Micro, it seems that using the SELU activation with the LeCun kernel initializer and normalisation of the inputs has a similar normalising effect, as suggested by Klambauer et al.

Implementing SELU in the TF Lite Micro branch used by Edge Impulse may provide a shortcut to BatchNorm-like performance, and may (@dansitu?) be easier than implementing BatchNorm itself.
I’m a noob to TF custom ops, so none of the above is easy for me, hence a feature request ;-).
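
For reference, the maths of the op itself is tiny. A quick NumPy sketch of what a SELU kernel would need to compute (the two constants are the ones from the Klambauer et al. paper; everything else here is just illustrative):

```python
import numpy as np

# SELU constants from Klambauer et al., "Self-Normalizing Neural Networks"
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    """Reference SELU: scale * x for x > 0, scale * alpha * (exp(x) - 1) otherwise."""
    x = np.asarray(x, dtype=np.float32)
    return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))
```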

Btw, NN training in expert mode acts in the most interesting non-deterministic ways when crunching through training sets while trying to use activation="selu", kernel_initializer="lecun_normal".
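
For context, this is roughly the expert-mode setup I’ve been trying. A minimal Keras sketch only, with inputs assumed to be standardised to zero mean / unit variance; the layer sizes, the AlphaDropout rate and the input_length / classes placeholders are just illustrative:

```python
import tensorflow as tf
from tensorflow.keras import layers

input_length = 33   # placeholder: number of input features
classes = 4         # placeholder: number of output labels

# Self-normalising setup: SELU activations, lecun_normal init, and AlphaDropout
# instead of plain Dropout so the self-normalising property is preserved.
model = tf.keras.Sequential([
    layers.Dense(64, activation="selu", kernel_initializer="lecun_normal",
                 input_shape=(input_length,)),
    layers.AlphaDropout(0.1),
    layers.Dense(32, activation="selu", kernel_initializer="lecun_normal"),
    layers.Dense(classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```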

Thanks @opprud, this is an interesting idea! Sorry for the delay in responding; I was on vacation last week :slight_smile:

I’d definitely recommend engaging SIG Micro if you’re interested in adding new kernels, since in my experience there are usually other folks interested in collaborating—or already doing the same work. We try to make our own contributions via the same pathway.

If you did decide to submit an RFC, we’d be happy to look over it and provide our agreement that it would be a helpful addition.

Warmly,
Dan

I think there are some issues with the BatchNormalization layer that caused hard faults on some systems, which is why we removed it from the UI, but give it a shot.

I’ve invited you both to edit the RFC draft https://github.com/opprud/tfLiteRFC. I hope you will add comments, and perhaps authors, before I submit it, if you agree it could benefit TF Lite Micro and, additionally, EI users :wink:

It’s here now: https://github.com/tensorflow/tensorflow/issues/47424 (as recommended by Pete W). If you want to support its adoption, please drop a comment or two :wink: