Using custom learning rate schedulers

Hi, great product you have here!

I’m trying to add a custom learning rate scheduler using the “advanced” mode for an NN block.

Because of a bug in TF2, this is not possible. I get the following error after training is complete:
“ValueError: Unknown decay: DoubleCosineDecay!”

I believe that after training the NN, you try to read back the “best” epoch using something like:
model = tf.keras.models.load_model('saved-models/model112')

If you changed that to:

model = tf.keras.models.load_model('saved-models/model112', compile=False)
model.compile()

Then that should work fine. (I run into this problem all the time when working in Jupyter notebooks with TF, and this is the currently accepted workaround.)
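
As an aside, if the scheduler class happens to be importable wherever the model is read back, passing it via custom_objects should also avoid the deserialization error (DoubleCosineDecay here is just my custom class; substitute whatever schedule is defined in the block):

model = tf.keras.models.load_model(
    'saved-models/model112',
    custom_objects={'DoubleCosineDecay': DoubleCosineDecay},
)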

Hope you can help on this
Cheers
N

Hi @nigelcroft,

Thanks for your feedback!
You’re right about reading back the best epoch. One quick way to bypass it is to remove callbacks=callbacks from the model.fit() call, though this means the best model won’t be loaded.
Maybe @dansitu has a better way to handle this case (overriding the load_model function?).
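
Something along these lines, as a rough and untested sketch (the compile arguments would need to match whatever the block actually generates):

import tensorflow as tf

def load_model_uncompiled(path):
    # Skip optimizer deserialization so the custom decay class isn't looked up,
    # then re-compile with a fresh optimizer so the model is usable again.
    model = tf.keras.models.load_model(path, compile=False)
    model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
    return model

model = load_model_uncompiled('saved-models/model112')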

Aurelien

Hi @nigelcroft, thank you for the bug report! Unfortunately there’s no good way to hack around it in the advanced editor, but we’ll work on a fix for this issue ASAP.

Thanks guys! Good to know you’re on it :smile:

This should hopefully be fixed now—give it a try and let me know how it goes!

Perfect - works like a charm. Many thanks for your help on this!
