Keras export mode error

Hi,
I’m trying to train my model in Keras export mode in Google Colab.


In this section, I got an error message.

Do I have to download the .h5 file?

Hi @SunBeenMoon,

You’re trying to load a weights file that does not exist in the Studio.
One way to load pretrained weights is to convert your .h5 file to base64 and load it in our Expert mode.
See an example here: Retraining a pretrained model - transfer learning
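
For reference, here is a minimal sketch of the base64 conversion, assuming your weights are already saved locally; the file name below is just an example, not from your project:

import base64

# Read the saved Keras weights file and encode it as a base64 string
# so it can be pasted into Expert mode. The file name is an example.
with open('weights.h5', 'rb') as f:
    weights_b64 = base64.b64encode(f.read()).decode('ascii')

print(weights_b64[:80])  # preview the first characters of the encoded string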

Aurelien

Thank you for your reply! @aurel

Actually, I downloaded the .ipynb file and didn’t modify anything.
This is the same file that was first downloaded from “Switch to Keras export mode” in transfer learning.

So, do I understand correctly that I have to download the MobileNetV2 weights separately and put them in an .h5 file?

Or can I just use the downloaded file as it is, because the API is associated with Edge Impulse? When I used it as is, I got the error shown in the previous picture.

That’s a good point, I’ll check with the team about exporting the .h5 as part of the iPython notebook.
In the meantime I’ll DM you the weights file.

Aurelien


Thank you!! @aurel :blush:

Hi @aurel.

I’ve got another error message.

Section picture.

Error message.

Do I have to download another file?

Regards,

Hi @SunBeenMoon,

We are working on fixing those issues when exporting as a Jupyter notebook.
In the meantime you can modify some of the code in your notebook to make it work:

  • Use the set_batch_size() function instead of ei_tensorflow.training.set_batchsize(); it is defined earlier in the notebook (a sketch follows this list)

  • Remove the fine-tuning portion of the notebook (just after print('Initial training done.', flush=True))
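
As a rough illustration of the first change (the variable names and BATCH_SIZE below are assumptions, adjust them to match your notebook):

# Replace the ei_tensorflow.training.set_batchsize(...) call with the
# set_batch_size() helper already defined earlier in the notebook.
train_dataset, validation_dataset = set_batch_size(
    BATCH_SIZE, train_dataset, validation_dataset)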

Aurelien

Hi @aurel

Thank you for your kind reply :blush:

I’m not an expert in the ML/DL side; I’m an embedded engineer, so please bear with me if my question is basic.

I’ve got another issue.

Error part

Error log

I think I have to change the AUTOTUNE part of this line: ‘train_dataset = train_dataset.map(reshape, tf.data.experimental.AUTOTUNE)’.
Can you give me some advice?

My goal is to draw a graph to check whether the model is overfitting or not.

Regards,

There are actually a few functions to change:

# Batch both datasets, keeping the final partial batch
def set_batch_size(batch_size, train_dataset, validation_dataset):
    train_dataset = train_dataset.batch(batch_size, drop_remainder=False)
    validation_dataset = validation_dataset.batch(batch_size, drop_remainder=False)
    return train_dataset, validation_dataset

and:

# Set the data to the expected input shape
def reshape(image, label):
    return tf.reshape(image, (-1, INPUT_SHAPE[0], INPUT_SHAPE[1], INPUT_SHAPE[2])), label
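
For context, a rough sketch of how these two helpers could fit into the input pipeline; the variable names and the batch-then-reshape order are assumptions based on the -1 leading dimension in reshape(), so adjust to match your notebook:

import tensorflow as tf

BATCH_SIZE = 32  # assumption: use the batch size defined in your notebook

# Batch both datasets with the helper above, then reshape each batch
# to the model's expected input shape, mapping in parallel.
train_dataset, validation_dataset = set_batch_size(
    BATCH_SIZE, train_dataset, validation_dataset)
train_dataset = train_dataset.map(reshape, tf.data.experimental.AUTOTUNE)
validation_dataset = validation_dataset.map(reshape, tf.data.experimental.AUTOTUNE)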

I’ll send you my complete iPython notebook in DM.
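
And since your goal is to check for overfitting, here is a minimal plotting sketch, assuming model.fit(...) returned a History object named history in your notebook:

import matplotlib.pyplot as plt

# Plot training vs. validation loss from the Keras History object
plt.plot(history.history['loss'], label='training loss')
plt.plot(history.history['val_loss'], label='validation loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()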

Aurelien


Thank you so much @aurel

I was very moved by your kind reply!! :blush:
