'dict' object has no attribute 'shape'

My Neural Network gives this error while training it.

Creating job… OK (ID: 943051)

Scheduling job in cluster…
Job started
Splitting data into training and validation sets…
Splitting data into training and validation sets OK
Traceback (most recent call last):
  File "/home/train.py", line 266, in <module>
    main_function()
  File "/home/train.py", line 128, in main_function
    SPECIFIC_INPUT_SHAPE)
  File "./resources/libraries/ei_tensorflow/training.py", line 190, in get_datasets
    train_dataset = get_dataset_standard(X_train, Y_train)
  File "./resources/libraries/ei_tensorflow/training.py", line 118, in get_dataset_standard
    output_shapes=(tf.TensorShape(X_values[0].shape), tf.TensorShape(Y_values[0].shape)))
AttributeError: 'dict' object has no attribute 'shape'

Application exited with code 1 (Error)

Job failed (see above)

@dansitu Looks like a bug. Any ideas for a workaround?

Just a comment, probably trivial:

X_values[0], Y_values[0], or both are the problem.
They should be NumPy arrays (or a similar TensorFlow data structure), but they are actually dicts, which have no shape attribute.
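A minimal sketch of the diagnosis above: a NumPy array carries a .shape attribute, while a plain dict does not, so passing a dict sample into tf.TensorShape(sample.shape) fails exactly as in the traceback. The dict key below ("boundingBoxes") is hypothetical, just to suggest object-detection-style labels; the real keys depend on the dataset.

```python
import numpy as np

# A well-formed sample: a NumPy array, which has a .shape attribute.
sample_ok = np.zeros((96, 96, 3))
print(sample_ok.shape)  # (96, 96, 3)

# What the training script apparently received instead: a dict
# (hypothetically, object-detection labels), which has no .shape.
sample_bad = {"boundingBoxes": [{"x": 0, "y": 0, "w": 10, "h": 10}]}
try:
    sample_bad.shape
except AttributeError as err:
    print(err)  # 'dict' object has no attribute 'shape'
```

This is why the error points at the data rather than the model: the dataset builder assumes array-like samples, and dict-shaped labels (as produced by object detection projects) break that assumption.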

Hi @mikepuzzo,

Looks like you were running an object detection project. Have you fixed the issue by switching to the Object Detection block?

Aurelien


Hi, I'm also facing the same issue. I'm using the simple visual editor: I've added my data and want to run a simple NN model, but I keep getting this error: 'dict' object has no attribute 'shape'.

    output_shapes=(tf.TensorShape(X_values[0].shape), tf.TensorShape(Y_values[0].shape)))

AttributeError: 'dict' object has no attribute 'shape'

Application exited with code 1 (Error)

I'm switching to the Python notebook version to check now, but even if I fix the issue there, I'm not sure how to import the model back into Edge Impulse for model compression. Any help would be really appreciated. Thank you.

Hello @anupamguptacal,

Under the Create Impulse view, can you try adding an Object Detection learning block instead of the NN Classifier? This should solve your issue.

Regards,

Louis