I am using tf.keras.preprocessing.image_dataset_from_directory to load a large dataset. The problem is that training is very slow when I pass this dataset to
fit_generator(), even though I am using the Google Colab GPU.
The code is:
image_size = (224, 224)
batch_size = 32
train_dataset = tf.keras.preprocessing.image_dataset_from_directory(
    '/content/drive/My Drive/dataScience/september exam/data/trainImg',
    seed=1337,
    image_size=image_size,
    batch_size=batch_size,
)
for the training:
model.fit_generator(train_dataset, epochs=50, verbose=1)
You can try reducing the image size to 128×128 and lowering the batch_size, and make sure the Colab runtime is actually set to GPU (Runtime → Change runtime type). Also, fit_generator() is deprecated in recent versions of TensorFlow; you should call model.fit() instead, which accepts a tf.data.Dataset directly. Hope this helps with time optimization.
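Beyond shrinking the images, the biggest speedup usually comes from the tf.data pipeline itself: cache() keeps decoded images in memory after the first epoch, and prefetch() overlaps data loading with GPU computation. Below is a minimal sketch, assuming TensorFlow 2.x; the directory path and parameters are taken from the question, and a small synthetic dataset stands in for the real one so the snippet runs anywhere:

```python
import tensorflow as tf

image_size = (128, 128)  # smaller images -> less I/O and less compute
batch_size = 32

# With the questioner's real data you would load it like this:
# train_dataset = tf.keras.preprocessing.image_dataset_from_directory(
#     '/content/drive/My Drive/dataScience/september exam/data/trainImg',
#     seed=1337, image_size=image_size, batch_size=batch_size)

# Synthetic stand-in: 8 blank images with dummy integer labels.
train_dataset = tf.data.Dataset.from_tensor_slices(
    (tf.zeros([8, *image_size, 3]), tf.zeros([8], dtype=tf.int32))
).batch(batch_size)

# cache() stores elements after the first pass; prefetch() lets the CPU
# prepare the next batch while the GPU trains on the current one.
train_dataset = train_dataset.cache().prefetch(tf.data.AUTOTUNE)

# model.fit() accepts a tf.data.Dataset directly (fit_generator is deprecated):
# model.fit(train_dataset, epochs=50, verbose=1)
```

With cache() the expensive JPEG decoding and resizing happen only in the first epoch, so epochs 2 through 50 read straight from memory; prefetch(tf.data.AUTOTUNE) lets TensorFlow pick the buffer size itself.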