I'm following a tutorial on data augmentation by Aladdin Persson. At the time it was recorded (about three years ago), some preprocessing modules in TensorFlow, such as RandomFlip and Resizing, were still experimental, so the tutorial uses calls like layers.experimental.preprocessing.Resizing(...). The code ran without errors back then, but when I run it now it no longer does. This is the error and traceback I am getting:
Traceback (most recent call last):
  File "c:\Users\USER\CODE\PycharmProjects\pythonProject2Conda\main4.py", line 54, in <module>
    layers.experimental.preprocessing.Resizing(height=32, width=32),
    ^^^^^^^^^^^^^^^^^^^
AttributeError: module 'keras._tf_keras.keras.layers' has no attribute 'experimental'
Here is the actual code body; from my understanding, it is supposed to train my neural network with data augmentation.
import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers, regularizers
import tensorflow_datasets as tfds
import pandas as pd
from tensorflow.keras.optimizers.legacy import Adam
#HYPER PARAMETERS
(ds_train, ds_test), ds_info = tfds.load(
    "cifar10",
    split=["train", "test"],
    shuffle_files=True,
    as_supervised=True,
    with_info=True,
)

def normalize_img(image, label):
    return tf.cast(image, tf.float32) / 255.0, label
AUTOTUNE = tf.data.experimental.AUTOTUNE
BATCH_SIZE = 32
def augment(image, label):
    new_height = new_width = 32
    image = tf.image.resize(image, (new_height, new_width))
    if tf.random.uniform((), minval=0, maxval=1) < 0.1:
        image = tf.tile(tf.image.rgb_to_grayscale(image), [1, 1, 3])
    image = tf.image.random_brightness(image, max_delta=0.1)
    image = tf.image.random_contrast(image, lower=0.1, upper=0.2)
    image = tf.image.random_flip_left_right(image)  # 50% of cases will be flipped left and right
    # image = tf.image.random_flip_up_down(image)  # 50%
    return image, label
ds_train = ds_train.map(normalize_img, num_parallel_calls=AUTOTUNE)
ds_train = ds_train.cache()
ds_train = ds_train.shuffle(ds_info.splits["train"].num_examples)
#ds_train = ds_train.map(augment, num_parallel_calls=AUTOTUNE)
ds_train = ds_train.batch(BATCH_SIZE)
ds_train = ds_train.prefetch(AUTOTUNE)
ds_test = ds_test.map(normalize_img, num_parallel_calls=AUTOTUNE)
ds_test = ds_test.batch(BATCH_SIZE)
ds_test = ds_test.prefetch(AUTOTUNE)
data_augmentation = keras.Sequential([
    layers.experimental.preprocessing.Resizing(height=32, width=32),
    layers.experimental.preprocessing.RandomFlip(mode="horizontal"),
    layers.experimental.preprocessing.RandomContrast(factor=0.1),
])
model = keras.Sequential([
    keras.Input((32, 32, 3)),
    layers.Conv2D(4, 3, padding="same", activation='relu'),
    layers.Conv2D(8, 3, padding="same", activation='relu'),
    layers.MaxPooling2D(),
    layers.Conv2D(16, 3, activation='relu'),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10),
])
model.compile(
    optimizer=keras.optimizers.Adam(3e-4),
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(ds_train, epochs=5, verbose=2)
model.evaluate(ds_test)
It shouldn't be particularly surprising that "experimental" features from three years ago aren't experimental anymore! Moving new features out of that stage is the goal, after all.
RandomFlip, Resizing, and RandomContrast are now fully-fledged layers, and the .experimental.preprocessing prefix has been removed from their paths to reflect this. Replace your data_augmentation assignment with this:
data_augmentation = keras.Sequential([
    layers.Resizing(height=32, width=32),
    layers.RandomFlip(mode="horizontal"),
    layers.RandomContrast(factor=0.1),
])
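As a side note, the data_augmentation pipeline is defined but never applied anywhere in the posted code. Here is a minimal sketch (not from the tutorial, so treat it as an assumption about how you want to use it) of the two usual ways to wire it in: drop it in as the first layer of the model, so the random transforms run only during training and are skipped at inference, or map it over the training dataset.

# Option 1: include the augmentation pipeline inside the model itself.
model = keras.Sequential([
    keras.Input((32, 32, 3)),
    data_augmentation,  # only active while training
    layers.Conv2D(4, 3, padding="same", activation='relu'),
    # ... remaining layers unchanged ...
])

# Option 2: apply it in the tf.data pipeline instead; training=True forces
# the random transforms on, since the layers are called outside a model here.
ds_train = ds_train.map(
    lambda image, label: (data_augmentation(image, training=True), label),
    num_parallel_calls=AUTOTUNE,
)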