When the tf.keras.utils.get_file line finishes downloading (or the file is already there), my script just freezes, as if it were calling sleep() forever. I'm pretty new to TensorFlow and Python, so I have no idea why this would happen.
I am using TensorFlow 2.16.1 and Python 3.12.4, if that helps.
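To rule out a genuine hang, the extraction can also be done by hand with the stdlib tarfile module so there is visible progress instead of a silent wait. This is just a diagnostic sketch of mine, not part of the tutorial code; the filter='data' argument assumes Python 3.12+:

import tarfile
import tensorflow as tf

url = "https://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar.gz"

# download only (no untar), so get_file returns the path to the .tar.gz
archive = tf.keras.utils.get_file("aclImdb_v1.tar.gz", url,
                                  cache_dir='.', cache_subdir='')

# extract entry by entry so progress is visible
with tarfile.open(archive) as tar:
    members = tar.getmembers()
    for i, member in enumerate(members):
        tar.extract(member, path='.', filter='data')  # filter arg needs Python 3.12+
        if i % 5000 == 0:
            print(f"extracted {i} of {len(members)} entries")

My full script follows.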
import matplotlib.pyplot as plt
import os
import re
import shutil
import string
import tensorflow as tf
from tensorflow.keras import layers # type: ignore
from tensorflow.keras import losses # type: ignore
url = "https://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar.gz"

# downloads the archive into the current directory and untars it there
dataset = tf.keras.utils.get_file("aclImdb_v1", url,
                                  untar=True, cache_dir='.',
                                  cache_subdir='')
print("test")  # only prints once extraction is finished

dataset_dir = os.path.join(os.path.dirname(dataset), 'aclImdb')
os.listdir(dataset_dir)

train_dir = os.path.join(dataset_dir, 'train')
os.listdir(train_dir)

sample_file = os.path.join(train_dir, 'pos/1181_9.txt')
with open(sample_file) as f:
    print(f.read())
Update: I have realized that nothing in this code is wrong; it just takes a very long time to extract all the files from the tar archive that get_file() downloads. Thanks to everyone who tried to help.
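For anyone who runs into the same thing, here is a minimal sketch of one way to avoid re-extracting on every run (assuming the archive unpacks to ./aclImdb next to the script, which is what cache_dir='.' and cache_subdir='' give):

import os
import tensorflow as tf

url = "https://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar.gz"

# only download and untar when the extracted folder is not already there
if not os.path.isdir('aclImdb'):
    # this call is the slow part: the archive holds tens of thousands of
    # small text files, and extracting them all looks like a freeze
    tf.keras.utils.get_file("aclImdb_v1", url,
                            untar=True, cache_dir='.', cache_subdir='')

With that guard in place, reruns of the script skip straight past the extraction step.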