Tags: python, deep-learning, lstm, training-data

ValueError: Expected input data to be non-empty


This is my code:

import numpy as np

x_test = []
y_test = dataset[training_data_len:, :]
for i in range(60, len(test_data)):
    x_test.append(test_data[i - 60:i, 0])

x_test = np.array(x_test)
x_test = np.reshape(x_test, (x_test.shape[0], x_test.shape[1], 1))

When I ran these lines:

predictions = model.predict(x_test)
predictions = scaler.inverse_transform(predictions)

I got this error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[26], line 1
----> 1 predictions = model.predict(x_test)
      2 predictions = scaler.inverse_transform(predictions)

File c:\Users\hemic\AppData\Local\Programs\Python\Python311\Lib\site-packages\keras\src\utils\traceback_utils.py:70, in filter_traceback.<locals>.error_handler(*args, **kwargs)
     67     filtered_tb = _process_traceback_frames(e.__traceback__)
     68     # To get the full stack trace, call:
     69     # `tf.debugging.disable_traceback_filtering()`
---> 70     raise e.with_traceback(filtered_tb) from None
     71 finally:
     72     del filtered_tb

File c:\Users\hemic\AppData\Local\Programs\Python\Python311\Lib\site-packages\keras\src\engine\data_adapter.py:1319, in DataHandler.__init__(self, x, y, sample_weight, batch_size, steps_per_epoch, initial_epoch, epochs, shuffle, class_weight, max_queue_size, workers, use_multiprocessing, model, steps_per_execution, distribute, pss_evaluation_shards)
   1314 self._configure_dataset_and_inferred_steps(
   1315     strategy, x, steps_per_epoch, class_weight, distribute
   1316 )
   1318 if self._inferred_steps == 0:
-> 1319     raise ValueError("Expected input data to be non-empty.")

ValueError: Expected input data to be non-empty.

How would I go about fixing this error?


Solution

  • Where is test_data defined, and is its length greater than 60?

    If len(test_data) <= 60, then range(60, len(test_data)) is empty, so the loop body never runs and x_test stays an empty array. That empty array is exactly what model.predict rejects with "Expected input data to be non-empty."

    A common fix for this kind of sliding-window setup is sketched below.
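
    A minimal sketch of the usual fix, assuming scaled_data is the scaled copy of dataset and training_data_len marks the train/test split point (neither appears in the question, so both names are assumptions). The key is to start the test slice 60 rows before the split, so every test sample has a full lookback window:

        import numpy as np

        # Include the last 60 rows of the training portion so that every
        # test sample still has a full 60-step lookback window.
        test_data = scaled_data[training_data_len - 60:, :]

        # Fail fast if the slice is still too short for even one window.
        assert len(test_data) > 60, (
            f"test_data has {len(test_data)} rows; need more than 60, "
            "otherwise the loop never runs and x_test stays empty."
        )

        x_test = []
        for i in range(60, len(test_data)):
            x_test.append(test_data[i - 60:i, 0])

        x_test = np.array(x_test)
        print(x_test.shape)  # sanity check: first dimension must be > 0
        x_test = np.reshape(x_test, (x_test.shape[0], x_test.shape[1], 1))

    If the shape check still prints a zero first dimension, the problem is upstream: either dataset itself is too short, or training_data_len already covers almost all of it, leaving fewer than 61 rows for the test slice.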