I have a model that takes a plain array and processes it within a GAN. It worked, but once I changed it to be multi-input, I started to get:
ValueError: Graph disconnected:
My original code:
# Build stacked GAN model
gan_input = Input(shape=Xtrain.shape[1])
H = generator(gan_input)
gd_input=Concatenate()([gan_input,H])
gan_V = discriminator(gd_input)
GAN = Model(gan_input, [gan_V,H])
GAN.compile(loss=['categorical_crossentropy','mse'], optimizer=opt) #Complete GAN have both loss functions
GAN.summary()
Then I modified it for multi-input:
gan_dataframe_input = Input(shape=Xtrain[1][:-2].shape) #new testing
numpy_input = Input(shape=Xtrain[1][-1].shape)
gan_input = layers.concatenate([gan_dataframe_input, numpy_input])
print(gan_input)
print(mergedLayer)
H = generator([gan_dataframe_input, numpy_input])  # <-- two inputs being fed in
gd_input = Concatenate()([gan_input, H])  # <-- merged layer + generator output
gan_V = discriminator(gd_input)
GAN = Model(gan_input, [gan_V, H])  # <-- this line raises the error
GAN.compile(loss=['categorical_crossentropy','mse'], optimizer=opt) #Complete GAN have both loss functions
GAN.summary()
Printed output and stack trace:
KerasTensor(type_spec=TensorSpec(shape=(None, 736), dtype=tf.float32, name=None), name='concatenate_28/concat:0', description="created by layer 'concatenate_28'")
KerasTensor(type_spec=TensorSpec(shape=(None, 736), dtype=tf.float32, name=None), name='concatenate_27/concat:0', description="created by layer 'concatenate_27'")
WARNING:tensorflow:Functional model inputs must come from `tf.keras.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "model_34" was not an Input tensor, it was generated by layer concatenate_28.
Note that input tensors are instantiated via `tensor = tf.keras.Input(shape)`.
The tensor that caused the issue was: concatenate_28/concat:0
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-94-ac83091846e6> in <module>()
69 gd_input=Concatenate()([gan_input,H])
70 gan_V = discriminator(gd_input)
---> 71 GAN = Model(gan_input, [gan_V,H])
72 GAN.compile(loss=['categorical_crossentropy','mse'], optimizer=opt) #Complete GAN have both loss functions
73 GAN.summary()
4 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/functional.py in _map_graph_network(inputs, outputs)
988 'The following previous layers '
989 'were accessed without issue: ' +
--> 990 str(layers_with_complete_input))
991 for x in nest.flatten(node.outputs):
992 computable_tensors.add(id(x))
ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(type_spec=TensorSpec(shape=(None, 659), dtype=tf.float32, name='input_71'), name='input_71', description="created by layer 'input_71'") at layer "concatenate_28". The following previous layers were accessed without issue: []
Oddly, looking at the full stack trace after printing data on the layers, it seems that the number of items in the arrays isn't aligned: (659,) is the size of one of the inputs, whereas the other is (77,). What am I doing wrong here?
When you build multi-input/multi-output models, you must pass the model's inputs and outputs as lists instead of concatenating them as you did. Moreover, the inputs of a model must always be `tf.keras.layers.Input` tensors. So the correct code would be:
gan_dataframe_input = Input(shape=Xtrain[1][:-2].shape) #new testing
numpy_input = Input(shape=Xtrain[1][-1].shape)
gan_input = layers.concatenate([gan_dataframe_input, numpy_input])
print(gan_input)
H = generator([gan_dataframe_input, numpy_input])  # <-- two inputs being fed in
gd_input = Concatenate()([gan_input, H])  # <-- merged layer + generator output
gan_V = discriminator(gd_input)
GAN = Model([gan_dataframe_input, numpy_input], [gan_V, H])  # <-- this line is modified
GAN.compile(loss=['categorical_crossentropy','mse'], optimizer=opt)  # the complete GAN has both loss functions
GAN.summary()
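For reference, here is a minimal self-contained sketch of the same pattern that builds end to end. The 659/77 feature sizes are taken from the error message above; the `Dense` layers are stand-ins for your real `generator` and `discriminator`, not the originals:

```python
import tensorflow as tf
from tensorflow.keras import Input, Model, layers

# Stand-in generator: takes the two raw inputs, outputs 736 features.
g_a = Input(shape=(659,))
g_b = Input(shape=(77,))
generator = Model([g_a, g_b], layers.Dense(736)(layers.concatenate([g_a, g_b])))

# Stand-in discriminator: takes the 736 merged inputs + 736 generator features.
d_in = Input(shape=(1472,))
discriminator = Model(d_in, layers.Dense(2, activation="softmax")(d_in))

# Stacked GAN: the Model's inputs are the Input layers themselves,
# passed as a list -- not the concatenated tensor.
gan_dataframe_input = Input(shape=(659,))
numpy_input = Input(shape=(77,))
gan_input = layers.concatenate([gan_dataframe_input, numpy_input])
H = generator([gan_dataframe_input, numpy_input])
gd_input = layers.Concatenate()([gan_input, H])
gan_V = discriminator(gd_input)
GAN = Model([gan_dataframe_input, numpy_input], [gan_V, H])
GAN.compile(loss=["categorical_crossentropy", "mse"], optimizer="adam")
```

Note that when you train this model, the data must also be fed as a list of two arrays matching the two `Input` layers, e.g. `GAN.train_on_batch([df_batch, np_batch], [y_batch, target_batch])`.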