Tags: javascript, tensorflow, keras, tfjs-node, tensorflow-layers

Loading a keras model into tfjs causes input shape mismatch


I made a model in Keras and saved it as a tfjs model. I imported it into my JS project successfully. However, I am unable to use the model in my JS code because of an input shape error. The Keras model:

# Python
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(8, activation='sigmoid', input_dim=4),
    keras.layers.Dense(3, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])
model.fit(dx, dy, epochs=100, batch_size=5)  # dx, dy: training features and labels

It works as expected when I use it in Python, for example:

# Python
model.predict_classes([[5.1, 3.5, 1.4, 0.2]]) # This works

However, when I try the same in JS:

// JS
const sv = tf.tensor([s1v, s2v, s3v, s4v]); // s1v to s4v are floats
const pred = await model.predict(sv);       // This throws an error

I get this error:

tfjs@latest:17 Uncaught (in promise) Error: Error when checking : expected dense_13_input to have shape [null,4] but got array with shape [4,1].

I keep getting the same error with all of the following (each is sketched in code below):

  1. Declaring s1v...s4v as tf.scalars individually and stacking them with tf.stack()
  2. Declaring s1v...s4v as tf.tensor1d individually and stacking them with tf.stack()
  3. Declaring s1v...s4v in a single tf.tensor1d
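
For concreteness, a sketch of those three attempts (the shapes in the comments are what tfjs produces; none of them matches the [null,4] the model expects):

// JS
// Attempt 1: stacking four scalars gives shape [4]
const a1 = tf.stack([tf.scalar(s1v), tf.scalar(s2v), tf.scalar(s3v), tf.scalar(s4v)]);

// Attempt 2: stacking four one-element tensor1ds gives shape [4, 1]
const a2 = tf.stack([tf.tensor1d([s1v]), tf.tensor1d([s2v]),
                     tf.tensor1d([s3v]), tf.tensor1d([s4v])]);

// Attempt 3: a single tensor1d gives shape [4]
const a3 = tf.tensor1d([s1v, s2v, s3v, s4v]);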

Please help me fix this...


Solution

  • There are two fixes to this issue that I know of; both come down to giving the input a leading batch dimension so its shape is [1, 4] rather than [4] or [4, 1].
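
A minimal sketch of both fixes (reusing the s1v...s4v floats from the question; the model is assumed to already be loaded, e.g. via tf.loadLayersModel):

// JS
// Fix 1: build the tensor with an explicit [1, 4] shape (one batch of four features)
const sv = tf.tensor2d([[s1v, s2v, s3v, s4v]]);

// Fix 2: keep the 1-D tensor and add the batch dimension afterwards
const sv2 = tf.tensor1d([s1v, s2v, s3v, s4v]).expandDims(0); // shape [1, 4]

const pred = model.predict(sv);                   // shape [1, 3]: softmax probabilities
const classIndex = pred.argMax(-1).dataSync()[0]; // index of the most likely class

Note that model.predict() in tfjs returns a tensor synchronously, so the await in the question is unnecessary (though harmless).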