Tags: deeplearning4j, dl4j, computation-graph

dl4j ComputationGraph error: "cannot do forward pass: inputs not set"


I'm trying to pass multiple inputs to a ComputationGraph, but keep getting the error "cannot do forward pass: inputs not set". What is my mistake here?

        ArrayList<String> inAndOutNames = new ArrayList<>();
        String[] inputNames = new String[inputAmount];
        InputType[] inputTypes = new InputType[inputAmount + 1];
        for(int i = 0; i < inputAmount; i++)
        {
            inAndOutNames.add("bit" + i);
            inputNames[i] = "bit" + i;
            inputTypes[i] = InputType.recurrent(1);
        }
        inAndOutNames.add("p");
        inputTypes[inputAmount] = InputType.recurrent(1);

        ComputationGraphConfiguration configuration = new NeuralNetConfiguration.Builder()
                .weightInit(WeightInit.XAVIER)
                .updater(new Adam(0.001))
                .seed(seed)
                .graphBuilder()
                .addInputs(inAndOutNames)
                .setInputTypes(inputTypes)
                .addLayer("l1", new LSTM.Builder().nIn(inputAmount).nOut(128).activation(Activation.TANH).build(), inputNames)
                .addLayer("l2", new LSTM.Builder().nIn(128).nOut(256).build(), "l1", "p")
                .addLayer("lOut", new DenseLayer.Builder().nIn(256).nOut(10).build(), "l2")
                .setOutputs("lOut")
                .build();

        model = new ComputationGraph(configuration);

        model.init();
        model.setListeners(new ScoreIterationListener(iterationsBetweenScores));

I have already tried several different combinations of layers and vertices, but have not yet found anything that works.


Solution

  • You're feeding more than one input into a single LSTM layer. To use multiple inputs in a neural network, you have to structure the graph for that explicitly; this is true in any framework. No framework lets you arbitrarily attach extra inputs to an arbitrary layer: if a layer accepts one input, it accepts exactly one input.

    Your network can accept any number of inputs, but at some point they need to converge into a single path for most operations to work. In DL4J you do this with a MergeVertex, which concatenates the activations of its inputs along the feature dimension.

    Here is an example using MergeVertex:

    ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
            .updater(new Sgd(0.01))
            .graphBuilder()
            .addInputs("input1", "input2")
            .addLayer("L1", new DenseLayer.Builder().nIn(3).nOut(4).build(), "input1")
            .addLayer("L2", new DenseLayer.Builder().nIn(3).nOut(4).build(), "input2")
            .addVertex("merge", new MergeVertex(), "L1", "L2")
            // MergeVertex concatenates activations: 4 + 4 = 8 features into "out"
            .addLayer("out", new OutputLayer.Builder().nIn(4 + 4).nOut(3).build(), "merge")
            .setOutputs("out")
            .build();
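
    Applied to the configuration in the question, one way to wire it would be to merge all of the recurrent inputs first and feed the concatenated result into the first LSTM. This is a sketch, not a tested drop-in: the names and sizes are taken from the question, and swapping the final DenseLayer for an RnnOutputLayer (with an assumed MCXENT loss) is my addition, since DL4J expects a graph's outputs to be actual output layers with a loss function.

    ```java
    // Sketch: merge the inputAmount "bit" inputs plus "p" (each recurrent with
    // 1 feature, as in the question) before the first LSTM.
    ComputationGraphConfiguration configuration = new NeuralNetConfiguration.Builder()
            .weightInit(WeightInit.XAVIER)
            .updater(new Adam(0.001))
            .seed(seed)
            .graphBuilder()
            .addInputs(inAndOutNames)            // "bit0".."bitN" and "p"
            .setInputTypes(inputTypes)
            // Concatenate all (inputAmount + 1) single-feature inputs along the
            // feature dimension -> inputAmount + 1 features per time step
            .addVertex("merge", new MergeVertex(), inAndOutNames.toArray(new String[0]))
            .addLayer("l1", new LSTM.Builder().nIn(inputAmount + 1).nOut(128)
                    .activation(Activation.TANH).build(), "merge")
            .addLayer("l2", new LSTM.Builder().nIn(128).nOut(256).build(), "l1")
            // Assumption: an RnnOutputLayer with a classification loss, since
            // setOutputs() must point at an output layer
            .addLayer("lOut", new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                    .activation(Activation.SOFTMAX).nIn(256).nOut(10).build(), "l2")
            .setOutputs("lOut")
            .build();
    ```

    Note that this also removes the extra "p" input on "l2"; once everything is merged upstream, each layer receives exactly one input, which is what resolves the "inputs not set" error.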