Tags: javascript, tensorflow, tensorflow.js

How do you call tf.loadGraphModelSync with [ModelJSON, ArrayBuffer]


I'm trying to make a self-contained HTML document that contains and can run a TensorFlow.js model. I saved my model from Python as a .h5 file and then converted it with tensorflowjs_converter --input_format keras .... That produced a directory containing a model.json (which looks fine and seems to load correctly as far as I can tell) and the weights, all stored in a single file, group1-shard1of1.bin.

I took the contents of each of those files, base64-encoded them, and embedded them in my HTML file as two data URIs.

const modelURI = "data:application/json;base64,<Data Here>";
const weightsURI = "data:application/octet-stream;base64,<Data Here>";

Promise.all([fetch(modelURI), fetch(weightsURI)])
    .then(function(resp) {
        return Promise.all([resp[0].json(), resp[1].arrayBuffer()]);
    })
    .then(function(data) {
        tf.loadGraphModelSync([data[0], data[1]]);
    });

The problem is that loadGraphModelSync keeps complaining about

Uncaught (in promise) TypeError: n is null
    value graph_model.js:212
    value graph_model.js:180
    value graph_model.js:167
    loadGraphModelSync graph_model.js:699

I'm pretty sure the problem is with the ArrayBuffer I'm passing in, but I'm not experienced enough with tfjs (or even plain TensorFlow) to know for certain.

I checked out the pull request that added this option, and it mentioned that "the ArrayBuffer is a list of concatenated weights for the model", but I couldn't find any further information on what exactly that would look like. Does anybody happen to know how the weights I'm passing in here should be formatted? An alternative way to load the base64-encoded data would also be helpful.

Edit 1: After failing to figure out what was causing the issue in the minified TensorFlow.js bundle, I created a quick Bun project and installed the TensorFlow.js npm package, which gave me a more helpful error:

32091 |             if (metadata.structuredOutputKeys != null) {
32092 |                 this.structuredOutputKeys = metadata.structuredOutputKeys;
32093 |             }
32094 |         }
32095 |         this.signature = signature;
32096 |         this.version = "".concat(graph.versions.producer, ".").concat(graph.versions.minConsumer);
                                         ^
TypeError: null is not an object (evaluating 'graph.versions')
      at .../node_modules/@tensorflow/tfjs-converter/dist/tf-converter.node.js:32096:34
      at .../node_modules/@tensorflow/tfjs-converter/dist/tf-converter.node.js:32063:16
      at .../node_modules/@tensorflow/tfjs-converter/dist/tf-converter.node.js:32053:16
      at loadGraphModelSync (.../node_modules/@tensorflow/tfjs-converter/dist/tf-converter.node.js:32577:5)
      at .../src/index.ts:14:9

Solution

  • The issue was unrelated to the model weights: the Keras model format is a weights-only format (source). Because of that, the modelTopology field in the model.json was null, which is what caused the error.

    While reading through the tfjs source code I stumbled upon two relevant functions: encodeWeights and decodeWeights. The doc comment on decodeWeights, together with a read-through of the function, gave me the understanding I wanted. In my eyes, "A flat ArrayBuffer or an array of ArrayBuffers carrying the binary values of the tensors concatenated in the order specified in specs." is a much more precise description of what the function expects, and by extension of what the ArrayBuffer passed to loadGraphModelSync should look like. Once I re-saved my model as a SavedModel and reconverted it, the model loaded without issue.
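    For anyone else puzzled by the same phrase, here is a sketch of how I now understand the layout. This is only my interpretation of the decodeWeights doc comment, not tfjs API; sliceWeights and the spec objects are made up for illustration:

    ```javascript
    // Sketch: the "concatenated weights" ArrayBuffer is each tensor's raw
    // bytes laid out back to back, in the order given by the weight specs
    // from the weights manifest in model.json.
    function sliceWeights(buffer, specs) {
      const BYTES_PER_ELEMENT = { float32: 4, int32: 4, bool: 1 };
      let offset = 0;
      const slices = {};
      for (const spec of specs) {
        // Number of elements is the product of the shape dimensions.
        const numElements = spec.shape.reduce((a, b) => a * b, 1);
        const numBytes = numElements * BYTES_PER_ELEMENT[spec.dtype];
        slices[spec.name] = buffer.slice(offset, offset + numBytes);
        offset += numBytes;
      }
      return slices;
    }

    // Two float32 tensors, shapes [2] and [1], occupy 8 + 4 = 12 bytes total.
    const specs = [
      { name: "dense/kernel", shape: [2], dtype: "float32" },
      { name: "dense/bias", shape: [1], dtype: "float32" },
    ];
    const slices = sliceWeights(new ArrayBuffer(12), specs);
    console.log(slices["dense/kernel"].byteLength, slices["dense/bias"].byteLength); // 8 4
    ```

    The single group1-shard1of1.bin file produced by the converter already has exactly this layout, which is why it can be passed to loadGraphModelSync as-is.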
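    As for the question's last part, decoding the data URIs by hand avoids fetch() entirely and keeps the whole load synchronous. The helper name below is my own, not part of tfjs:

    ```javascript
    // Hypothetical helper (not a tfjs API): decode the base64 payload of a
    // data URI into an ArrayBuffer without going through fetch().
    function dataURIToArrayBuffer(uri) {
      const base64 = uri.slice(uri.indexOf(",") + 1);
      const binary = atob(base64); // base64 -> binary string
      const bytes = new Uint8Array(binary.length);
      for (let i = 0; i < binary.length; i++) {
        bytes[i] = binary.charCodeAt(i);
      }
      return bytes.buffer;
    }

    // Demo with a three-byte payload: "AAEC" is base64 for the bytes 0, 1, 2.
    const demo = new Uint8Array(
      dataURIToArrayBuffer("data:application/octet-stream;base64,AAEC")
    );
    console.log(Array.from(demo)); // [ 0, 1, 2 ]
    ```

    With that, the load becomes tf.loadGraphModelSync([JSON.parse(new TextDecoder().decode(dataURIToArrayBuffer(modelURI))), dataURIToArrayBuffer(weightsURI)]), assuming the model.json now carries a non-null modelTopology.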