Tags: javascript, html, tensorflow, tensorflow.js, tensorflowjs-converter

Keep getting: "Attempted import error: 'loadFrozenModel' is not exported from '@tensorflow/tfjs-converter'"


I'm trying to convert a Tensorflow-for-Poets model to a Tensorflow.js model, so I can use the model in a front-end environment, like a website. I was trying to follow this tutorial: https://gist.github.com/woudsma/d01eeda8998c9ab972d05ec9e9843886

I've followed all the directions, but when I try to run the app on localhost, I keep getting the titular error:

src/index.js
Attempted import error: 'loadFrozenModel' is not exported from 
'@tensorflow/tfjs-converter'

I've trained the Tensorflow model by following the Tensorflow-for-Poets codelab mentioned above.

I also looked at these previously asked questions:

http://www.github.com/tensorflow/tfjs/issues/149

http://www.stackoverflow.com/questions/49718162/tfjs-converter-html-javascript-trouble-importing-class

But neither of these fixed my issue.

This is the example project from the tutorial. It contains the code that I also used in my project: https://github.com/woudsma/retrain-mobilenet-for-the-web

I can't find anything about this specific error. Does anyone know what's going wrong?

PS: This is also my first question posted to Stack Overflow, so let me know if anything is missing from or wrong about this post.

EDIT: Added my index.js:

import { loadFrozenModel } from '@tensorflow/tfjs-converter'
import labels from './labels.json'

const ASSETS_URL = `${window.location.origin}/assets`
const MODEL_URL = `${ASSETS_URL}/mobilenet-v2/tensorflowjs_model.pb`
const WEIGHTS_URL = `${ASSETS_URL}/mobilenet-v2/weights_manifest.json`
const IMAGE_SIZE = 224 // Model input size

const loadModel = async () => {
  const model = await loadFrozenModel(MODEL_URL, WEIGHTS_URL)
  const input = tf.zeros([1, IMAGE_SIZE, IMAGE_SIZE, 3])
  // Warm up GPU
  // model.predict({ input }) // MobileNet V1
  model.predict({ Placeholder: input }) // MobileNet V2
  return model
}

const predict = async (img, model) => {
  const t0 = performance.now()
  const image = tf.fromPixels(img).toFloat()
  const resized = tf.image.resizeBilinear(image, [IMAGE_SIZE, IMAGE_SIZE])
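  // Normalize pixel values from [0, 255] to [-1, 1], the input range MobileNet expects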
  const offset = tf.scalar(255 / 2)
  const normalized = resized.sub(offset).div(offset)
  const input = normalized.expandDims(0)
  // const output = await tf.tidy(() => model.predict({ input })).data() // MobileNet V1
  const output = await tf.tidy(() => model.predict({ Placeholder: input })).data() // MobileNet V2
  const predictions = labels
    .map((label, index) => ({ label, accuracy: output[index] }))
    .sort((a, b) => b.accuracy - a.accuracy)
  const time = `${(performance.now() - t0).toFixed(1)} ms`
  return { predictions, time }
}

const start = async () => {
  const input = document.getElementById('input')
  const output = document.getElementById('output')
  const model = await loadModel()
  const predictions = await predict(input, model)
  output.append(JSON.stringify(predictions, null, 2))
}

start()

EDIT: I also added the HTML file, just to be sure.

<!DOCTYPE html>
<html lang="en">
  <head>
    <title>Image classifier</title>
  </head>
  <body>
    <img id="input" src="assets/images/some-flower.jpg" />
    <pre id="output"></pre>
  </body>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@latest"></script>
</html>

Solution

  • import { loadFrozenModel } from '@tensorflow/tfjs-converter'

    loadFrozenModel is not exported from @tensorflow/tfjs-converter; it lives in the @tensorflow/tfjs namespace instead. Since you've already included the CDN script, you only need to load the model with tf.loadFrozenModel:

    const model = await tf.loadFrozenModel(MODEL_URL, WEIGHTS_URL)
    

    Also, tf.fromPixels has been moved to tf.browser.fromPixels.
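
    Putting both fixes together, a minimal sketch of the corrected index.js could look like this (assuming the global tf provided by the CDN script, plus the same MODEL_URL, WEIGHTS_URL, IMAGE_SIZE and labels as in your post):

    // No converter import needed; tf comes from the CDN <script> tag
    const loadModel = async () => {
      const model = await tf.loadFrozenModel(MODEL_URL, WEIGHTS_URL)
      // Warm up the GPU with an all-zeros input
      model.predict({ Placeholder: tf.zeros([1, IMAGE_SIZE, IMAGE_SIZE, 3]) }) // MobileNet V2
      return model
    }

    const predict = async (img, model) => {
      const image = tf.browser.fromPixels(img).toFloat() // was tf.fromPixels
      const resized = tf.image.resizeBilinear(image, [IMAGE_SIZE, IMAGE_SIZE])
      const offset = tf.scalar(255 / 2)
      const normalized = resized.sub(offset).div(offset) // scale to [-1, 1]
      const input = normalized.expandDims(0)
      const output = await tf.tidy(() => model.predict({ Placeholder: input })).data()
      return labels
        .map((label, index) => ({ label, accuracy: output[index] }))
        .sort((a, b) => b.accuracy - a.accuracy)
    }

    Note that newer 1.x releases of TensorFlow.js deprecate loadFrozenModel in favor of tf.loadGraphModel, so if tf.loadFrozenModel is undefined with the @latest CDN build, pinning the tfjs version used by the tutorial may help.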