I have a google-cloud-ml model that I can run prediction against by passing a 3-dimensional array of float32...
{ 'instances' : [ { 'input' : [ [ [ 0.0 ], [ 0.5 ], [ 0.8 ] ], ... ] } ] }
However, this is not an efficient format for transmitting images, so I'd like to pass base64-encoded PNG or JPEG instead. This document talks about doing that, but what is not clear is what the entire JSON object looks like. Does the { 'b64' : 'x0welkja...' } go in place of the [ [ [ 0.0 ], [ 0.5 ], [ 0.8 ] ], ... ], leaving the enclosing 'instances' and 'input' the same? Or is there some other structure? Or does the TensorFlow model have to be trained on base64?
The TensorFlow model does not have to be trained on base64 data. Leave your training graph as is. However, when exporting the model, you'll need to export a model that can accept PNG or JPEG (or possibly raw, if it's small) data, and you'll need to be sure to use a name for the input that ends in _bytes. This signals to CloudML Engine that you will be sending base64-encoded data. Putting it all together will look something like this:
import tensorflow as tf
from tensorflow.contrib.saved_model.python.saved_model import utils

# Shape of [None] means we can have a batch of images.
image = tf.placeholder(shape=[None], dtype=tf.string)
# Decode the images. decode_jpeg expects a single image, so map it
# over the batch dimension.
decoded = tf.map_fn(
    lambda img: tf.image.decode_jpeg(img, channels=3),
    image, dtype=tf.uint8)
# Do the rest of the processing.
scores = build_model(decoded)

# The input name needs to have "_bytes" suffix.
inputs = {'image_bytes': image}
outputs = {'scores': scores}
utils.simple_save(session, export_dir, inputs, outputs)
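If you want to sanity-check that the graph really accepts raw image bytes before deploying, a quick local run along these lines should work (this is just an illustrative sketch; test.jpg and the freshly initialized weights are placeholders, not part of the answer above):

with open('test.jpg', 'rb') as f:
    jpeg_bytes = f.read()

with tf.Session() as session:
    # Random weights are fine here; we only care that the decode path runs.
    session.run(tf.global_variables_initializer())
    print(session.run(scores, feed_dict={image: [jpeg_bytes]}))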
The request you send will look something like this:
{
  "instances": [{
    "b64": "x0welkja..."
  }]
}
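For completeness, here is a rough sketch (not part of the answer above) of how you might base64-encode an image and build that request body in Python; the file name and the project/model names are placeholders:

import base64
import json

with open('test.jpg', 'rb') as f:
    b64_string = base64.b64encode(f.read()).decode('utf-8')

# Matches the request shape shown above: one instance, keyed by "b64".
body = {'instances': [{'b64': b64_string}]}
print(json.dumps(body)[:60] + '...')

# The body can then be POSTed to the model's predict endpoint, e.g. with
# the Google API Python client:
#   from googleapiclient import discovery
#   ml = discovery.build('ml', 'v1')
#   response = ml.projects().predict(
#       name='projects/MY_PROJECT/models/MY_MODEL', body=body).execute()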