android, kotlin, machine-learning, tensorflow-lite, tflite

TensorFlow Lite: java.lang.AssertionError: "Does not support data type INT32" in Android Studio


I am developing an Android application in Android Studio and using a TensorFlow Lite model. When running the app, I encounter the following error:

    java.lang.AssertionError: TensorFlow Lite does not support data type INT32

Below is the relevant part of my code:

import org.tensorflow.lite.DataType
import org.tensorflow.lite.support.tensorbuffer.TensorBuffer

// Prepare input tensor (inputShape and flatArray are defined earlier)
val inputFeature0 = TensorBuffer.createFixedSize(inputShape, DataType.FLOAT32)
inputFeature0.loadArray(flatArray)

// Run inference
val outputs = model?.process(inputFeature0)
val rawOutputBuffer = outputs?.outputFeature0AsTensorBuffer

// Extract raw data as IntArray or FloatArray based on the data type
val outputArray = when (rawOutputBuffer?.dataType) {
    DataType.INT32 -> rawOutputBuffer.intArray // Directly access INT32 data
    DataType.FLOAT32 -> rawOutputBuffer.floatArray.map { it.toInt() }.toIntArray() // Convert FloatArray to IntArray
    else -> throw IllegalArgumentException("Unsupported output tensor data type: ${rawOutputBuffer?.dataType}")
}
  1. The input tensor is of type FLOAT32, and the input data is loaded correctly using TensorBuffer.createFixedSize() and loadArray().

  2. When processing the model's output tensor (outputFeature0AsTensorBuffer), I added checks to handle both FLOAT32 and INT32 outputs.

  3. Despite this, the app crashes with the error indicating that TensorFlow Lite does not support INT32.

What I have tried:

  • Ensured that the input tensor uses FLOAT32.

  • Verified that the TensorFlow Lite model is compatible with the FLOAT32 data type (a sketch of such a check is shown below).

  • Checked the output tensor data type and added handling for both FLOAT32 and INT32.

What I expected: I expected the model inference to run without issues since I have handled both FLOAT32 and INT32 output cases.
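
A minimal sketch of such a check, assuming the model ships as an Android asset (the file name model.tflite is a placeholder) and that the raw Interpreter API is available:

import android.content.Context
import android.util.Log
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.support.common.FileUtil

// Log the declared input/output tensor types and shapes of the model.
// "model.tflite" is a placeholder asset name; substitute your own file.
fun logTensorTypes(context: Context) {
    val modelBuffer = FileUtil.loadMappedFile(context, "model.tflite")
    Interpreter(modelBuffer).use { interpreter ->
        val input = interpreter.getInputTensor(0)
        val output = interpreter.getOutputTensor(0)
        Log.d("TFLite", "input:  ${input.dataType()} ${input.shape().contentToString()}")
        Log.d("TFLite", "output: ${output.dataType()} ${output.shape().contentToString()}")
    }
}

Logging both tensors makes it immediately clear whether the INT32 comes from the model's own output signature or from somewhere else in the pipeline.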

Questions:

  1. Why does TensorFlow Lite throw this error even when FLOAT32 is used for the input tensor?

  2. Is the error related to how TensorFlow Lite internally manages data types, tensor dimensions, or metadata?

  3. How can I resolve this error and ensure that TensorFlow Lite runs inference successfully?


Solution

  • Print the output tensor and check its data type; this looks more like a bug in TensorFlow itself.

    If the problem really is in the output tensor, try this conversion:

    val outputArray = when (rawOutputBuffer?.dataType) {
        DataType.FLOAT32 -> rawOutputBuffer.floatArray.map { it.toInt() }.toIntArray() // Convert FloatArray to IntArray
        else -> throw IllegalArgumentException("Unsupported output tensor data type: ${rawOutputBuffer?.dataType}")
    }

    Hope this helps you solve your problem. (If the output tensor really is INT32, a workaround sketch follows below.)
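
If inspection shows that the output tensor really is INT32, note that the support library's TensorBuffer generally supports only FLOAT32 and UINT8, which is a plausible source of this AssertionError. A minimal workaround sketch, assuming a single FLOAT32 input and a single INT32 output (the asset name model.tflite is again a placeholder): run the raw Interpreter and read the output through a direct ByteBuffer.

import android.content.Context
import java.nio.ByteBuffer
import java.nio.ByteOrder
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.support.common.FileUtil

// Run inference with the raw Interpreter so the INT32 output can be read
// directly, bypassing TensorBuffer. "model.tflite" is a placeholder.
fun runModel(context: Context, flatArray: FloatArray): IntArray {
    val modelBuffer = FileUtil.loadMappedFile(context, "model.tflite")
    Interpreter(modelBuffer).use { interpreter ->
        // Pack the FLOAT32 input into a direct buffer (4 bytes per float).
        val inputBytes = ByteBuffer.allocateDirect(flatArray.size * 4)
            .order(ByteOrder.nativeOrder())
        inputBytes.asFloatBuffer().put(flatArray)

        // Size the output buffer from the tensor's declared shape.
        val outSize = interpreter.getOutputTensor(0).shape().fold(1) { acc, d -> acc * d }
        val outputBytes = ByteBuffer.allocateDirect(outSize * 4)
            .order(ByteOrder.nativeOrder())

        interpreter.run(inputBytes, outputBytes)

        // Reinterpret the raw output bytes as INT32 values.
        outputBytes.rewind()
        val result = IntArray(outSize)
        outputBytes.asIntBuffer().get(result)
        return result
    }
}

Going through a ByteBuffer keeps the byte order explicit and avoids assumptions about which primitive buffer types a given TFLite version accepts as outputs.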