Tags: flutter, camera, tflite

Unable to Detect Emotion Using Flutter Camera and TFLite Model


I'm currently working on a Flutter project where I'm trying to detect emotions using my phone's camera and a TFLite model. However, when I run the program, it doesn't seem to detect any emotion. I've followed tutorials and checked my code multiple times, but I'm unable to find the issue. I'm seeking guidance on how to troubleshoot and resolve this problem.

Here's a summary of my setup and the issue:

  1. I'm using the camera and tflite_v2 packages in Flutter to access the device's camera and run the TFLite model for emotion detection.

  2. I've placed the TFLite model (model.tflite) and labels file (labels.txt) in the assets directory and referenced them correctly in my pubspec.yaml file.

  3. I've initialized the camera controller and started the image stream to capture frames from the camera.

  4. I'm running the TFLite model on each frame captured from the camera to detect emotions.

  5. However, when I run the program, it doesn't detect any emotion. The output remains empty, and there are no errors or exceptions thrown.
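For reference, step 2 assumes a pubspec.yaml roughly like the fragment below. The package versions are illustrative placeholders, not taken from the question; adjust them to your project.

```yaml
dependencies:
  flutter:
    sdk: flutter
  camera: ^0.10.5   # version is an example, not from the question
  tflite_v2: ^1.0.0 # version is an example, not from the question

flutter:
  assets:
    - assets/model.tflite
    - assets/labels.txt
```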

Here's my home.dart:

import 'package:camera/camera.dart';
import 'package:flutter/material.dart';
import 'package:tflite_v2/tflite_v2.dart';

class Home extends StatefulWidget {
  const Home({Key? key}) : super(key: key);

  @override
  State<Home> createState() => _HomeState();
}

class _HomeState extends State<Home> {
  // Nullable rather than `late`: both fields are read before async init
  // completes, and reading an unassigned `late` field throws at runtime.
  CameraImage? cameraImage;
  CameraController? cameraController;
  String output = '';
  bool isModelBusy = false; // Track the status of the interpreter

  @override
  void initState() {
    super.initState();
    loadModel();
    loadCamera();
  }

  Future<void> loadModel() async {
    await Tflite.loadModel(
      model: 'assets/model.tflite',
      labels: 'assets/labels.txt',
      numThreads: 1,
    );
  }

  Future<void> loadCamera() async {
    final cameras = await availableCameras();
    if (cameras.isNotEmpty) {
      cameraController = CameraController(cameras[0], ResolutionPreset.medium);
      await cameraController!.initialize();
      if (!mounted) return;
      setState(() {}); // Rebuild once the preview is ready
      cameraController!.startImageStream((CameraImage image) {
        // No setState per frame here; runModel updates the UI when
        // a prediction arrives, which avoids rebuilding on every frame.
        cameraImage = image;
        runModel();
      });
    } else {
      // Handle the case where no cameras are available
      print('No cameras available');
    }
  }

  Future<void> runModel() async {
    // Skip this frame if the interpreter is still busy
    if (isModelBusy) {
      return;
    }
    isModelBusy = true;
    try {
      if (cameraImage != null && cameraImage!.planes.isNotEmpty) {
        List<dynamic>? predictions = await Tflite.runModelOnFrame(
          bytesList: cameraImage!.planes.map((plane) {
            return plane.bytes;
          }).toList(),
          imageHeight: cameraImage!.height,
          imageWidth: cameraImage!.width,
          imageMean: 127.5,
          imageStd: 127.5,
          rotation: 90,
          numResults: 2,
          threshold: 0.1,
          asynch: true,
        );
        if (predictions != null && predictions.isNotEmpty) {
          setState(() {
            output = predictions[0]['label'];
          });
        }
      }
    } finally {
      // Release the flag even if inference throws, so later frames still run
      isModelBusy = false;
    }
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text('Emotion Test'),
      ),
      body: Column(
        children: [
          Padding(
            padding: EdgeInsets.all(20),
            child: Container(
              height: MediaQuery.of(context).size.height * 0.7,
              width: MediaQuery.of(context).size.width,
              child: cameraController == null ||
                      !cameraController!.value.isInitialized
                  ? Container()
                  : AspectRatio(
                      aspectRatio: cameraController!.value.aspectRatio,
                      child: CameraPreview(cameraController!),
                    ),
            ),
          ),
          Text(output, style: TextStyle(fontSize: 20)),
        ],
      ),
    );
  }

  @override
  void dispose() {
    cameraController?.dispose();
    Tflite.close();
    super.dispose();
  }
}

Additionally, these are the debug console logs I got:

W/CameraManagerGlobal(23445): [soar.cts] ignore the status update of camera: 6
W/CameraManagerGlobal(23445): [soar.cts] ignore the status update of camera: 7
W/CameraManagerGlobal(23445): ignore the torch status update of camera: 3
W/CameraManagerGlobal(23445): ignore the torch status update of camera: 4
W/CameraManagerGlobal(23445): ignore the torch status update of camera: 5
W/CameraManagerGlobal(23445): ignore the torch status update of camera: 6
2199 D/VRI : Cancelling draw. cancelDueToPreDrawListener=true cancelDueToSync=false
Lost connection to device.

I tried changing the versions of the dependencies, thinking it might be a version compatibility issue, and I also tried different packages, but it still didn't work.


Solution

    1. Try setting the rotation to 0. Depending on the device, the 90-degree rotation may not be needed; it is usually applied because the camera plugin outputs a 90-degree-rotated image.
    2. Experiment with the mean and standard deviation. Some models expect mean = 0 and std = 255 rather than 127.5 for both.
    3. Consider using the tflite_flutter package instead of tflite_v2, since it is the more recent and actively maintained option.
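For point 3, a minimal sketch of loading and running a model with tflite_flutter might look like the following. The input shape (1x48x48x1 grayscale) and the 7 output classes are assumptions for a typical emotion model, not details from the question; check your model's actual tensors with `interpreter.getInputTensors()` before copying these shapes.

```dart
import 'package:tflite_flutter/tflite_flutter.dart';

// Sketch only: load a TFLite model and run one inference.
// Shapes below are assumed (48x48 grayscale in, 7 emotion classes out);
// verify them against your own model before use.
Future<List<double>> classifyOnce(
    List<List<List<List<double>>>> input) async {
  final interpreter = await Interpreter.fromAsset('assets/model.tflite');

  // Output buffer shaped [1, 7] to receive the class scores.
  final output = List.generate(1, (_) => List.filled(7, 0.0));

  interpreter.run(input, output);
  interpreter.close();
  return output[0]; // Raw scores per emotion label
}
```

Unlike tflite_v2, tflite_flutter gives you the raw interpreter, so preprocessing (resizing, grayscale conversion, normalization) and mapping scores back to labels.txt are your responsibility.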