Tags: ios, swift, avcapturesession, avcapturedevice, avcaptureoutput

iPhone 7+, iOS 11.2: "Depth data delivery is not supported in the current configuration"


This bug is driving me mad. I'm trying to produce the absolute minimal code to get AVDepthData from an iPhone 7+ using its DualCam.

I have this code:


//
//  RecorderViewController.swift
//  ios-recorder-app
//


import UIKit
import AVFoundation


class RecorderViewController: UIViewController {

    @IBOutlet weak var previewView: UIView!

    @IBAction func onTapTakePhoto(_ sender: Any) {

        guard let capturePhotoOutput = self.capturePhotoOutput else { return }

        let photoSettings = AVCapturePhotoSettings()

        photoSettings.isDepthDataDeliveryEnabled = true //Error

        capturePhotoOutput.capturePhoto(with: photoSettings, delegate: self)

    }

    var session: AVCaptureSession?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?
    var capturePhotoOutput: AVCapturePhotoOutput?


    override func viewDidLoad() {
        super.viewDidLoad()

        AVCaptureDevice.requestAccess(for: .video, completionHandler: { _ in })

        let captureDevice = AVCaptureDevice.default(.builtInDualCamera, for: .depthData, position: .back)

        do {
            print(captureDevice!)
            let input = try AVCaptureDeviceInput(device: captureDevice!)

            self.capturePhotoOutput = AVCapturePhotoOutput()
            self.capturePhotoOutput?.isDepthDataDeliveryEnabled = true //Error

            self.session = AVCaptureSession()
            self.session?.addInput(input)

            self.videoPreviewLayer = AVCaptureVideoPreviewLayer(session: self.session!)
            self.videoPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
            self.videoPreviewLayer?.frame = view.layer.bounds
            previewView.layer.addSublayer(self.videoPreviewLayer!)

            self.session?.addOutput(self.capturePhotoOutput!)
            self.session?.startRunning()

        } catch {
            print(error)
        }

    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

}

extension RecorderViewController : AVCapturePhotoCaptureDelegate {

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        print(photo.depthData)

    }


}

If I comment out the lines marked with "Error", the code works as I would expect and prints nil for depthData.

However, leaving those lines in, I get an exception. The error message states: -[AVCapturePhotoOutput setDepthDataDeliveryEnabled:] Depth data delivery is not supported in the current configuration.

How do I change the "current configuration" so that depth delivery is supported?

I've watched this video: https://developer.apple.com/videos/play/wwdc2017/507/ which was helpful, and I believe I've followed the exact steps required to make this work.

Any tips would be gratefully received!


Solution

  • There were two things I needed to fix:

    1. Set the sessionPreset to a format that supports depth, such as .photo.
    2. Add the capturePhotoOutput to the session before setting .isDepthDataDeliveryEnabled = true.

    Here is my minimal code for getting depth with photos:

    
    //
    //  RecorderViewController.swift
    //  ios-recorder-app
    //
    
    import UIKit
    import AVFoundation
    
    
    class RecorderViewController: UIViewController {
    
        @IBOutlet weak var previewView: UIView!
    
        @IBAction func onTapTakePhoto(_ sender: Any) {
    
            guard let capturePhotoOutput = self.capturePhotoOutput else { return }
    
            let photoSettings = AVCapturePhotoSettings()
            photoSettings.isDepthDataDeliveryEnabled = true
    
            capturePhotoOutput.capturePhoto(with: photoSettings, delegate: self)
    
        }
    
        var session: AVCaptureSession?
        var videoPreviewLayer: AVCaptureVideoPreviewLayer?
        var capturePhotoOutput: AVCapturePhotoOutput?
    
    
        override func viewDidLoad() {
            super.viewDidLoad()
    
            AVCaptureDevice.requestAccess(for: .video, completionHandler: { _ in })
    
            let captureDevice = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back)
    
            print(captureDevice!.activeDepthDataFormat)
    
            do {
                let input = try AVCaptureDeviceInput(device: captureDevice!)
    
                self.capturePhotoOutput = AVCapturePhotoOutput()
    
                self.session = AVCaptureSession()
                self.session?.beginConfiguration()
                self.session?.sessionPreset = .photo
                self.session?.addInput(input)
    
                self.videoPreviewLayer = AVCaptureVideoPreviewLayer(session: self.session!)
                self.videoPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
                self.videoPreviewLayer?.frame = self.view.layer.bounds
                self.previewView.layer.addSublayer(self.videoPreviewLayer!)
    
                self.session?.addOutput(self.capturePhotoOutput!)
                self.session?.commitConfiguration()
                self.capturePhotoOutput?.isDepthDataDeliveryEnabled = true
                self.session?.startRunning()
            }
            catch {
                print(error)
            }
    
        }
    
        override func didReceiveMemoryWarning() {
            super.didReceiveMemoryWarning()
            // Dispose of any resources that can be recreated.
        }
    
    }
    
    extension RecorderViewController : AVCapturePhotoCaptureDelegate {
    
        func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
            print(photo.depthData)
        }
    
    
    }
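
    The ordering in step 2 matters because AVCapturePhotoOutput only reports accurate capabilities once it is attached to a configured session. As a defensive sketch (the helper function is my own addition; isDepthDataDeliverySupported is a real AVCapturePhotoOutput property on iOS 11+), you could guard the assignment rather than set it unconditionally:

    ```swift
    import AVFoundation

    // Hypothetical helper: enable depth delivery only when the output,
    // now attached to a configured session, reports support for it.
    func enableDepthIfSupported(on output: AVCapturePhotoOutput) -> Bool {
        // isDepthDataDeliverySupported is only meaningful after the output
        // has been added to a session with a depth-capable device and preset.
        guard output.isDepthDataDeliverySupported else { return false }
        output.isDepthDataDeliveryEnabled = true
        return true
    }
    ```

    Calling this between commitConfiguration() and startRunning(), in place of the unconditional assignment, would avoid the exception on devices or presets that don't support depth.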