I am trying to use the AVCamFilter Apple sample project discussed in this WWDC session to get depth data using the dual camera. The project has built-in features to get depth data from the dual camera.
When the sample project was written, builtInDualWideCamera didn't exist yet, and the project only tries to get builtInDualCamera and builtInWideAngleCamera. When I run the project on my iPad Pro, it doesn't show any of the depth-related UI because the device doesn't have a builtInDualCamera device. So I added builtInDualWideCamera to the videoDeviceDiscoverySession, and it seems to get that device properly, but isDepthDataDeliverySupported still returns false.
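For reference, the change looks roughly like this (simplified from the sample project's session setup, with names abbreviated):

```swift
import AVFoundation

// Discovery session with builtInDualWideCamera added to the sample's original list.
let discoverySession = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInDualCamera, .builtInDualWideCamera, .builtInWideAngleCamera],
    mediaType: .video,
    position: .back
)

let session = AVCaptureSession()
session.beginConfiguration()
session.sessionPreset = .photo

// On my iPad Pro this does find the dual wide camera.
if let videoDevice = discoverySession.devices.first,
   let input = try? AVCaptureDeviceInput(device: videoDevice),
   session.canAddInput(input) {
    session.addInput(input)

    let photoOutput = AVCapturePhotoOutput()
    if session.canAddOutput(photoOutput) {
        session.addOutput(photoOutput)
    }
    session.commitConfiguration()

    // Prints the dual wide device type followed by "false" on the iPad Pro,
    // even though the device itself was discovered correctly.
    print(videoDevice.deviceType.rawValue, photoOutput.isDepthDataDeliverySupported)
}
```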
Is there some reason why isDepthDataDeliverySupported is false even though I seem to be using a dual-camera device?
I know the device has a builtInLiDARDepthCamera, but I wanted to try out the dual-camera depth data to see how it performs at shorter distances. I wouldn't have expected dual-camera depth data delivery to be unavailable just because the device also has a LiDAR sensor.
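As a sanity check, here's a minimal sketch (not part of the sample project) of querying the LiDAR device's depth formats directly instead:

```swift
import AVFoundation

// Sketch: discover the LiDAR depth camera separately from the sample's session.
let lidarDiscovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInLiDARDepthCamera],
    mediaType: .video,
    position: .back
)

if let lidarDevice = lidarDiscovery.devices.first {
    // LiDAR devices list their depth formats on each video format;
    // this is expected to be non-empty on a LiDAR-equipped device.
    let depthFormats = lidarDevice.activeFormat.supportedDepthDataFormats
    print("LiDAR depth formats:", depthFormats.count)
}
```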
Using iPadOS 17.5.1 on an iPad Pro 11-inch (4th generation).
The depth feature of this sample app works fine on an iPhone 15 I tested. I also tried it on an iPhone 15 Pro, and it worked even though that device also has a LiDAR sensor, so the issue is presumably not related to the iPad Pro's LiDAR sensor.
An Apple engineer responded to my developer forum post, saying:

"We do not support stereo (wide + ultrawide, aka dual wide) depth data delivery on iPads. We do support it on iPhones though."