With macOS 13, Apple published the "Supporting Continuity Camera in your macOS app" (ContinuityCam) sample app, which shows how to use AVFoundation and Continuity Camera to let a Mac app use an iPhone as an external camera.
The key APIs are AVCaptureDevice.DiscoverySession and the externalUnknown device type.
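For reference, device discovery in the macOS sample is built around code something like this (a sketch, not the sample's exact source):

import AVFoundation

// macOS: discover built-in cameras plus external devices such as an iPhone.
let discoverySession = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .externalUnknown],
    mediaType: .video,
    position: .unspecified
)
let cameras = discoverySession.devices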
While working on bringing my iOS app to the Mac using Mac Catalyst, I wanted to make use of this feature. But the documentation for AVCaptureDevice.DeviceType.externalUnknown shows that it is only for macOS 10.15+ and not Mac Catalyst. There's also a note stating:

In Mac Catalyst apps, use builtInWideAngleCamera instead.
But I was already using that device type and it only provides access to built-in cameras.
Is there any way to support Continuity Camera in a Mac Catalyst app? Other than the externalUnknown device type, all of the other AVFoundation APIs used by the demo app for macOS are available for Mac Catalyst.
It turns out that it is possible. The key step is to replace the use of .externalUnknown with .init(rawValue: "AVCaptureDeviceTypeExternalUnknown"). That's pretty much all that is needed. With that change, the AVFoundation code for detecting iPhones as camera devices will work in a Mac Catalyst app.
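Concretely, the discovery setup above becomes something like this in a Mac Catalyst build (again a sketch; only the device type entry changes):

// Mac Catalyst: pass the raw value, since .externalUnknown is unavailable.
let discoverySession = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera,
                  .init(rawValue: "AVCaptureDeviceTypeExternalUnknown")],
    mediaType: .video,
    position: .unspecified
)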
However, there are some issues. The primary problem is that the live preview is mirrored in the Mac Catalyst app when connected to an iPhone camera. If you query the AVCaptureDevice, the position property reports itself as .front even though only the rear camera of the iPhone is used, and the deviceType property reports itself as .builtInWideAngleCamera instead of .externalUnknown. On a MacBook Pro, the built-in camera reports the same values. When the same AVFoundation code is used in an actual macOS app, both cameras report a position of .unspecified and the iPhone's deviceType is the correct value of .externalUnknown.
The trick to solving the mirroring problem is to look at the modelID property of the AVCaptureDevice. When the device is a connected iPhone, the modelID will be something like "iPhone15,4" or a similar model string.
Code like the following can be used to fix the mirroring issue:
if device.modelID.hasPrefix("iPhone") {
    // iPhone
    connection.automaticallyAdjustsVideoMirroring = false
    connection.isVideoMirrored = false // This fixes the mirroring issue
} else {
    // non-iPhone
    connection.automaticallyAdjustsVideoMirroring = true
}
where device is the AVCaptureDevice being added as input to the capture session, and connection is the active AVCaptureConnection of the session.
With those changes I was able to adapt Apple's ContinuityCam sample app code to allow me to scan barcodes in a Mac Catalyst app using either the Mac's camera or the camera of a connected iPhone.
For those wanting more details, the following are the complete steps and changes needed to convert the ContinuityCam sample app from a macOS app into a Mac Catalyst app.
You need Xcode 14.1+ running on macOS 13.0+, and an iPhone XR or newer running iOS 16+.
I suggest building and running the project as an actual macOS app first so you can see what it should be doing and ensure you can get it working with your iPhone. Only then use the following changes to make it into a Mac Catalyst app.
Download the project and open it in Xcode. Select the ContinuityCam target and go to the General tab. In the Supported Destinations section, click + and add the "Mac (Mac Catalyst)" destination. This also adds an iPad destination. Then delete the original "Mac" destination.
Change the iOS Deployment target to 16.0.
On the "Signing and Capabilities" tab, make sure your Team is selected and all of the signing settings are what you would use for a project. Make sure "Camera" is selected in the "App Sandbox" section.
That should be it for the basic project changes.
Now you need to edit four of the Swift source files: Camera.swift, CameraPreview.swift, ConfigurationView.swift, and MaterialView.swift. The latter three are SwiftUI files making use of AppKit classes; they need to be updated to use UIKit classes.
CameraPreview.swift
Replace the CaptureVideoPreview class with the following:

class CaptureVideoPreview: UIView {
    let previewLayer: AVCaptureVideoPreviewLayer

    init(session: AVCaptureSession) {
        // Create the preview layer and add it as a full-size sublayer.
        previewLayer = AVCaptureVideoPreviewLayer(session: session)
        super.init(frame: .zero)
        previewLayer.frame = bounds
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.backgroundColor = UIColor.black.cgColor
        self.layer.addSublayer(previewLayer)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Keep the preview layer sized to the view.
        previewLayer.frame = bounds
    }
}
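The same file presumably also wraps this view for SwiftUI. Assuming the macOS original uses NSViewRepresentable, the Catalyst counterpart would be along these lines (the struct name is my assumption):

struct CameraPreview: UIViewRepresentable {
    let session: AVCaptureSession

    // Create the UIKit-backed preview view for SwiftUI.
    func makeUIView(context: Context) -> CaptureVideoPreview {
        CaptureVideoPreview(session: session)
    }

    func updateUIView(_ uiView: CaptureVideoPreview, context: Context) {}
}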
ConfigurationView.swift
This only needs one line changed. In the disabledColor property getter, replace the line with Color(uiColor: UIColor.darkGray).
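For context, a minimal sketch of what that getter ends up as (assuming it's a simple computed Color property; the macOS version presumably used Color(nsColor:)):

var disabledColor: Color {
    Color(uiColor: UIColor.darkGray)
}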
MaterialView.swift
Remove the view.blendingMode = .behindWindow line. Perhaps there's a UIKit replacement for that, but I didn't care enough to bother; a possible stand-in is sketched below.
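For those who do care, here's a hedged sketch of a possible UIKit stand-in using UIVisualEffectView. Note that UIKit has no direct equivalent of NSVisualEffectView's .behindWindow blending, so this blurs in-window content instead:

import SwiftUI
import UIKit

struct MaterialView: UIViewRepresentable {
    // Approximates the AppKit material with an in-window blur.
    func makeUIView(context: Context) -> UIVisualEffectView {
        UIVisualEffectView(effect: UIBlurEffect(style: .systemThinMaterial))
    }

    func updateUIView(_ uiView: UIVisualEffectView, context: Context) {}
}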
Camera.swift
This is where the important changes are, but they are minimal.
In the setupDeviceDiscovery method, replace the two uses of .externalUnknown with .init(rawValue: "AVCaptureDeviceTypeExternalUnknown").
The addInput method needs the code to fix the mirroring issue. Replace the provided addInput method with the following:

private func addInput(for device: AVCaptureDevice) throws -> AVCaptureDeviceInput {
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) {
        session.addInput(input)
        // The following ensures the preview mirroring is correct
        if device.hasMediaType(.video) {
            print("Debug: Input device: \(device.localizedName), position: \(device.position), type: \(device.deviceType), uniqueID: \(device.uniqueID), modelID: \(device.modelID)")
            let active = session.connections.filter { $0.isActive }
            for connection in active {
                if connection.isVideoMirroringSupported {
                    if device.modelID.hasPrefix("iPhone") || device.modelID.hasPrefix("iPad") { // I don't know if iPad devices will ever appear
                        print("Debug: iPhone")
                        connection.automaticallyAdjustsVideoMirroring = false
                        connection.isVideoMirrored = false
                    } else {
                        print("Debug: non iPhone")
                        connection.automaticallyAdjustsVideoMirroring = true
                    }
                }
            }
        }
    } else {
        throw Error.setupFailed
    }
    return input
}
That's it. You should now (if I didn't miss anything) be able to build the sample app for your Mac. Run the app on your Mac, then connect a supported iPhone via USB cable. Your iPhone should appear as an option. Do note that there are a few AVFoundation APIs being used in this Mac Catalyst app that are not supported in iOS. A few changes are needed to allow this code to also run on an iPhone or iPad. I leave that as an exercise for the reader.
There you have it. Continuity Camera support in a Mac Catalyst app.