I am trying to get a camera preview to show live video as soon as a view controller loads, for a profile photo. However, the video does not display at first. This is on a real iPhone, not the simulator.
While the video does not display at first, if I open the Camera app, switch from the back-facing to the front-facing camera one or more times, and then return to my app, the video does display.
What am I missing to get it to display video immediately?
Thank you for any suggestions.
Swift camera code:
@objc func showLivePreview(view: UIView) {
    let captureSession = AVCaptureSession()
    captureSession.sessionPreset = .photo

    // Step 1: Set up the preview layer
    let myPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    myPreviewLayer.videoGravity = .resizeAspectFill
    myPreviewLayer.connection?.videoOrientation = .portrait

    // Step 2: Find the front camera and add it as an input
    let discoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: .video, position: .front)
    let currentCamera = discoverySession.devices.first

    do {
        let captureDeviceInput = try AVCaptureDeviceInput(device: currentCamera!)
        print(captureDeviceInput)
        captureSession.addInput(captureDeviceInput)
        let videoOutput = AVCaptureMovieFileOutput()
        captureSession.addOutput(videoOutput)
    } catch {
        return
    }

    view.layer.addSublayer(myPreviewLayer)

    // Make layer rounded
    view.layer.cornerRadius = view.bounds.width / 2
    view.layer.masksToBounds = true
    print("finished with layer. start camera")

    let photoOutput = AVCapturePhotoOutput()
    if captureSession.canAddOutput(photoOutput) {
        captureSession.addOutput(photoOutput)
    }

    // Position layer in UI
    myPreviewLayer.frame = view.frame // was bounds

    // Start the session on a background thread, as per the Thread Performance Checker
    DispatchQueue.global().async {
        captureSession.startRunning()
    }
}
The camera code is called from a view controller written in Objective-C that uses a storyboard. I have not yet tried to position the camera, so the view passed in (self.view) is just the main view of the storyboard.
Objective-C:
- (void)viewDidLoad {
    [super viewDidLoad];
    [[Utilities shared] showLivePreviewWithView:self.view];
}
As already discussed in the comments, this happens because in viewDidLoad the frame of the controller's view has not been calculated yet. The first method of the UIViewController lifecycle where the frame is available is viewDidLayoutSubviews, so adjusting the frame there solves the problem: by that point the controller's frame is set up and can be used to lay out other views correctly.
var previewLayer: AVCaptureVideoPreviewLayer?

func showLivePreview(view: UIView) -> AVCaptureVideoPreviewLayer {
    ...
    return myPreviewLayer
}

override func viewDidLoad() {
    super.viewDidLoad()
    previewLayer = showLivePreview(view: view)
}

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // The view's final frame is known here, so the layer can be sized correctly.
    previewLayer?.frame = view.bounds
}
A better option is to override the layoutSubviews (for UIView) or layoutSublayers (for CALayer) methods. This ensures that views and layers are always positioned correctly, even after orientation changes or other layout updates:
final class PreviewLayer: CALayer {
    override func layoutSublayers() {
        super.layoutSublayers()
        if let superlayer {
            // Always fill the parent layer.
            self.frame = superlayer.bounds
        }
    }
}
This way your views and layers will automatically adapt to new sizes.
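For completeness, here is the same idea applied on the UIView side. This is a sketch, not code from the original post: a hypothetical CameraPreviewView container that owns the AVCaptureVideoPreviewLayer and resizes it in layoutSubviews, so the layer tracks the view's bounds through every layout pass.

```swift
import AVFoundation
import UIKit

// Hypothetical container view (name and structure are assumptions, not from
// the original post) that keeps an AVCaptureVideoPreviewLayer matched to its
// own bounds by overriding layoutSubviews.
final class CameraPreviewView: UIView {
    let previewLayer = AVCaptureVideoPreviewLayer()

    override init(frame: CGRect) {
        super.init(frame: frame)
        previewLayer.videoGravity = .resizeAspectFill
        layer.addSublayer(previewLayer)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Called on every layout pass, so the preview layer stays in sync
        // after rotations, split-view resizes, and the first real layout.
        previewLayer.frame = bounds
    }
}
```

With this approach the view controller only needs to add a CameraPreviewView with Auto Layout constraints and attach the session to its previewLayer; no manual frame math in viewDidLayoutSubviews is required.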