I have an application that adds live animations and images on top of the camera preview view in AVFoundation. A "hardware screenshot" (holding the Side button and Volume Up button) captures everything correctly. However, I need a button in the app that takes the screenshot.
All the usual methods of taking a screenshot, such as UIGraphicsGetImageFromCurrentImageContext (or view.drawHierarchy()), result in a black area where the video preview is. Every other element and image appears in the screenshot; only the AVCaptureVideoPreviewLayer is missing.
Can I trigger a "hardware screenshot" programmatically? Is there another solution to this problem?
I was in the same position, and researched two separate solutions to this problem.
1. Set up the ViewController as an AVCaptureVideoDataOutputSampleBufferDelegate and sample the video output to take the screenshot.
2. Set up the ViewController as an AVCapturePhotoCaptureDelegate and capture the photo.
The mechanism for setting up the former is described, for example, in this question: How to take UIImage of AVCaptureVideoPreviewLayer instead of AVCapturePhotoOutput capture
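For reference, here is a minimal sketch of approach (1). It assumes you already have a running AVCaptureSession; CameraViewController, wantsSnapshot and handleSnapshot(_:) are placeholder names for this sketch, not part of the linked answer.
import AVFoundation
import CoreImage
import UIKit

extension CameraViewController: AVCaptureVideoDataOutputSampleBufferDelegate {

    // Call this while configuring the capture session.
    func addVideoDataOutput() {
        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(videoOutput) {
            session.addOutput(videoOutput)
        }
    }

    // Called for every frame; convert one to a UIImage when a snapshot was requested.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard wantsSnapshot,  // hypothetical flag set by the screenshot button
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        wantsSnapshot = false

        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        if let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) {
            let snapshot = UIImage(cgImage: cgImage)
            DispatchQueue.main.async {
                self.handleSnapshot(snapshot)  // hand off to compositing / UI code
            }
        }
    }
}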
I implemented both to check if there was any difference in the quality of the image (there wasn't).
If all you need is the camera snapshot, then that's it. But it sounds like you need to draw an additional animation on top. For this, I created a container UIView of the same size as the snapshot, added a UIImageView to it with the snapshot and then drew the animation on top. After that you can use UIGraphicsGetImageFromCurrentImageContext on the container.
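As a rough illustration of that compositing step (composite, snapshot and overlayView are placeholder names, not code from my project):
import UIKit

// Draws the camera snapshot and the animation/overlay view into one image.
func composite(snapshot: UIImage, overlayView: UIView) -> UIImage? {
    // Container view the same size as the snapshot.
    let container = UIView(frame: CGRect(origin: .zero, size: snapshot.size))

    // Bottom layer: the camera snapshot.
    let imageView = UIImageView(image: snapshot)
    imageView.frame = container.bounds
    container.addSubview(imageView)

    // Top layer: whatever view holds the animation, stretched over the snapshot.
    overlayView.frame = container.bounds
    container.addSubview(overlayView)

    // Render the container into a single image.
    UIGraphicsBeginImageContextWithOptions(container.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    container.layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()
}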
As for which of solutions (1) and (2) to use, if you don't need to support different camera orientations in the app, it probably doesn't matter. However, if you need to switch between front and back camera and support different camera orientations, then you need to know the snapshot orientation to apply the animation in the right place, and getting that right turned out to be a total bear with method (1).
The solution I used:
1. Make the UIViewController conform to AVCapturePhotoCaptureDelegate.
2. Add a photo output to the AVCaptureSession.
private let session = AVCaptureSession()
private let photoOutput = AVCapturePhotoOutput()
...

// When configuring the session
if self.session.canAddOutput(self.photoOutput) {
    self.session.addOutput(self.photoOutput)
    self.photoOutput.isHighResolutionCaptureEnabled = true
}
// When the user taps the screenshot button, request a capture.
// The result arrives asynchronously in the delegate callback below.
let settings = AVCapturePhotoSettings()
let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
let previewFormat = [
    kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
    kCVPixelBufferWidthKey as String: 160,
    kCVPixelBufferHeightKey as String: 160
]
settings.previewPhotoFormat = previewFormat
photoOutput.capturePhoto(with: settings, delegate: self)
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    guard error == nil else {
        // Handle the error
        return
    }

    if let dataImage = photo.fileDataRepresentation() {
        print(UIImage(data: dataImage)?.size as Any)

        let dataProvider = CGDataProvider(data: dataImage as CFData)
        let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil,
                                           shouldInterpolate: true, intent: .defaultIntent)

        // Map the interface orientation and camera position to the image orientation,
        // so that overlays drawn on top end up in the right place.
        // `cameraPosition` is the AVCaptureDevice.Position of the active camera.
        // https://developer.apple.com/documentation/uikit/uiimageorientation?language=objc
        let orientation = UIApplication.shared.statusBarOrientation
        var imageOrientation = UIImage.Orientation.right
        switch orientation {
        case .portrait:
            imageOrientation = self.cameraPosition == .back ? UIImage.Orientation.right : UIImage.Orientation.leftMirrored
        case .landscapeRight:
            imageOrientation = self.cameraPosition == .back ? UIImage.Orientation.up : UIImage.Orientation.downMirrored
        case .portraitUpsideDown:
            imageOrientation = self.cameraPosition == .back ? UIImage.Orientation.left : UIImage.Orientation.rightMirrored
        case .landscapeLeft:
            imageOrientation = self.cameraPosition == .back ? UIImage.Orientation.down : UIImage.Orientation.upMirrored
        case .unknown:
            imageOrientation = self.cameraPosition == .back ? UIImage.Orientation.right : UIImage.Orientation.leftMirrored
        @unknown default:
            imageOrientation = self.cameraPosition == .back ? UIImage.Orientation.right : UIImage.Orientation.leftMirrored
        }

        let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: imageOrientation)
        // Do whatever you need to do with the image
    } else {
        // Handle the missing image data
    }
}
If you need to know the size of the captured image in order to position the animations, you can use the AVCaptureVideoDataOutputSampleBufferDelegate strategy to detect the size of the buffer once.
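A minimal sketch of that, reusing the same AVCaptureVideoDataOutputSampleBufferDelegate callback as in approach (1) above (bufferSize is a placeholder property on the view controller):
// Pixel dimensions of the video buffer, filled in once.
var bufferSize: CGSize = .zero

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard bufferSize == .zero,
          let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    bufferSize = CGSize(width: CVPixelBufferGetWidth(pixelBuffer),
                        height: CVPixelBufferGetHeight(pixelBuffer))
}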