Tags: ios, swift, cgcontext, cmsamplebuffer, avassetwriterinput

Rotate CMSampleBuffer by arbitrary angle and append to AVAssetWriterInput in swift 3


I convert the sample buffer to a CGContext. Then I apply a transformation to the context and create a CIImage from that, which in turn gets displayed in a UIImageView.

At the same time I want to append this to the AVAssetWriterInput to create a movie of these transformations.
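For reference, the CIImage-to-UIImageView display step on its own is only a few lines; this is a minimal sketch, assuming imageView is an existing UIImageView outlet and the method is called from the capture delegate:

    // Display path only: wrap the camera frame in a CIImage and show it.
    // imageView is an assumed UIImageView outlet on the view controller.
    func show(sampleBuffer: CMSampleBuffer) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        DispatchQueue.main.async {
            self.imageView.image = UIImage(ciImage: ciImage)
        }
    }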

So far the transformations I apply to the context have no effect whatsoever. When I display the supposedly transformed image in the image view, it looks exactly the same.

UPDATE: I managed to record the sample buffer to a video file (it's still stretched because of the wrong orientation, though). I used this code as a base:

http://geek-is-stupid.github.io/blog/2017/04/13/how-to-record-detect-face-overlay-video-at-real-time-using-swift/

But I'm still struggling with applying the rotation to the CGContext. Basically, everything I do to the context is completely ignored.
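The reason the calls seem to have no effect: a CGContext's current transform only affects drawing performed after it is set, and makeImage() simply snapshots the bitmap that already backs the context, so translateBy/rotate on their own never move pixels that are already there. One way the rotation could be applied in pure Core Graphics is sketched below; sourceImage (a CGImage of the camera frame) and destContext (a second bitmap context of the target size) are placeholders, not names from the code below.

    // Sketch only: the transform has to be in place *before* drawing, because the CTM
    // affects drawing operations, not pixels already in the backing buffer.
    func rotatedFrame(from sourceImage: CGImage, into destContext: CGContext, by radians: CGFloat) -> CGImage? {
        let width = CGFloat(destContext.width)
        let height = CGFloat(destContext.height)
        destContext.translateBy(x: width / 2, y: height / 2)    // rotate around the centre
        destContext.rotate(by: radians)
        destContext.translateBy(x: -width / 2, y: -height / 2)
        // Only this draw call is affected by the transform set above.
        destContext.draw(sourceImage, in: CGRect(x: 0, y: 0, width: width, height: height))
        return destContext.makeImage()
    }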

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

        let writable = canWrite()
        if writable, sessionAtSourceTime == nil {
            print("starting session")
            sessionAtSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            assetWriter!.startSession(atSourceTime: sessionAtSourceTime!)
        }

        let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        if writable {
            autoreleasepool {
                CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
                var renderedOutputPixelBuffer: CVPixelBuffer? = nil
                let options = [
                    kCVPixelBufferCGImageCompatibilityKey as String: true,
                    kCVPixelBufferCGBitmapContextCompatibilityKey as String: true,] as CFDictionary
                let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                                 CVPixelBufferGetWidth(pixelBuffer),
                                                 CVPixelBufferGetHeight(pixelBuffer),
                                                 kCVPixelFormatType_32BGRA, options,
                                                 &renderedOutputPixelBuffer)
                guard status == kCVReturnSuccess else { return }

                CVPixelBufferLockBaseAddress(renderedOutputPixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))

                let renderedOutputPixelBufferBaseAddress = CVPixelBufferGetBaseAddress(renderedOutputPixelBuffer!)

                // Copy the camera frame's pixels into the freshly created buffer.
                memcpy(renderedOutputPixelBufferBaseAddress,
                       CVPixelBufferGetBaseAddress(pixelBuffer),
                       CVPixelBufferGetHeight(pixelBuffer) * CVPixelBufferGetBytesPerRow(pixelBuffer))

                let context = CGContext(data: renderedOutputPixelBufferBaseAddress,
                                        width: CVPixelBufferGetWidth(renderedOutputPixelBuffer!),
                                        height: CVPixelBufferGetHeight(renderedOutputPixelBuffer!),
                                        bitsPerComponent: 8,
                                        bytesPerRow: CVPixelBufferGetBytesPerRow(renderedOutputPixelBuffer!),
                                        space: CGColorSpaceCreateDeviceRGB(),
                                        bitmapInfo: bitmapInfo!)


                // Current rotation of the overlay view, in radians.
                let radians: Float = atan2f(Float(boxView!.transform.b), Float(boxView!.transform.a))
                context!.translateBy(x: self.view.frame.size.width / 2, y: self.view.frame.size.height / 2)
                context!.rotate(by: CGFloat(radians))

                let image: CGImage = context!.makeImage()!

                self.imageView!.image = UIImage(cgImage: image)

                if (bufferAdaptor?.assetWriterInput.isReadyForMoreMediaData)!, canWrite() {
                   bufferAdaptor?.append(renderedOutputPixelBuffer!, withPresentationTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
                }

                CVPixelBufferUnlockBaseAddress(renderedOutputPixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
                CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
            }
        }
}
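The delegate method above also relies on a few members that are set up elsewhere (assetWriter, bufferAdaptor, sessionAtSourceTime, canWrite()). A rough sketch of the kind of setup those names assume is below; the output settings and the canWrite() check are illustrative, not necessarily what the linked post uses.

    // Sketch of the writer plumbing assumed by the delegate method (Swift 3 API names).
    var assetWriter: AVAssetWriter?
    var bufferAdaptor: AVAssetWriterInputPixelBufferAdaptor?
    var sessionAtSourceTime: CMTime?

    func setUpWriter(outputURL: URL, width: Int, height: Int) throws {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: AVFileTypeQuickTimeMovie)
        let settings: [String: Any] = [AVVideoCodecKey: AVVideoCodecH264,
                                       AVVideoWidthKey: width,
                                       AVVideoHeightKey: height]
        let input = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        bufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
        if writer.canAdd(input) { writer.add(input) }
        writer.startWriting()   // startSession(atSourceTime:) happens on the first frame, as above
        assetWriter = writer
    }

    func canWrite() -> Bool {
        return assetWriter?.status == .writing
    }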

Solution

  • Found the solution. Below is the important part of the code.

       //create pixelbuffer from the delegate method samplebuffer
       let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
       CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
       //create CI image from the buffer
       let ci = CIImage.init(cvPixelBuffer: pixelBuffer, options: options)
       //create filter to rotate
       let filter = CIFilter.init(name: "CIAffineTransform")
       //create transform, move rotation point to center             
       var transform = CGAffineTransform(translationX: self.view.frame.midX, y: self.view.frame.midY)
       //rotate it
       transform = transform.rotated(by: CGFloat(radians))
       // move the transform point back to the original
       transform = transform.translatedBy(x: -self.view.frame.midX, y: -self.view.frame.midY)
    
       filter!.setValue(transform, forKey: kCIInputTransformKey)
       filter!.setValue(ci, forKey: kCIInputImageKey)
       //take the output from the filter
       let output = filter?.outputImage
       //create empty pixelbuffer
       var newPixelBuffer : CVPixelBuffer? = nil
    
       CVPixelBufferCreate(kCFAllocatorDefault,
                           Int(self.view.frame.width),
                           Int(self.view.frame.height),
                           kCVPixelFormatType_32BGRA,
                           nil,
                           &newPixelBuffer)
       //render the context to the new pixelbuffer, context is a global
       //CIContext variable. creating a new one each frame is too CPU intensive             
       context.render(output!, to: newPixelBuffer!)
    
       //finally, write this to the pixelbufferadaptor             
       if (bufferAdaptor?.assetWriterInput.isReadyForMoreMediaData)!, canWrite() {
           bufferAdaptor?.append(newPixelBuffer!,
                                 withPresentationTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
       }
    
       CVPixelBufferUnlockBaseAddress(pixelBuffer,CVPixelBufferLockFlags(rawValue: 0))
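The shared CIContext referenced in the comments is created once and reused for every frame; a minimal sketch of that property (an EAGLContext-backed context is one way to keep the per-frame render on the GPU):

    // Created once; building a new CIContext for every frame is far too expensive.
    lazy var context: CIContext = {
        if let eaglContext = EAGLContext(api: .openGLES2) {
            return CIContext(eaglContext: eaglContext)   // GPU-backed
        }
        return CIContext(options: nil)                   // CPU fallback
    }()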