Tags: ios, swift, image-processing, core-video, vimage

Scale image in CVImageBuffer


I have a task: to scale down an image that I get from the camera. I need the smaller version so I can run heavy-lifting operations on it, which will save some processing power.

I decided to go with vImage_Buffer from Accelerate. Here's my code, with a few comments to make clear what is what:

        guard let imgBuffer = CMSampleBufferGetImageBuffer(buffer) else {
            return
        }


        CVPixelBufferLockBaseAddress(imgBuffer, CVPixelBufferLockFlags(rawValue: 0))

        // create vImage_Buffer out of CVImageBuffer
        var inBuff: vImage_Buffer = vImage_Buffer()
        inBuff.width = UInt(CVPixelBufferGetWidth(imgBuffer))
        inBuff.height = UInt(CVPixelBufferGetHeight(imgBuffer))
        inBuff.rowBytes = CVPixelBufferGetBytesPerRow(imgBuffer)
        inBuff.data = CVPixelBufferGetBaseAddress(imgBuffer)

        // bring down the size at half
        let new_width: UInt = inBuff.width/2
        let new_height: UInt = inBuff.height/2

        // create output buffer where scaled image is supposed to be
        var outBuff: vImage_Buffer = vImage_Buffer()
        outBuff.data = UnsafeMutableRawPointer.allocate(byteCount: Int(new_width * new_height * 4), alignment: MemoryLayout<UInt>.size)
        outBuff.width = new_width
        outBuff.height = new_height
        outBuff.rowBytes = Int(new_width * 4)

        // perform scale
        let err = vImageScale_ARGB8888(&inBuff, &outBuff, nil, 0)
        if err != kvImageNoError {
            print("Wrong!")
        }

        // I guess I need to unlock buffer at this point, right?
        CVPixelBufferUnlockBaseAddress(imgBuffer, CVPixelBufferLockFlags(rawValue: 0))

        // create CVImageBuffer
        let options = [kCVPixelBufferCGImageCompatibilityKey: true,
                       kCVPixelBufferCGBitmapContextCompatibilityKey: true,
                       kCVPixelBufferWidthKey: new_width,
                       kCVPixelBufferHeightKey: new_height] as CFDictionary

        var newPixelBuffer: CVImageBuffer?
        let status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                  Int(new_width), Int(new_height),
                                                  kCVPixelFormatType_32BGRA, &outBuff, Int(new_width * 4),
                                                  nil, nil, options, &newPixelBuffer)

        if status == kCVReturnError {
            print("Wrong again!")
        }

        // create CIImage from CVImageBuffer and UIImage from CIImage just to see how scale went
        let ciImg = CIImage(cvImageBuffer: newPixelBuffer!)
        let img = UIImage(ciImage: ciImg)
        delegate?.testSmallImage(img)

It seems like all operations complete without errors, and to check how the scaling went I create a new UIImage from the scaled buffer. But when I try to show that image in a UIImageView, I get an EXC_BAD_ACCESS error. And when I try to save the brand-new UIImage, everything runs without errors, but no file appears in the Documents directory. Can you point out what exactly I'm doing wrong here? Thanks!


Solution

  • The EXC_BAD_ACCESS comes from passing &outBuff to CVPixelBufferCreateWithBytes: that is the address of the vImage_Buffer struct itself, whereas the function expects a pointer to the pixel data, i.e. outBuff.data. Here's a Swift snippet that resizes a CMSampleBuffer:

        private func scale(_ sampleBuffer: CMSampleBuffer) -> CVImageBuffer?
        {
            guard let imgBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
                return nil
            }
    
            CVPixelBufferLockBaseAddress(imgBuffer, CVPixelBufferLockFlags(rawValue: 0))
    
            // create vImage_Buffer out of CVImageBuffer
            var inBuff: vImage_Buffer = vImage_Buffer()
            inBuff.width = UInt(CVPixelBufferGetWidth(imgBuffer))
            inBuff.height = UInt(CVPixelBufferGetHeight(imgBuffer))
            inBuff.rowBytes = CVPixelBufferGetBytesPerRow(imgBuffer)
            inBuff.data = CVPixelBufferGetBaseAddress(imgBuffer)
    
            // perform scale
            let err = vImageScale_ARGB8888(&inBuff, &scaleBuffer, nil, 0)
            if err != kvImageNoError {
                print("Can't scale a buffer")
                CVPixelBufferUnlockBaseAddress(imgBuffer, CVPixelBufferLockFlags(rawValue: 0))
                return nil
            }
            CVPixelBufferUnlockBaseAddress(imgBuffer, CVPixelBufferLockFlags(rawValue: 0))
    
            var newBuffer: CVPixelBuffer?
            let attributes = [
                kCVPixelBufferCGImageCompatibilityKey: true,
                kCVPixelBufferCGBitmapContextCompatibilityKey: true
            ] as CFDictionary

            // wrap the scaled pixel data in a new CVPixelBuffer (the data is not copied)
            let status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                      Int(scaleBuffer.width), Int(scaleBuffer.height),
                                                      kCVPixelFormatType_32BGRA, scaleBuffer.data,
                                                      scaleBuffer.rowBytes,
                                                      nil, nil,
                                                      attributes, &newBuffer)
    
            guard status == kCVReturnSuccess, let b = newBuffer else {
                print("Can't create new CVPixelBuffer")
                return nil
            }
    
            return b
        }
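
    Both snippets assume the camera delivers a 4-channel, 8-bit-per-channel buffer: vImageScale_ARGB8888 works on any such interleaved layout, and the new pixel buffer is declared as kCVPixelFormatType_32BGRA. Here's a minimal sketch of how the capture output could be forced to deliver 32BGRA frames (assumed setup code, not part of the original post):

        import AVFoundation

        // Ask the video data output for 32BGRA frames so the vImage scale and the
        // kCVPixelFormatType_32BGRA wrapper above stay consistent.
        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.videoSettings = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]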
    

    And here's the definition of scaleBuffer, which acts as the destination of the scale operation. It doesn't need to be recreated for every frame, so I set it up only once:

            // scaleBuffer is a stored property (var scaleBuffer = vImage_Buffer()),
            // set up once with the target dimensions new_width/new_height
            scaleBuffer.data = UnsafeMutableRawPointer.allocate(byteCount: Int(new_width * new_height * 4),
                                                                alignment: MemoryLayout<UInt>.alignment)
            scaleBuffer.width = vImagePixelCount(new_width)
            scaleBuffer.height = vImagePixelCount(new_height)
            scaleBuffer.rowBytes = Int(new_width * 4)
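
    For completeness, here's a minimal sketch of how scale(_:) might be called from the standard AVCaptureVideoDataOutputSampleBufferDelegate callback and turned into a UIImage for a quick visual check, mirroring what the question does. The class and protocol names (CameraController, SmallImageDelegate) are assumptions for illustration; testSmallImage(_:) follows the question's own delegate call:

        import AVFoundation
        import Accelerate
        import CoreImage
        import UIKit

        protocol SmallImageDelegate: AnyObject {
            func testSmallImage(_ image: UIImage)
        }

        final class CameraController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

            weak var delegate: SmallImageDelegate?
            private var scaleBuffer = vImage_Buffer()   // destination buffer, set up once as shown above

            func captureOutput(_ output: AVCaptureOutput,
                               didOutput sampleBuffer: CMSampleBuffer,
                               from connection: AVCaptureConnection) {
                guard let scaledBuffer = scale(sampleBuffer) else { return }

                // Wrap the scaled pixel buffer just to inspect the result visually
                let ciImage = CIImage(cvImageBuffer: scaledBuffer)
                let uiImage = UIImage(ciImage: ciImage)
                DispatchQueue.main.async { self.delegate?.testSmallImage(uiImage) }
            }

            private func scale(_ sampleBuffer: CMSampleBuffer) -> CVImageBuffer? {
                // body as shown above
                return nil
            }
        }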