Tags: ios, swift, core-image, cifilter, dng

Save CIFilter as RAW image


So CIFilter takes in a RAW DNG image. Is there a way I can output the image back as a RAW?

I'm trying to blend RAW images loaded as CIFilters and then output the result back as a RAW. It's a simple interpolation of the buffers' RAW values. Yes, I understand that RAW photos don't have pixels in the usual sense; I'm just trying to work with the underlying Bayer-filter representation here. Is there any way to do this using CIFilter? How about using a third-party library?

EDIT: What I'm Trying to Do:

So I'm trying to blend (something like exposure stacking) multiple images taken using a bracketed photo capture. I have the pixel buffers of multiple RAW photos, and I'm trying to average out their pixel values.

Before doing that, I'm trying to confirm that my pixel values are correct. So I take a photo of a red, green, or blue sheet of paper so that it is all one solid color(ish), and I check whether the pixel buffer values are what I would expect. iPhones have an RGGB Bayer layout, so I'd expect a very solid red picture to have the following layout (the pattern repeats every two rows):

        Col0 Col1 Col2 Col3...
Row 0: [HIGH LOW  HIGH LOW ....]
Row 1: [LOW  LOW  LOW  LOW ....]
Row 2: [HIGH LOW  HIGH LOW ....]
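For reference, the expected channel at any (row, col) of an RGGB mosaic follows directly from the row/column parity. This hypothetical helper (not part of any API) just encodes the table above for sanity-checking:

```swift
// Hypothetical helper: which Bayer channel sits at (row, col) in an RGGB layout.
// Even rows:  R G R G ...
// Odd  rows:  G B G B ...
func bayerChannel(row: Int, col: Int) -> Character {
    switch (row % 2, col % 2) {
    case (0, 0): return "R"
    case (0, 1): return "G"
    case (1, 0): return "G"
    default:     return "B"
    }
}
// For a solid-red scene, only the "R" sites should read HIGH.
```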

I have the following code for that:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard error == nil else { print("Error capturing photo: \(error!)"); return }

        guard let pixelBuffer = photo.pixelBuffer else { return }

        // Lock the buffer before reading its memory, and unlock when done.
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        guard let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
        let bufferHeight = CVPixelBufferGetHeight(pixelBuffer)
        // Rows may be padded, so step by bytesPerRow (in UInt16 units), not by width.
        let rowStride = CVPixelBufferGetBytesPerRow(pixelBuffer) / MemoryLayout<UInt16>.stride

        let uint16Buffer = baseAddress.assumingMemoryBound(to: UInt16.self)

        for row in 0..<min(bufferHeight, 4) { // only print 4 rows for debugging
            let col0 = uint16Buffer[row * rowStride + 0]
            let col1 = uint16Buffer[row * rowStride + 1]
            let col2 = uint16Buffer[row * rowStride + 2]
            let col3 = uint16Buffer[row * rowStride + 3]
            print(col0, col1, col2, col3) // See pixel values
        }
    }

Solution

  • There is no way to do what you want with Core Image. With CIFilters you can "only" develop and process RAW input into pixel-based (demosaiced) images; there is no API for writing a Bayer RAW/DNG back out.

    However, you can access the data of the CVPixelBuffers directly with CVPixelBufferGetBaseAddress() (after proper locking with CVPixelBufferLockBaseAddress()). So you could do your blending on the CPU, and maybe use vDSP to improve the performance.
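As a minimal sketch of that CPU approach, the averaging itself is just a widened sum divided by the frame count. Here each `[UInt16]` stands in for the contents of one locked pixel buffer; in real code you would read the values out of the `CVPixelBuffer` memory as shown in the question:

```swift
// Sketch: average the raw Bayer values of several same-sized frames on the CPU.
// Each [UInt16] array represents one frame's Bayer data (assumption: all frames
// share the same dimensions and pixel format).
func averageBayerFrames(_ frames: [[UInt16]]) -> [UInt16] {
    guard let count = frames.first?.count else { return [] }
    var sums = [UInt32](repeating: 0, count: count)
    for frame in frames {
        precondition(frame.count == count, "All frames must have the same size")
        for i in 0..<count {
            sums[i] += UInt32(frame[i]) // widen to avoid UInt16 overflow
        }
    }
    let n = UInt32(frames.count)
    return sums.map { UInt16($0 / n) }
}
```

For large buffers or many frames, the inner loop could be replaced with vectorized vDSP operations (e.g. converting to Float and using vectorized add/multiply) for speed; but even then, Core Image offers no path to re-encode the averaged mosaic as a DNG.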