Tags: objective-c, swift, avcapturesession, metal, cmsamplebufferref

Understanding Metal usage with AVCaptureSession video output


I'm trying to understand the right way to manipulate the video output (CVPixelBuffer) using Metal.

As far as I understand, there is MTKView. Each CVPixelBuffer from the video output is turned into some kind of Metal texture. So the final preview comes from the MTKView?

When I see the final result on the screen, is it:

1) CMSampleBuffer -> Metal -> CMSampleBuffer

or

2) CMSampleBuffer -> Metal -> MTKView

I'm pretty confused. Can someone clear things up?


Solution

  • I'd encourage you to do this in two steps. First, get comfortable using Metal and MTKView to draw something, anything, to the screen (see the first sketch below). Second, after you know how to draw with Metal (create and encode to command buffers, present drawables, etc.), you can apply the texture you generate from the CMSampleBuffer to a full-screen quad or whatever other geometry you want.

    There's a camera capture sample by @McZonk available here that implements a simple example of this. It uses a two-plane YCbCr sample buffer, but if available, you can also request a BGRA-formatted sample buffer that can then be converted into a single MTLTexture with CVMetalTextureCacheCreateTextureFromImage and sampled in a shader without further manual conversion (see the second sketch below).
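
    For the first step, here is a minimal sketch of an MTKView delegate that just clears the screen every frame. The class and property names are my own, not from any sample; real drawing would go where the comment indicates.

    ```swift
    import MetalKit

    // Step one: draw anything at all. This delegate encodes an empty render pass that
    // clears the view to its clear color and presents the drawable.
    final class PreviewRenderer: NSObject, MTKViewDelegate {
        let commandQueue: MTLCommandQueue

        init?(view: MTKView) {
            guard let device = MTLCreateSystemDefaultDevice(),
                  let queue = device.makeCommandQueue() else { return nil }
            commandQueue = queue
            super.init()
            view.device = device
            view.delegate = self
        }

        func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

        func draw(in view: MTKView) {
            guard let descriptor = view.currentRenderPassDescriptor,
                  let drawable = view.currentDrawable,
                  let commandBuffer = commandQueue.makeCommandBuffer(),
                  let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor)
            else { return }
            // Real drawing (e.g. a full-screen quad sampling the camera texture) would be encoded here.
            encoder.endEncoding()
            commandBuffer.present(drawable)
            commandBuffer.commit()
        }
    }
    ```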
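
    For the second step, here is a sketch of the BGRA path described above, assuming kCVPixelFormatType_32BGRA was requested on the AVCaptureVideoDataOutput; the type and property names are illustrative.

    ```swift
    import AVFoundation
    import CoreVideo
    import Metal

    // Turns each incoming CMSampleBuffer into an MTLTexture via a CVMetalTextureCache,
    // so option 2 in the question (CMSampleBuffer -> Metal -> MTKView) is what actually happens.
    final class CameraTextureConverter: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        private var textureCache: CVMetalTextureCache?
        private(set) var latestTexture: MTLTexture?

        init(device: MTLDevice) {
            super.init()
            CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let cache = textureCache,
                  let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

            let width = CVPixelBufferGetWidth(pixelBuffer)
            let height = CVPixelBufferGetHeight(pixelBuffer)

            // A BGRA pixel buffer maps to a single .bgra8Unorm texture (plane index 0),
            // so no YCbCr-to-RGB conversion is needed in the shader.
            var cvTexture: CVMetalTexture?
            CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache, pixelBuffer,
                                                      nil, .bgra8Unorm, width, height, 0, &cvTexture)
            if let cvTexture = cvTexture {
                latestTexture = CVMetalTextureGetTexture(cvTexture)
            }
        }
    }
    ```

    To request BGRA frames, set videoSettings on the AVCaptureVideoDataOutput to [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA], then bind latestTexture in the MTKView delegate's draw(in:) and sample it on a full-screen quad.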