I am trying to use the AVFoundation framework to capture a series of still images from AVCaptureStillImageOutput QUICKLY, like the burst mode on some cameras. I want to use the completion handler,
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                              completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) { ... }];
and pass the imageSampleBuffer to an NSOperation object for later processing. However, I can't find a way to retain the buffer in the NSOperation subclass.
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                              completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    // Add to queue
    SaveImageDataOperation *saveOperation = [[SaveImageDataOperation alloc] initWithImageBuffer:imageSampleBuffer];
    [_saveDataQueue addOperation:saveOperation];
    [saveOperation release];

    // Continue
    [self captureCompleted];
}];
Does anyone know what I may be doing wrong here? Is there a better approach to do this?
I did come across this note in the framework header, which seems relevant:

IMPORTANT: Clients of CMSampleBuffer must explicitly manage the retain count by calling CFRetain and CFRelease, even in processes using garbage collection.

Source: CoreMedia.framework, CMSampleBuffer.h
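For reference, here is a minimal sketch of the direction I'm considering for SaveImageDataOperation, based on that CFRetain/CFRelease note. The class internals below are my own sketch under manual reference counting, not confirmed working code:

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

@interface SaveImageDataOperation : NSOperation {
    CMSampleBufferRef _imageBuffer;
}
- (id)initWithImageBuffer:(CMSampleBufferRef)imageBuffer;
@end

@implementation SaveImageDataOperation

- (id)initWithImageBuffer:(CMSampleBufferRef)imageBuffer {
    if ((self = [super init])) {
        if (imageBuffer) {
            // Take ownership so the buffer outlives the completion handler
            _imageBuffer = (CMSampleBufferRef)CFRetain(imageBuffer);
        }
    }
    return self;
}

- (void)main {
    // Later processing would happen here, e.g. converting to JPEG data:
    // NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:_imageBuffer];
}

- (void)dealloc {
    if (_imageBuffer) {
        CFRelease(_imageBuffer);  // balance the CFRetain from init
        _imageBuffer = NULL;
    }
    [super dealloc];
}

@end

Even with something like this in place, I'm not sure whether holding on to the sample buffer outside the completion handler is the right approach, which is why I'm asking.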