I'm working on a live-streaming app and need to apply a filter to the video buffer. I used the GPUImage framework and wrote a filter. The preview looks fine, but the buffer I receive in the `willOutputSampleBuffer:` delegate method has no filter effect applied.
Here is the key code:
self.videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:self.sessionPreset cameraPosition:AVCaptureDevicePositionFront];
self.videoCamera.delegate = self;
self.videoCamera.horizontallyMirrorFrontFacingCamera = YES;
self.filterView = [[GPUImageView alloc] init];
GPUImageBeautifyFilter *beautifyFilter = [[GPUImageBeautifyFilter alloc] init];
[self.videoCamera addTarget:beautifyFilter];
[beautifyFilter addTarget:self.filterView];
dispatch_async(dispatch_get_main_queue(), ^{
    [self.view insertSubview:self.filterView atIndex:1];
    [self.filterView mas_makeConstraints:^(MASConstraintMaker *make) {
        make.edges.equalTo(self.view);
    }];
    [self.videoCamera startCameraCapture];
});
Is there some detail I've overlooked? Thanks!
Update: the camera delegate only delivers the raw camera frames, so I needed to add a new output as a target of the filter. After adding the code below to my project, I get the buffer with the filter applied.
GPUImageRawDataOutput *rawDataOutput = [[GPUImageRawDataOutput alloc] initWithImageSize:CGSizeMake(720, 1280) resultsInBGRAFormat:YES];
[self.beautifyFilter addTarget:rawDataOutput];

__weak GPUImageRawDataOutput *weakOutput = rawDataOutput; // weak to avoid a retain cycle in the block
[rawDataOutput setNewFrameAvailableBlock:^{
    __strong GPUImageRawDataOutput *strongOutput = weakOutput;
    [strongOutput lockFramebufferForReading];
    GLubyte *outputBytes = [strongOutput rawBytesForImage];
    NSInteger bytesPerRow = [strongOutput bytesPerRowInOutput];
    CVPixelBufferRef pixelBuffer = NULL;
    // Note: CVPixelBufferCreateWithBytes wraps outputBytes without copying,
    // so the pixel buffer is only safe to use until unlockFramebufferAfterReading.
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault, 720, 1280, kCVPixelFormatType_32BGRA,
                                 outputBytes, (size_t)bytesPerRow, NULL, NULL, NULL, &pixelBuffer);
    // Do something with pixelBuffer here, while the framebuffer is still locked
    [strongOutput unlockFramebufferAfterReading];
    CVPixelBufferRelease(pixelBuffer);
}];
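If you then need to hand the filtered frame back to an AVFoundation-style pipeline (an AVAssetWriter, or a streaming SDK that expects CMSampleBuffers), a sketch like the following can wrap the pixel buffer. This is an illustrative helper, not part of GPUImage; the timing info is a placeholder using the host clock, and in a real app you would carry over the source frame's presentation timestamp:

```objc
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Hypothetical helper: wraps an existing CVPixelBuffer in a CMSampleBuffer.
// The caller owns the returned buffer and must CFRelease it.
static CMSampleBufferRef SampleBufferFromPixelBuffer(CVPixelBufferRef pixelBuffer) {
    CMVideoFormatDescriptionRef formatDescription = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &formatDescription);

    // Placeholder timing: host-clock "now"; replace with the real frame timestamps.
    CMSampleTimingInfo timing = {
        .duration = kCMTimeInvalid,
        .presentationTimeStamp = CMClockGetTime(CMClockGetHostTimeClock()),
        .decodeTimeStamp = kCMTimeInvalid,
    };

    CMSampleBufferRef sampleBuffer = NULL;
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL,
                                       formatDescription, &timing, &sampleBuffer);
    CFRelease(formatDescription);
    return sampleBuffer;
}
```

Because CVPixelBufferCreateWithBytes does not copy the bytes, call a helper like this (and finish encoding, or deep-copy the pixel buffer) before `unlockFramebufferAfterReading`.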