I'm trying to play a video (MP4/H.263) on iOS, but I'm getting really fuzzy results. Here's the code that initializes the asset reader:
mTextureHandle = [self createTexture:CGSizeMake(400, 400)];

NSURL *url = [NSURL fileURLWithPath:file];
mAsset = [[AVURLAsset alloc] initWithURL:url options:nil];

NSArray *tracks = [mAsset tracksWithMediaType:AVMediaTypeVideo];
NSLog(@"Tracks: %lu", (unsigned long)[tracks count]);
mTrack = [tracks objectAtIndex:0];

// Ask the track output to decode into 32-bit BGRA pixel buffers.
NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary *settings = [[NSDictionary alloc] initWithObjectsAndKeys:value, key, nil];

mOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:mTrack outputSettings:settings];
mReader = [[AVAssetReader alloc] initWithAsset:mAsset error:nil];
[mReader addOutput:mOutput];
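(Not shown above: before the first copyNextSampleBuffer I also start the reader, roughly like this.)

// Start decoding; copyNextSampleBuffer returns NULL until this succeeds.
if (![mReader startReading]) {
    NSLog(@"Reader failed to start: %@", mReader.error);
}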
So much for the reader init; now the actual texturing:
CMSampleBufferRef sampleBuffer = [mOutput copyNextSampleBuffer];
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// Upload the decoded BGRA frame into the texture.
glBindTexture(GL_TEXTURE_2D, mTextureHandle);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 600, 400, 0,
             GL_BGRA_EXT, GL_UNSIGNED_BYTE,
             CVPixelBufferGetBaseAddress(pixelBuffer));

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
CFRelease(sampleBuffer);
Everything works well ... except that the rendered image comes out sliced and skewed.
I've even tried looking into AVAssetTrack's preferredTransform matrix, to no avail, since it always returns CGAffineTransformIdentity.
Side-note: if I switch the source to the camera, the image renders fine. Am I missing a decompression step? Shouldn't the asset reader handle that?
Thanks!
I think the CMSampleBuffer's pixel buffer pads each row for performance reasons (row alignment), so each row of the data you upload is wider than the video's nominal width, and you need to use that padded width for the texture.

Try setting the texture width to CVPixelBufferGetBytesPerRow(pixelBuffer) / 4 (that divisor is for a 4-bytes-per-pixel format like BGRA; adjust it if your format differs).
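A minimal sketch of the upload using the stride-derived width, reusing the variable names from your snippet:

CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// bytesPerRow includes the row padding, so for 32BGRA (4 bytes per pixel)
// the actual width of the uploaded data is bytesPerRow / 4.
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
size_t texWidth  = bytesPerRow / 4;
size_t texHeight = CVPixelBufferGetHeight(pixelBuffer);

glBindTexture(GL_TEXTURE_2D, mTextureHandle);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
             (GLsizei)texWidth, (GLsizei)texHeight, 0,
             GL_BGRA_EXT, GL_UNSIGNED_BYTE,
             CVPixelBufferGetBaseAddress(pixelBuffer));

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

The visible video is still only CVPixelBufferGetWidth(pixelBuffer) pixels wide, so scale your texture coordinates by that width divided by texWidth to hide the padding on the right edge.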