This is my first post here; I hope you can help me with this. I'm working on code to get AAC playback based on AudioQueue. I tested it with a WAV file and it works. The problem happens when I use either a .caf or .m4a file: as soon as I trigger the play method, this error comes up:
ERROR: >aq> 1608: failed (-66674); will stop (66150/0 frames)
I've been searching for error code -66674 in Apple's developer documentation, and it says the error involves a problem with AudioQueuePrime or AudioQueueStart (right now AudioQueuePrime is not in the code, but I've tested with it as well). I know it's possible to play AAC with AudioQueues, and in my case I believe this is the most reliable way to sync sounds accurately.
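To narrow it down, something like the minimal sketch below can wrap each Core Audio call so the failing call and its OSStatus show up right away (CheckError is just a local helper name, not an Apple API):

#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>

// Local helper (hypothetical name): log which call failed and the OSStatus it returned.
static void CheckError(OSStatus error, const char *operation)
{
    if (error != noErr) {
        NSLog(@"%s failed with OSStatus %d", operation, (int)error);
    }
}

// Example usage around the calls the error points at:
// CheckError(AudioQueuePrime(mQueue, 0, NULL), "AudioQueuePrime");
// CheckError(AudioQueueStart(mQueue, NULL),    "AudioQueueStart");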
- (IBAction)play:(id)sender {
    //OSStatus result;
    NSArray *audioTracks = [NSArray arrayWithObjects:
                            @"/Users/mauro_ptt/Documents/XCODE/SimpleAQPlayViewController/Sample01.caf",
                            nil];
    for (id object in audioTracks) {
        // Open the audio file from an existing NSString path
        NSURL *sndFileURL = [NSURL fileURLWithPath:object];
        AudioFileOpenURL((__bridge CFURLRef)sndFileURL, kAudioFileReadPermission, 0, &mAudioFile);

        // get audio format
        UInt32 dataFormatSize = sizeof(mDataFormat);
        AudioFileGetProperty(mAudioFile, kAudioFilePropertyDataFormat, &dataFormatSize, &mDataFormat);

        // create playback queue
        AudioQueueNewOutput(&mDataFormat, AQOutputCallback, (__bridge void *)(self), CFRunLoopGetCurrent(), kCFRunLoopCommonModes, 0, &mQueue);

        // get buffer size, number of packets to read
        UInt32 maxPacketSize;
        UInt32 propertySize = sizeof(maxPacketSize);
        // get the theoretical max packet size without scanning the entire file
        AudioFileGetProperty(mAudioFile, kAudioFilePropertyPacketSizeUpperBound, &propertySize, &maxPacketSize);
        // get sizes for up to 0.5 seconds of audio
        DeriveBufferSize(mDataFormat, maxPacketSize, 0.5, &bufferByteSize, &mNumPacketsToRead);

        // allocate packet descriptions array
        bool isFormatVBR = (mDataFormat.mBytesPerPacket == 0 || mDataFormat.mFramesPerPacket == 0);
        if (isFormatVBR) {
            mPacketsDescs = (AudioStreamPacketDescription *) malloc(mNumPacketsToRead * sizeof(AudioStreamPacketDescription));
        } else {
            mPacketsDescs = NULL;
        }

        // Get magic cookie (COMPRESSED AAC)
        UInt32 cookieSize = sizeof(UInt32);
        OSStatus couldNotGetProperty = AudioFileGetPropertyInfo(mAudioFile, kAudioFilePropertyMagicCookieData, &cookieSize, NULL);
        if ((couldNotGetProperty == noErr) && cookieSize) {
            char *magicCookie = (char *) malloc(cookieSize);
            AudioFileGetProperty(mAudioFile, kAudioFilePropertyMagicCookieData, &cookieSize, magicCookie);
            AudioQueueSetProperty(mQueue, kAudioQueueProperty_MagicCookie, magicCookie, cookieSize);
            free(magicCookie);
        }

        // Allocate and prime audio queue buffers
        mCurrentPacket = 0;
        for (int i = 0; i < kNumberBuffers; ++i) {
            AudioQueueAllocateBuffer(mQueue, bufferByteSize, &mBuffers[i]);
            AQOutputCallback((__bridge void *)(self), mQueue, mBuffers[i]);
        }

        mIsRunning = true;
        AudioQueueStart(mQueue, NULL);
    }
}
Thanks in advance!
It turned out to be a problem with AQOutputCallback: you have to be careful to use the correct type for each of the callback's arguments.
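For anyone hitting the same thing: the function passed to AudioQueueNewOutput has to match the AudioQueueOutputCallback type exactly, i.e. void (*)(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer). Below is a rough sketch of how such a callback could look with that signature. The class name is assumed from the project folder in the question, and the ivars (mAudioFile, mCurrentPacket, bufferByteSize, mNumPacketsToRead, mPacketsDescs, mIsRunning) are taken from the question and assumed to be visible to the callback (for example, by defining it in the same file as the class):

static void AQOutputCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer)
{
    SimpleAQPlayViewController *player = (__bridge SimpleAQPlayViewController *)inUserData;
    if (!player->mIsRunning) return;

    // Read up to one buffer's worth of packets from the file.
    UInt32 numBytes   = player->bufferByteSize;
    UInt32 numPackets = player->mNumPacketsToRead;
    OSStatus status = AudioFileReadPacketData(player->mAudioFile, false, &numBytes,
                                              player->mPacketsDescs, player->mCurrentPacket,
                                              &numPackets, inBuffer->mAudioData);
    if (status == noErr && numPackets > 0) {
        // Hand the filled buffer back to the queue (packet descriptions only for VBR data).
        inBuffer->mAudioDataByteSize = numBytes;
        AudioQueueEnqueueBuffer(inAQ, inBuffer,
                                player->mPacketsDescs ? numPackets : 0,
                                player->mPacketsDescs);
        player->mCurrentPacket += numPackets;
    } else {
        // End of file (or read error): let queued audio drain, then stop.
        AudioQueueStop(inAQ, false);
        player->mIsRunning = false;
    }
}

With a signature that matches, the direct calls used to prime the buffers in the play method (AQOutputCallback((__bridge void *)(self), mQueue, mBuffers[i])) also type-check correctly.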