In iOS 9.1, when I configure the AudioSession and request a fixed buffer size, the OS returns a smaller buffer. Why does that happen? In earlier versions (< 9.1) it worked like a charm.
// Create a new audio input queue
OSStatus result = AudioQueueNewInput(&mAudioFormat,
                                     IOSAudioRecorder::RecorderCallback,
                                     this,                 // userData
                                     nullptr,              // run loop
                                     kCFRunLoopCommonModes, // run loop mode
                                     0,                    // flags
                                     &mAudioQueue);
if (result != 0)
{
    Logger::Error(this, "Failed to create new audio input queue, result: ", result);
    mAudioQueue = nullptr;
}
else
{
    // Allocate memory for the buffers
    for (unsigned int i = 0; i < mNumBuffers; i++)
    {
        AudioQueueAllocateBuffer(mAudioQueue, mBufferFrames * sizeof(short), &mInputBuffer[i]);
        mOutputBuffer[i] = new short[mBufferFrames];
    }
}
In `RecorderCallback` I then receive buffers smaller than the size I requested.
Any clue why that happens?
The buffer size you pass to `AudioQueueAllocateBuffer` is only a maximum: it sets the capacity of each buffer, not a guaranteed fill level. The iOS device hardware is free to return less data per callback. Your app has to accommodate this potentially smaller size, and/or may have to concatenate several returned buffers (in a lock-free circular FIFO, for example) if you need a fixed-length chunk of data.