I have an audio synthesis app that uses the buffer size and a seekbar value to change the tempo: the smaller the effective buffer size, the faster the tempo. The seekbar value is subtracted from the buffer size, so the further the seekbar is moved, the faster the tempo.
The buffer size is calculated using:
int buffersize = AudioTrack.getMinBufferSize(sr, // sr is the sample rate in Hz
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
On a Samsung Galaxy S6 handset, the returned buffer size is 10584. The bottom and top of the slider's range give 60 bpm and 192 bpm respectively.
However, on a 7" LG G Pad 7.0, the returned buffer size is only 3528. As a result, the starting tempo is around 180-200 bpm rather than the desired 60 bpm.
I was thinking of hard-coding a buffer size based on screen size, but that's just a band-aid.
Why is the same calculation returning two different buffer sizes across the two devices? How is the buffer size calculated?
The minimum buffer size is expected to differ between devices. It is optimized for the underlying hardware, which varies. That's why there is a method to query this value rather than instructions on how to compute it yourself.
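As a quick illustration (the class name, log tag, and method here are purely for the example), you can log what the framework reports on each device; the number tracks the device's audio hardware and output path, not anything in your app:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.util.Log;

public class BufferSizeCheck {
    // Logs the device's native output rate and the matching minimum buffer size.
    // The same call legitimately returns 10584 on one device and 3528 on another,
    // because the minimum reflects that device's audio hardware and driver.
    public static void logMinBufferSize() {
        int nativeRate = AudioTrack.getNativeOutputSampleRate(AudioManager.STREAM_MUSIC);
        int minBuf = AudioTrack.getMinBufferSize(nativeRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        Log.d("AudioInfo", "native rate=" + nativeRate + " Hz, min buffer=" + minBuf + " bytes");
    }
}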
An AudioTrack buffer size has nothing to do with tempo. It's simply the minimum number of bytes that an AudioTrack must allocate in order to operate reliably. The bytes you send to an AudioTrack can contain whatever you want to play, as long as they conform to the format you used to create the AudioTrack.
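As a minimal sketch of what that means in practice (assuming a metronome-style click and a 44.1 kHz sample rate; names like samplesPerBeat and the 880 Hz click are illustrative, not from your app), derive the beat length from the BPM and the sample rate, and treat the minimum buffer size purely as an allocation hint:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class TempoSketch {
    public void play(int bpm) {
        int sr = 44100; // assumed sample rate in Hz
        int minBuf = AudioTrack.getMinBufferSize(sr,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sr,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf, AudioTrack.MODE_STREAM);
        track.play();

        // One beat lasts 60 / bpm seconds, i.e. sr * 60 / bpm samples,
        // regardless of what getMinBufferSize() returned on this device.
        int samplesPerBeat = sr * 60 / bpm;
        short[] beat = new short[samplesPerBeat];
        int clickSamples = sr / 50; // ~20 ms click at the start of each beat
        for (int i = 0; i < clickSamples; i++) {
            beat[i] = (short) (Math.sin(2 * Math.PI * 880 * i / sr) * Short.MAX_VALUE * 0.5);
        }

        // write() blocks until the data is queued, so writing one beat's worth
        // of samples per iteration gives the same tempo on every device.
        for (int n = 0; n < 8; n++) {
            track.write(beat, 0, beat.length);
        }
        track.stop();
        track.release();
    }
}

With this approach the seekbar would map directly to a BPM value (60-192) instead of being subtracted from a device-dependent buffer size, which is why the tempo no longer drifts between the Galaxy S6 and the G Pad.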