So I'm writing an audio decoder for a pre-existing video decoder that works in libGDX. The problem is that before the audio code was threaded, both the audio and video were choppy: the audio would play a chunk, and then the video would play a chunk.
My solution was to do some multithreading and let the video stuff do its work on the render thread (libGDX render threads are not thread-safe, and messing with them causes bad things without fail). The natural choice, then, was to move the audio work onto its own thread.
This fixes the video choppiness, but not only is the audio still choppy, it's got artifacts all over the place.
This is my first ever stab at serious audio programming, so keep in mind I might not know something basic. The executor service is a SingleThreadExecutor, the idea being that the audio needs to be decoded and written out in order.
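To show what I mean by "in order": a single-thread executor runs submitted tasks sequentially, in submission order. Here's a minimal, self-contained sketch of that property (the class and method names are just for the demo, not part of my player):

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class OrderDemo {
    // Submits numbered tasks to a single-thread executor and returns
    // the order in which they actually ran.
    static List<Integer> runInOrder(int taskCount) throws InterruptedException {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        List<Integer> order = new CopyOnWriteArrayList<>();
        for (int i = 0; i < taskCount; i++) {
            final int n = i;
            pool.execute(() -> order.add(n));
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        return order;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runInOrder(5)); // tasks ran in submission order: [0, 1, 2, 3, 4]
    }
}
```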
Here's the update method:
public boolean update(float dtSeconds) {
    if (playState != PlayState.PLAYING) return false;

    long dtMilliseconds = (long) (dtSeconds * 1000);
    playTimeMilliseconds += dtMilliseconds;

    sleepTimeoutMilliseconds = Math.max(0, sleepTimeoutMilliseconds - dtMilliseconds);
    if (sleepTimeoutMilliseconds > 0) {
        // The current frame is still ahead of the playhead - do nothing
        return false;
    }

    while (true) {
        int packetReadResult = container.readNextPacket(packet);
        if (packetReadResult < 0) {
            // Got bad packet - we've reached the end of the video stream
            stop();
            return true;
        }

        if (packet.getStreamIndex() == videoStreamId) {
            // We have a valid packet from our stream.
            // Allocate a new picture to get the data out of Xuggler.
            IVideoPicture picture = IVideoPicture.make(
                    videoCoder.getPixelType(),
                    videoCoder.getWidth(),
                    videoCoder.getHeight()
            );

            // Attempt to read the entire packet
            int offset = 0;
            while (offset < packet.getSize()) {
                // Decode the video, checking for any errors
                int bytesDecoded = videoCoder.decodeVideo(picture, packet, offset);
                if (bytesDecoded < 0) {
                    throw new RuntimeException("Got error decoding video");
                }
                offset += bytesDecoded;

                /* Some decoders will consume data in a packet but will not
                 * be able to construct a full video picture yet, so always
                 * check whether the decoder produced a complete picture.
                 */
                if (picture.isComplete()) {
                    // We've read the entire packet
                    IVideoPicture newPic = picture;

                    // Timestamps are stored in microseconds - convert to milliseconds
                    long absoluteFrameTimestampMilliseconds = picture.getTimeStamp() / 1000;
                    long relativeFrameTimestampMilliseconds =
                            absoluteFrameTimestampMilliseconds - firstTimestampMilliseconds;
                    long frameTimeDelta = relativeFrameTimestampMilliseconds - playTimeMilliseconds;

                    if (frameTimeDelta > 0) {
                        // The video is ahead of the playhead; don't read any more frames until it catches up
                        sleepTimeoutMilliseconds = frameTimeDelta + sleepTolleranceMilliseconds;
                        return false;
                    }

                    /* If the resampler is not null, we didn't get the video in
                     * BGR24 format and need to convert it into BGR24 format.
                     */
                    if (resampler != null) {
                        // Resample the frame
                        newPic = IVideoPicture.make(
                                resampler.getOutputPixelFormat(),
                                picture.getWidth(), picture.getHeight()
                        );
                        if (resampler.resample(newPic, picture) < 0) {
                            throw new RuntimeException("Could not resample video");
                        }
                    }
                    if (newPic.getPixelType() != IPixelFormat.Type.BGR24) {
                        throw new RuntimeException("Could not decode video as BGR 24-bit data");
                    }

                    // And finally, convert the BGR24 picture to a Java BufferedImage
                    BufferedImage javaImage = Utils.videoPictureToImage(newPic);

                    // Update the current texture
                    updateTexture(javaImage);

                    // Let the caller know the texture has changed
                    return true;
                }
            }
        } else if (packet.getStreamIndex() == this.audioStreamId) {
            IAudioSamples samples = IAudioSamples.make(1024, audioCoder.getChannels());
            Thread thread = new Thread(new DecodeSoundRunnable(samples));
            thread.setPriority(Thread.MAX_PRIORITY);
            this.decodeThreadPool.execute(thread);
        }
    }
}
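For reference, the frame-pacing math in the middle of update() boils down to this small helper (framePacingDelayMillis and FramePacing are hypothetical names for illustration; the parameters mirror the fields above):

```java
public class FramePacing {
    // Mirrors the timing logic in update(): returns how many milliseconds
    // to wait before showing a frame, or 0 to show it now.
    static long framePacingDelayMillis(long frameTimestampMicros,
                                       long firstTimestampMillis,
                                       long playTimeMillis,
                                       long toleranceMillis) {
        long absoluteMillis = frameTimestampMicros / 1000; // Xuggler timestamps are in microseconds
        long relativeMillis = absoluteMillis - firstTimestampMillis;
        long delta = relativeMillis - playTimeMillis;
        return delta > 0 ? delta + toleranceMillis : 0;
    }

    public static void main(String[] args) {
        // A frame stamped at 40 ms against a 20 ms playhead waits 20 ms plus the tolerance
        System.out.println(framePacingDelayMillis(40000, 0, 20, 5)); // 25
        // A frame already behind the playhead is shown immediately
        System.out.println(framePacingDelayMillis(40000, 0, 50, 5)); // 0
    }
}
```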
Here's the audio thread:
private class DecodeSoundRunnable implements Runnable {
    IAudioSamples samples;
    int offset = 0;
    IStreamCoder coder;

    public DecodeSoundRunnable(IAudioSamples samples) {
        this.samples = samples.copyReference();
        this.coder = audioCoder.copyReference();
    }

    @Override
    public void run() {
        while (offset < packet.getSize()) {
            int bytesDecoded = this.coder.decodeAudio(samples, packet, offset);
            if (bytesDecoded < 0)
                break; //throw new RuntimeException("got error decoding audio in: " + videoPath);
            offset += bytesDecoded;
        }
        playJavaSound(samples, 0);
        //writeOutThreadPool.execute(new WriteOutSoundRunnable(samples, 0));
    }
}
Solved this by making a dedicated thread that only writes out the audio data. This works because mLine.write(bytes, offset, length) blocks while it's writing data, so a single write-out thread naturally paces itself against the audio hardware.
private class WriteOutSoundBytes implements Runnable {
    byte[] rawByte;

    public WriteOutSoundBytes(byte[] rawBytes) {
        rawByte = rawBytes;
    }

    @Override
    public void run() {
        mLine.write(rawByte, 0, rawByte.length);
    }
}
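To make the point concrete, here's a self-contained sketch of the final setup, with a ByteArrayOutputStream standing in for mLine so it runs without an audio device (WriteOutDemo, line, and drain are demo-only names): chunks submitted to a single-thread write-out executor leave in exactly the order they were decoded.

```java
import java.io.ByteArrayOutputStream;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class WriteOutDemo {
    // Stand-in for the blocking SourceDataLine; a real mLine.write() blocks similarly.
    static final ByteArrayOutputStream line = new ByteArrayOutputStream();
    static final ExecutorService writeOutThreadPool = Executors.newSingleThreadExecutor();

    static class WriteOutSoundBytes implements Runnable {
        private final byte[] rawByte;
        WriteOutSoundBytes(byte[] rawBytes) { rawByte = rawBytes; }

        @Override
        public void run() {
            // Same shape as the real runnable: write the whole chunk, blocking as needed
            line.write(rawByte, 0, rawByte.length);
        }
    }

    // Waits for all queued writes to finish, then returns everything written so far.
    static byte[] drain() throws InterruptedException {
        writeOutThreadPool.shutdown();
        writeOutThreadPool.awaitTermination(5, TimeUnit.SECONDS);
        return line.toByteArray();
    }

    public static void main(String[] args) throws InterruptedException {
        // Chunks are submitted in decode order; the single thread writes them in that order
        writeOutThreadPool.execute(new WriteOutSoundBytes(new byte[]{1, 2}));
        writeOutThreadPool.execute(new WriteOutSoundBytes(new byte[]{3, 4}));
        System.out.println(java.util.Arrays.toString(drain())); // [1, 2, 3, 4]
    }
}
```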