Tags: javascript, websocket, audio-streaming, web-audio-api, audiobuffer

JavaScript: play an ArrayBuffer as audio. Need help solving "decodeAudioData unable to decode audio data"


I have a .NET Core WebSocket server that receives a live audio stream from client A, and I need to relay this live audio to client B (a browser). So I receive the byte array from client A and send it on to client B. The byte array itself is correct: I can convert it into a .wav file and play it without a problem.

In client B (the browser), I try to decode the ArrayBuffer into an AudioBuffer so it can be connected to the output and played.

mediastreamhandler.SendArraySegToAllAsync is where I send the byte array from the server to client B. For now I broadcast to all clients; later this will be modified to send data only to the matching WebSocket connection ID.

private async Task Echo(HttpContext context, WebSocket webSocket)
{
    Debug.WriteLine("Start Echo between WebSocket server & client");
    var buffer = new byte[1024 * 4];

    WebSocketReceiveResult result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);

    while (!result.CloseStatus.HasValue)
    {
        // Echo the chunk back to the sender.
        await webSocket.SendAsync(new ArraySegment<byte>(buffer, 0, result.Count), result.MessageType, result.EndOfMessage, CancellationToken.None);

        // Broadcast the same chunk to the browser clients before the next
        // receive overwrites the buffer (and before a close frame arrives).
        await mediastreamhandler.SendArraySegToAllAsync(new ArraySegment<byte>(buffer, 0, result.Count));

        result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
    }
    Debug.WriteLine("Close Echo");
    await webSocket.CloseAsync(result.CloseStatus.Value, result.CloseStatusDescription, CancellationToken.None);
}

I then receive the audio byte array through websocket.onmessage in JavaScript and pass it on to be decoded and played. Here it fails with "unable to decode audio data"; Firefox additionally reports that the content format was unknown (do I need to reformat the byte array I receive?). The byte array itself is fine, because I've used the same bytes to create a .wav file locally and played it without any problem.
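One receiving-side detail worth ruling out first: a browser WebSocket delivers binary frames as a Blob by default, so the socket must be switched to ArrayBuffer mode before `event.data` can be passed to a decoder. A minimal sketch (the function name and the `playSound` call are illustrative, not from the original code):

```javascript
// Sketch: configure the client-B socket so binary frames arrive as
// ArrayBuffer rather than the default Blob.
function openAudioSocket(url) {
    var ws = new WebSocket(url);
    ws.binaryType = "arraybuffer"; // default is "blob"
    ws.onmessage = function (event) {
        playSound(event.data);     // event.data is now an ArrayBuffer
    };
    return ws;
}
```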

var ctx = new AudioContext();

function playSound(arrBuff) {
    var myAudioBuffer;
    var src = ctx.createBufferSource();

    ctx.decodeAudioData(arrBuff, function (buffer) {
        myAudioBuffer = buffer;
    });

    src.buffer = myAudioBuffer;
    src.connect(ctx.destination);
    src.start();
}

I then tried another method to decode and play the audio; this time it played white noise instead of the audio from client A.

var ctx = new AudioContext();

function playSound(arrBuff) {
    var src = ctx.createBufferSource();

    var myAudioBuffer = ctx.createBuffer(1, arrBuff.byteLength, 8000);
    var nowBuffering = myAudioBuffer.getChannelData(0);
    for (var i = 0; i < arrBuff.byteLength; i++) {
        nowBuffering[i] = arrBuff[i];
    }

    src.buffer = myAudioBuffer;
    src.connect(ctx.destination);
    src.start();
}
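A likely reason for the white noise: channel data expects Float32 samples in the [-1, 1] range, while the loop above writes raw values straight in (and indexing an ArrayBuffer with `arrBuff[i]` yields `undefined`); `byteLength` also counts bytes, not 16-bit samples. A conversion sketch, assuming client A sends raw 16-bit signed little-endian PCM (an assumption about the capture format that should be verified):

```javascript
// Sketch: reinterpret the incoming bytes as 16-bit signed PCM samples
// and scale each one into the [-1, 1] range that channel data expects.
function pcm16ToFloat32(arrBuff) {
    var int16 = new Int16Array(arrBuff);       // 2 bytes per sample
    var float32 = new Float32Array(int16.length);
    for (var i = 0; i < int16.length; i++) {
        float32[i] = int16[i] / 32768;         // -32768..32767 -> [-1, 1)
    }
    return float32;
}
```

With that in place, the buffer would be created with the converted sample count (not `arrBuff.byteLength`) and filled via `copyToChannel`.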

I really need some help here. I've been trying to play this array buffer for weeks without a breakthrough, and I'm stuck. Could you kindly guide me, or suggest any other approach? Thanks very much in advance.


Solution

  • decodeAudioData() requires a complete file, so it can't be used to decode partial chunks of data as they arrive from a WebSocket. If you can stream Opus audio files over your WebSocket, you can play them back with an available WebAssembly decoder. See:
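If streaming a compressed format isn't an option, another common approach is to bypass decodeAudioData() entirely: send raw PCM, convert each chunk to Float32 samples, and schedule the resulting AudioBuffers back-to-back on the context clock. A minimal sketch (the function name, mono channel, and sample-rate parameter are assumptions, not part of the original code):

```javascript
// Sketch: gapless playback of successive PCM chunks by scheduling each
// AudioBuffer at an accumulating position on the context timeline.
var nextStartTime = 0; // running end time of the audio scheduled so far

function scheduleChunk(ctx, samples, sampleRate) {
    var audioBuffer = ctx.createBuffer(1, samples.length, sampleRate);
    audioBuffer.copyToChannel(samples, 0);

    var src = ctx.createBufferSource();
    src.buffer = audioBuffer;
    src.connect(ctx.destination);

    // Queue this chunk right after the previous one, or play immediately
    // if the queue has drained, so chunks play in order without gaps.
    nextStartTime = Math.max(nextStartTime, ctx.currentTime);
    src.start(nextStartTime);
    nextStartTime += audioBuffer.duration;
}
```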