I've followed this example and created a custom AudioWorkletProcessor which works as expected. What I'd like to do now is to stream MP3 audio from my server (I'm currently using Python/Flask) into it.
So, for example:

```js
const response = await fetch(url);
const reader = response.body.getReader();
while (true) {
  const {value, done} = await reader.read();
  if (done) break;
  // do something with value
}
```

which gives me a `Uint8Array`. How do I pass its content to the AudioWorklet instead of the current `channel[i] = Math.random() * 2 - 1;`?
Thank you :)
Firstly, MP3 is a compressed audio file format, but the Web Audio API nodes only work with uncompressed sample data. You'll need to use the `decodeAudioData()` method of the `AudioContext` object to convert the bytes of the MP3 file into an `AudioBuffer` object.
Secondly, `decodeAudioData()` isn't really designed for streaming, but because you're using MP3 you're in luck. See [Encoding fails when I fetch audio content partially](https://stackoverflow.com/q/39640901) for more information.
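A minimal sketch of that approach, building on the fetch loop from your question: collect the chunks, join them into one contiguous buffer, and hand that to `decodeAudioData()`. The `concatChunks` helper and `fetchAndDecode` are names I've made up for illustration, not part of any API:

```js
// Join Uint8Array chunks into one contiguous buffer.
function concatChunks(chunks) {
  const total = chunks.reduce((n, c) => n + c.length, 0);
  const out = new Uint8Array(total);
  let offset = 0;
  for (const c of chunks) {
    out.set(c, offset);
    offset += c.length;
  }
  return out;
}

// Fetch the MP3 in chunks, then decode the whole thing on the main thread.
async function fetchAndDecode(url, audioCtx) {
  const response = await fetch(url);
  const reader = response.body.getReader();
  const chunks = [];
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  // decodeAudioData() wants an ArrayBuffer of the downloaded MP3 bytes.
  return audioCtx.decodeAudioData(concatChunks(chunks).buffer);
}
```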
Thirdly, the `AudioContext` object isn't accessible from inside an `AudioWorkletProcessor`, so you'll have to call `decodeAudioData()` from the main thread and then pass the decompressed data from your `AudioWorkletNode` to your `AudioWorkletProcessor` using their respective message ports, which are accessible from the `port` property of each object.
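A sketch of the processor side, assuming the main thread posts `Float32Array` chunks of samples over the port. The processor name `'stream-processor'` and the `pullSamples` helper are my inventions, and this handles only a single channel:

```js
// Pull up to out.length samples from a queue of Float32Array chunks.
// Returns how many samples were copied; the rest of `out` stays silent (zeros).
function pullSamples(queue, out) {
  let written = 0;
  while (written < out.length && queue.length > 0) {
    const head = queue[0];
    const take = Math.min(head.length, out.length - written);
    out.set(head.subarray(0, take), written);
    written += take;
    if (take === head.length) queue.shift();
    else queue[0] = head.subarray(take); // keep the unread tail
  }
  return written;
}

// Registration only makes sense inside the AudioWorklet global scope;
// the guard lets this file also load elsewhere (e.g. for testing the helper).
if (typeof AudioWorkletProcessor !== 'undefined') {
  class StreamProcessor extends AudioWorkletProcessor {
    constructor() {
      super();
      this.queue = [];
      // Main thread posts Float32Array chunks of decoded samples.
      this.port.onmessage = (e) => this.queue.push(e.data);
    }
    process(inputs, outputs) {
      const channel = outputs[0][0];
      pullSamples(this.queue, channel);
      return true; // keep the processor alive
    }
  }
  registerProcessor('stream-processor', StreamProcessor);
}
```

The queue absorbs the mismatch between how fast chunks arrive and the fixed 128-sample blocks `process()` is asked for.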
Fourthly, `AudioBuffer` isn't one of the allowed types that can be sent through a message port using `postMessage()`. Fortunately, the `Float32Array` returned by the buffer's `getChannelData()` method is one of the supported types.
I'm not sure what your reason is for using an audio worklet. It depends on what you want to do with the MP3, but if all you want to do is play it, there are simpler solutions that involve lower CPU usage.
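For instance, a media element streams and decodes the MP3 natively, no worklet involved; this sketch (the URL and the optional graph routing are assumptions) is all plain playback needs:

```js
// Play an MP3 URL via a media element; optionally route it through a
// Web Audio graph with createMediaElementSource().
function playMp3(url, audioCtx) {
  const audio = new Audio(url); // streams and decodes the MP3 itself
  if (audioCtx) {
    const source = audioCtx.createMediaElementSource(audio);
    source.connect(audioCtx.destination);
  }
  return audio.play();
}
```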