I have a `POST` endpoint that returns an MP3 stream, and I want to play it in JavaScript using the native browser fetch API. The goal is to implement a cross-browser solution, but so far, nothing works in Firefox. Since I need to use a `POST` endpoint, using `new Audio(url)` isn't an option. To make this easier to reproduce, I will use a `GET` endpoint for a random radio station that streams MP3, but the issue remains the same.
What I’ve tried:
**MediaSource**
```javascript
const playMP3Stream = async () => {
  const audio = new Audio();
  const mediaSource = new MediaSource();
  audio.src = URL.createObjectURL(mediaSource);

  mediaSource.onsourceopen = async () => {
    const sourceBuffer = mediaSource.addSourceBuffer("audio/mpeg");
    const response = await fetch("http://sc6.radiocaroline.net:8040/mp3");
    if (!response.ok) return;
    const reader = response.body?.getReader();
    if (!reader) return;

    while (mediaSource.readyState === "open") {
      const { value, done } = await reader.read();
      if (done) {
        mediaSource.endOfStream();
        break;
      }
      sourceBuffer.appendBuffer(value);
      // appendBuffer() is asynchronous: wait for "updateend" before appending
      // the next chunk, otherwise appendBuffer() throws an InvalidStateError.
      await new Promise((resolve) =>
        sourceBuffer.addEventListener("updateend", resolve, { once: true })
      );
    }
  };

  await audio.play();
};
```
**AudioContext**
```javascript
const playMP3Stream = async () => {
  const audioContext = new (window.AudioContext || window.webkitAudioContext)();
  const response = await fetch("http://sc6.radiocaroline.net:8040/mp3");
  const reader = response.body.getReader();

  let streamBuffer = new Uint8Array();
  let startTime = audioContext.currentTime;

  async function processChunk() {
    const { done, value } = await reader.read();
    if (done) return;

    try {
      // Accumulate raw bytes until there is enough data to decode.
      const newBuffer = new Uint8Array(streamBuffer.length + value.length);
      newBuffer.set(streamBuffer);
      newBuffer.set(value, streamBuffer.length);
      streamBuffer = newBuffer;

      if (streamBuffer.length > 10000) {
        // decodeAudioData() expects a complete resource; arbitrary chunk
        // boundaries usually cut an MP3 frame in half, which is what
        // produces the audible gaps between buffers.
        const audioBuffer = await audioContext.decodeAudioData(streamBuffer.buffer);
        const source = audioContext.createBufferSource();
        source.buffer = audioBuffer;
        source.connect(audioContext.destination);
        source.start(startTime);
        startTime += audioBuffer.duration;
        streamBuffer = new Uint8Array();
      }
    } catch (error) {
      console.error("Error decoding audio chunk:", error);
    }
    processChunk();
  }

  processChunk();
};
```
`MediaSource` works well in Chromium browsers, but unfortunately the `"audio/mpeg"` MIME type is not supported in Firefox. The `AudioContext` approach works to some extent, but there are gaps between chunks, and it still only functions in Chromium browsers. I'm not sure if this is particularly difficult or if I'm just struggling with the implementation, but I can't figure out how to play an MP3 audio stream in Firefox.
I would greatly appreciate any help, whether it's a working example of gapless MP3 streaming in Firefox or anything else that could help me make this work.
Thanks in advance!
`MediaSource` does not support MP3 in Firefox, and `AudioContext` is not suitable for continuous MP3 streaming.
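You can verify the first point at runtime before committing to an MSE-based player. A small sketch (the helper name `canUseMseForMp3` is just illustrative):

```javascript
// Returns true only if Media Source Extensions exist and accept MP3.
// Firefox reports false for "audio/mpeg", which is why
// addSourceBuffer("audio/mpeg") fails there while Chromium accepts it.
function canUseMseForMp3() {
  return (
    typeof MediaSource !== "undefined" &&
    MediaSource.isTypeSupported("audio/mpeg")
  );
}
```

Checking `isTypeSupported()` first lets you fall back to another strategy instead of catching the `NotSupportedError` thrown by `addSourceBuffer`.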
The best solution to this issue could be an instance of the `Audio` class in JS.
A `POST` request complicates playback because browser-based players (`<audio>`, `new Audio()`) only work with `GET`: they expect a simple, directly accessible URL. Even if the response includes `Content-Type: audio/mpeg`, the browser player does not allow direct streaming via `POST`.
Therefore, the best approach is for the server to handle the `POST` request, then make a `GET` request to the actual stream and forward it to the client. This way, `<audio>` can receive the stream via `GET`, ensuring a cross-browser solution.
In HTML:

```html
<audio controls autoplay>
  <source src="http://localhost:3000/proxy-stream" type="audio/mpeg">
  ohh hell no, your browser does not support the audio element #sad
</audio>
```
Or in JS:

```javascript
const audio = new Audio("your-site/proxy-stream");
audio.play();
```
`"your-site/proxy-stream"` is an API endpoint that acts as a proxy server. It is not a direct audio file but a server route that receives a request and returns an MP3 stream. On the client side, we can access this endpoint using `<audio>` or `new Audio()`, while the server fetches the actual MP3 stream (http://mp3.some-real-stream.net:8040/mp3) and forwards it back to the client side.