javascript, socket.io, webrtc, html5-audio, audio-streaming

MediaRecorder not Initializing Correctly for Real-time Audio Broadcasting


I'm working on a project where I need to implement real-time audio broadcasting from an admin to multiple users. While researching, I found examples using WebRTC and Socket.IO. My current setup involves an admin streaming audio from their microphone to a server, which then broadcasts the audio to connected users in real time. However, I'm running into issues with the WebRTC and Socket.IO approach, and I'm open to exploring other options as well.

Here's a version of the client-side code I'm using for the admin:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Admin Broadcasting</title>
</head>
<body>
    <h1>Admin Broadcasting</h1>
    <button id="startBroadcastBtn">Start Broadcasting</button>

    <script src="/socket.io/socket.io.js"></script>
    <script>
        const startBroadcastBtn = document.getElementById('startBroadcastBtn');
        const socket = io('ws://localhost:9000');
        let mediaRecorder;

        startBroadcastBtn.addEventListener('click', async () => {
            try {
                const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
                console.log('MediaStream object:', stream);
                const audioTracks = stream.getAudioTracks();
                console.log('Audio tracks:', audioTracks);
                if (audioTracks.length > 0) {
                    console.log('Audio track inside stream:', audioTracks[0]);
                    mediaRecorder = new MediaRecorder(stream);
                    console.log('MediaRecorder object:', mediaRecorder);
                    mediaRecorder.ondataavailable = (event) => {
                        console.log('dataavailable event:', event);
                        if (event.data.size > 0) {
                          console.log('Audio chunk captured:', event.data);
                          socket.emit('admin-audio-chunk', event.data);
                        }
                    };
                    mediaRecorder.start();
                } else {
                    console.error('No audio tracks available');
                }
            } catch (error) {
                console.error('Error accessing media devices:', error);
            }
        });
    </script>
</body>
</html>

On the server side, I'm using Node.js with Socket.IO for handling client connections and broadcasting audio streams.
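
For context, a minimal sketch of the kind of relay I have in mind looks roughly like this (simplified; the port and 'admin-audio-chunk' mirror the client above, and 'audio-chunk' is just a placeholder event name for the listener side):

// server.js – minimal Socket.IO relay (sketch)
const http = require('http');
const { Server } = require('socket.io');

const httpServer = http.createServer();
const io = new Server(httpServer);

io.on('connection', (socket) => {
    // Forward each audio chunk from the admin to every other connected client
    socket.on('admin-audio-chunk', (chunk) => {
        socket.broadcast.emit('audio-chunk', chunk);
    });
});

httpServer.listen(9000);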

The MediaRecorder object seems to be created, but the ondataavailable handler never fires when audio should be available, so no audio data is being sent to the server for broadcasting.

I've checked the browser compatibility and permissions for accessing the microphone, but I'm still unable to figure out why the MediaRecorder object is not initializing correctly.

Can anyone help me identify what might be causing this issue and how to resolve it? Any suggestions or insights would be greatly appreciated.

Thank you!


Solution

  • To get the MediaRecorder to emit regular chunks of data, pass a timeslice (in milliseconds) when you call .start(). Without a timeslice, the dataavailable event only fires when the recording stops or when requestData() is called, which is why your handler never runs.

    mediaRecorder.start(1_000); // Emit data roughly every second
    

    See also: https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder/start
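
  • Alternatively, if you'd rather call .start() without a timeslice, you can ask the recorder for its buffered data on your own schedule with requestData(). A rough sketch (the 1-second interval is just an example value):

    mediaRecorder.start();

    // Without a timeslice, dataavailable only fires on stop() or requestData(),
    // so flush the buffered data on a fixed interval instead.
    const flushInterval = setInterval(() => {
        if (mediaRecorder.state === 'recording') {
            mediaRecorder.requestData();
        } else {
            clearInterval(flushInterval);
        }
    }, 1000);

    Either way, each dataavailable event hands you a Blob that you can emit over the socket as you're already doing.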