I started developing a (group) video chat application with WebRTC in an SFU configuration and Socket.io as the signaling server. The video works as expected; the only problem I'm currently facing is that there is no audio coming back from the SFU. I can verify in the Chrome logs that the audio is sent to the SFU, but none is received on the other peers.
This is where the tracks are copied from the producer to the consumers. Full back-end (SFU) code
remoteStream.getTracks().forEach((track) => {
  consumer.addTrack(track, remoteStream);
});
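While debugging this kind of issue, it helps to confirm at the SFU that both an audio and a video track are actually being forwarded; a video-only list here shows the audio was lost before the forwarding step. A small sketch (the helper name is mine, and the stand-in stream only mimics the `getTracks()` surface so it can run outside a browser):

```javascript
// Hypothetical debugging helper: list the kinds of the tracks about to be
// forwarded, so a missing audio track is caught at the SFU rather than on
// the receiving client. `stream` is anything exposing getTracks().
function forwardedKinds(stream) {
  return stream.getTracks().map((track) => track.kind);
}

// Stand-in for a MediaStream when testing outside a browser.
const fakeStream = {
  getTracks: () => [{ kind: 'audio' }, { kind: 'video' }],
};

console.log(forwardedKinds(fakeStream)); // ['audio', 'video']
```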
This is where the remote stream / tracks are added to the video element. Full front-end (Client) code
if (video) {
  remoteStream.getTracks().forEach((track) => {
    video.srcObject.addTrack(track);
  });
} else {
  const video = document.createElement('video');
  video.id = `down_${peer.id}`;
  video.srcObject = remoteStream;
  video.autoplay = true;
  video.muted = false;
  container.appendChild(video);
}
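Note that the first branch assumes `video.srcObject` has already been set; if the element exists but never had a stream attached, `video.srcObject.addTrack(track)` throws on `null`. A defensive variant (the helper name is my own, and it only touches the `srcObject`/`addTrack`/`getTracks` surface, so it can be exercised outside a browser):

```javascript
// Hypothetical helper: attach a remote stream to a video element,
// setting srcObject if absent, otherwise appending the tracks to the
// stream already attached.
function attachStream(video, remoteStream) {
  if (!video.srcObject) {
    video.srcObject = remoteStream;
    return;
  }
  remoteStream.getTracks().forEach((track) => {
    video.srcObject.addTrack(track);
  });
}
```

In the client code above, this would replace the body of the `if (video)` branch.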
Outbound audio track (producer)
Inbound audio track (consumer)
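The same check shown in the screenshots can be scripted with `RTCPeerConnection.getStats()`: `outbound-rtp` entries with `kind: 'audio'` should show `packetsSent` growing on the sender, and `inbound-rtp` entries should show `packetsReceived` growing on the receiver. A small summarizer (it operates on the map-like stats report, so it can be tested with a plain `Map`):

```javascript
// Summarize audio RTP stats from an RTCStatsReport (or any map-like
// collection of stats entries). type, kind, packetsSent and
// packetsReceived are standard WebRTC stats fields.
function audioRtpSummary(report) {
  const summary = { sent: 0, received: 0 };
  report.forEach((stats) => {
    if (stats.kind !== 'audio') return;
    if (stats.type === 'outbound-rtp') summary.sent += stats.packetsSent;
    if (stats.type === 'inbound-rtp') summary.received += stats.packetsReceived;
  });
  return summary;
}

// In the browser: audioRtpSummary(await pc.getStats());
```

If `sent` grows on the producer connection while `received` stays at 0 on a consumer, the audio is being dropped inside the SFU, which matches the symptom described above.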
The problem was storing the received media stream.
let trackCount = 1;
pc.ontrack = (event) => {
  const tracks = event.streams[0].getTracks();
  // Keep the first stream; append tracks from later events instead of
  // overwriting the stored stream.
  if (!producer.stream) producer.stream = event.streams[0];
  else
    tracks.forEach((track) => {
      producer.stream?.addTrack(track);
    });
  // Notify the other members only once the last track has arrived.
  if (trackCount === tracks.length)
    getOthers(room).forEach((producerId) =>
      io.to(producerId).emit('producerJoined', socket.id),
    );
  else trackCount = trackCount + 1;
};
I've updated the code to the above. Long story short: every time a new track was received, it would overwrite the whole MediaStream object and thus discard the audio track, since the audio track was the first one received. Now I add the tracks one by one and notify the other members when the last track is added.
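To make the failure mode concrete, here is a minimal simulation of the overwriting handler versus the accumulating one. The stream class is a plain stand-in for `MediaStream` (the real objects come from the WebRTC implementation), and the event shape mirrors the behaviour described above, where each `ontrack` event delivered one new track:

```javascript
// Plain stand-in for MediaStream, just enough surface for the demo.
class StreamStandIn {
  constructor(tracks) { this.tracks = [...tracks]; }
  getTracks() { return [...this.tracks]; }
  addTrack(track) { this.tracks.push(track); }
}

// Each ontrack event delivers one new track wrapped in a stream;
// the audio track arrives first.
const audioEvent = { streams: [new StreamStandIn([{ kind: 'audio' }])] };
const videoEvent = { streams: [new StreamStandIn([{ kind: 'video' }])] };

// Old handler: overwrites the stored stream, losing the audio track.
const buggy = {};
[audioEvent, videoEvent].forEach((event) => {
  buggy.stream = event.streams[0];
});

// Fixed handler: keep the first stream, append tracks from later events.
const fixed = {};
[audioEvent, videoEvent].forEach((event) => {
  if (!fixed.stream) fixed.stream = event.streams[0];
  else event.streams[0].getTracks().forEach((t) => fixed.stream.addTrack(t));
});

console.log(buggy.stream.getTracks().map((t) => t.kind)); // ['video']
console.log(fixed.stream.getTracks().map((t) => t.kind)); // ['audio', 'video']
```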