I'm trying to emit webm Blobs (Base64-encoded, generated by MediaRecorder) to my Phoenix backend over a WebSocket, where I Base64-decode them and append them to a file. What I end up with is a defective webm video: I can see the first frame when I open it in a player, but it's somewhat pixelated and the colors are off. The video has no duration, and when I try to diagnose the errors with ffmpeg, it says:
[h264 @ 0x7fcbd3804200] decode_slice_header error
[h264 @ 0x7fcbd3804200] no frame!
[h264 @ 0x7fcbd3804200] non-existing PPS 0 referenced
(repeated many times)
I'm not sure where I'm making an error (or whether I'm making one at all). Here's my short client-side code:
// Capture audio and video from the user's devices
const mediaStream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });

const options = {
  audioBitsPerSecond: 128000,
  videoBitsPerSecond: 2500000,
  mimeType: 'video/webm'
};

const mediaRecorder = new MediaRecorder(mediaStream, options);

mediaRecorder.ondataavailable = (e) => {
  // Encode each recorded Blob as a Base64 data URL and push it over the channel
  const reader = new window.FileReader();
  reader.readAsDataURL(e.data);
  reader.onloadend = () => {
    const base64data = reader.result;
    topic.push('video_feed', { data: base64data });
  };
};

// Emit a chunk roughly every 1500 ms
mediaRecorder.start(1500);
My backend Elixir code responsible for file IO and Base64 decoding:
# This part runs once, when the connection is established
{:ok, io} = File.open("some_file.webm", [:binary, :append])

# Everything below runs for each received blob (`feed` is the base64data
# value pushed from the client-side code above)
"data:video/x-matroska;codecs=avc1,opus;base64," <> base64_bit = feed

case Base.decode64(base64_bit) do
  :error -> some_error_handling_code
  {:ok, data} -> IO.binwrite(io, data)
end

{:noreply, state}
Am I messing up the file IO somehow? I suspect that I'm missing some headers or file metadata which media players require.
Alright, after two days of digging and debugging the webm files, I realized this was a client-side issue (code-related).
There was a race condition: I closed the socket from the client side right after invoking mediaRecorder.stop(). That was a mistake, because stopping the recorder always fires one final dataavailable event carrying the remainder of the data (the chunk between the last emission and the moment the recorder was stopped). That event was never sent, since I immediately closed my tab programmatically (wise, I know), so my video was malformed. I had some other code-related issues, but none this big.
Make sure that you receive that final event, something along the lines of:
topic.push('video_feed', { data: base64data })
  .receive('ok', () => {
    // The recorder's state becomes 'inactive' as soon as .stop() is called,
    // so this branch only runs once the final chunk has been pushed
    if (mediaRecorder.state === 'inactive') {
      topic.push('that_was_the_final_chunk', {}).receive('ok', () => {
        nowWeCanSafelyTerminate();
      });
    }
  });
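If you prefer not to track the server acks by hand in that exact shape, another option is to hook the recorder's stop event, which fires after that final dataavailable event, and defer the teardown until every pending push has been acknowledged. A rough sketch under the same assumptions as above (topic and nowWeCanSafelyTerminate come from the earlier snippets; pendingPushes and maybeTerminate are hypothetical helpers introduced just for illustration):

let pendingPushes = 0;

mediaRecorder.ondataavailable = (e) => {
  pendingPushes += 1; // count the chunk before the async FileReader work starts
  const reader = new window.FileReader();
  reader.readAsDataURL(e.data);
  reader.onloadend = () => {
    topic.push('video_feed', { data: reader.result })
      .receive('ok', () => {
        pendingPushes -= 1; // the server has stored this chunk
        maybeTerminate();
      });
  };
};

mediaRecorder.onstop = () => {
  // 'stop' fires after the last 'dataavailable', so every chunk is counted by now;
  // terminate only once the server has acked all of them
  maybeTerminate();
};

function maybeTerminate() {
  if (mediaRecorder.state === 'inactive' && pendingPushes === 0) {
    nowWeCanSafelyTerminate(); // e.g. leave the channel and close the tab
  }
}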