I'm using the Screen Capture API and am trying to save the final capture to a video file (WebM, MP4, etc.). I have these two JavaScript functions:
```javascript
async function startCapture() {
  try {
    videoElem.srcObject = await navigator.mediaDevices.getDisplayMedia(displayMediaOptions);
  } catch (err) {
    console.error("Error: " + err);
  }
}

function stopCapture() {
  let tracks = videoElem.srcObject.getTracks();
  tracks.forEach((track) => track.stop());
  videoElem.srcObject = null;
}
```
The video displays live just fine once the capture starts, but I'm not sure how to actually store its contents. `getDisplayMedia()` returns a Promise, so after the `await`, `videoElem.srcObject` holds a `MediaStream`, and `tracks` is an array of `MediaStreamTrack` objects. This is my first time doing any kind of web development, so I'm a bit lost!
Recording a media element on the MDN docs helped me a ton. Basically, instead of using `getUserMedia()`, we use `getDisplayMedia()`, and feed the resulting stream into a `MediaRecorder`.
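For anyone landing here later, this is roughly what that looks like when folded into the question's two functions. It's a sketch, not a drop-in: it assumes `videoElem` and `displayMediaOptions` exist as in the original question, and `startRecording`/`stopRecording` and the `"capture.webm"` filename are names I made up. `MediaRecorder` collects the stream into chunks via `dataavailable` events; on stop, the chunks are joined into a Blob and downloaded through a temporary anchor element.

```javascript
// Assumes videoElem and displayMediaOptions from the original question.
let recorder;
let chunks = [];

async function startRecording() {
  const stream = await navigator.mediaDevices.getDisplayMedia(displayMediaOptions);
  videoElem.srcObject = stream;

  chunks = [];
  // "video/webm" is broadly supported for screen recording; MP4 support varies by browser.
  recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
  recorder.ondataavailable = (event) => {
    // Each event carries a chunk of encoded video; keep the non-empty ones.
    if (event.data.size > 0) chunks.push(event.data);
  };
  recorder.onstop = () => {
    // Join the chunks into one Blob and trigger a download of the file.
    const blob = new Blob(chunks, { type: "video/webm" });
    const url = URL.createObjectURL(blob);
    const a = document.createElement("a");
    a.href = url;
    a.download = "capture.webm";
    a.click();
    URL.revokeObjectURL(url);
  };
  recorder.start();
}

function stopRecording() {
  recorder.stop(); // flushes a final dataavailable event, then fires onstop
  videoElem.srcObject.getTracks().forEach((track) => track.stop());
  videoElem.srcObject = null;
}
```

Note that `MediaRecorder.stop()` is what finalizes the file; stopping the tracks alone (as in the original `stopCapture()`) ends the capture without saving anything.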