
How to mirror/clone a WebXR 'immersive-vr' HMD view to the browser


How would one go about mirroring or cloning the WebXR 'immersive-vr' view from an HMD like the VIVE or Oculus in the browser, using the same WebGL canvas?

There is much discussion about copying the pixels to a texture2D and then applying that as a render texture, or about completely re-drawing the entire scene with an adjusted viewTransform. These work well if you are rendering a different view, such as a remote camera or a third-person spectator view; however, both are a waste of resources if you only want to mirror the current HMD view on the desktop.

Self-answered below, as there was no solid answer when I ran into this and I'd like to save future devs the time. (Especially if they're not all that savvy with WebGL2 and WebXR.)

Note that I'm not using any existing frameworks for this project, for 'reasons'. It shouldn't change much if you are; you'd just need to perform these steps at the appropriate place in your library's render pipeline.


Solution

  • The answer is delightfully simple, as it turns out, and it barely affects my fps.

    1. Attach the canvas to the DOM and set it to your desired size. (Mine was fluid, so it had a CSS width of 100% of its parent container with a height of auto.)
    2. When you initialize your glContext, be sure to specify that antialiasing is false. This is important if your spectator and HMD views are to be different resolutions: {xrCompatible: true, webgl2: true, antialias: false} (a minimal context-setup sketch appears after the step 5 code below).
    3. Create a framebuffer that will be used to store your rendered HMD view (spectateBuffer).
    4. Draw your 'immersive-vr' layer as usual in your xrSession.requestAnimationFrame(OnXRFrame) callback.
    5. Just prior to exiting your OnXRFrame method, implement a call to draw the spectator view. I personally used a bool showCanvas to allow me to toggle the spectator mirror on and off as desired:
    //a quick reference I like to use for enums and types
    const GL = WebGL2RenderingContext;
    
    //Create a buffer for my spectate view so that I can just re-use it at will.
    let spectateBuffer = _glContext.createFramebuffer();
    
    //Called each frame, as per usual
    function OnXRFrame(timestamp, xrFrame){
        //Bind my spectate framebuffer as the WebGL2 read framebuffer
        _glContext.bindFramebuffer(GL.READ_FRAMEBUFFER, spectateBuffer);
    
        //...Get my pose, update my scene objects
        //...Oh my, a bunch of stuff happens here
        //...finally gl.drawElements(GL.TRIANGLES...
    
        //render spectator canvas
        if(showCanvas){
            DrawSpectator();
        }
    
        //Request next animation callback
        xrFrame.session.requestAnimationFrame(OnXRFrame);
    }
    
    //A tad more verbose than needed, to illustrate what's going on.
    //You don't need to declare the src and dest x/y's as their own variables
    function DrawSpectator(){
        //Set the DRAW_FRAMEBUFFER to null; this tells the renderer to draw to the canvas.
        _glContext.bindFramebuffer(GL.DRAW_FRAMEBUFFER, null);
    
        //Store last HMD canvas view size (Mine was 0.89:1 aspect, 2296x2552)
        let bufferWidth = _glContext.canvas.width;
        let bufferHeight = _glContext.canvas.height;
    
        //Set canvas view size for the spectator view (Mine was 2:1 aspect, 1280x640)
        _glContext.canvas.width = _glContext.canvas.clientWidth;
        _glContext.canvas.height = _glContext.canvas.clientWidth / 2;
    
        //Define the bounds of the source buffer you want to use
        let srcX0 = 0;
        let srcY0 = bufferHeight * 0.25;    //I crop off the bottom 25% of the HMD's view
        let srcX1 = bufferWidth;
        let srcY1 = bufferHeight - (bufferHeight * 0.25);   //I crop off the top 25% of the HMD's view
    
        //Define the bounds of the output buffer
        let dstY0 = 0;
        let dstX0 = 0;
        let dstY1 = _glContext.canvas.height;
        let dstX1 = _glContext.canvas.width;
    
        //Blit the source buffer to the output buffer
        _glContext.blitFramebuffer(
            srcX0, srcY0, srcX1, srcY1,
            dstX0, dstY0, dstX1, dstY1,
            GL.COLOR_BUFFER_BIT, GL.NEAREST);
    }
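
    For reference, here's a rough sketch of what steps 1 and 2 might look like. It isn't my exact code, just generic bootstrapping: the canvas id and the requestSession/XRWebGLLayer wiring are placeholders, and the context attributes are the ones called out in step 2.

    //Step 1: the canvas is already attached to the DOM and sized via CSS (width: 100%, height: auto)
    const canvas = document.getElementById('xr-canvas'); //hypothetical id

    //Step 2: request a WebGL2 context that is XR compatible and NOT antialiased
    const _glContext = canvas.getContext('webgl2', {
        xrCompatible: true,
        antialias: false
    });

    //Start the immersive session and hand it the same context via an XRWebGLLayer
    navigator.xr.requestSession('immersive-vr').then((xrSession) => {
        xrSession.updateRenderState({
            baseLayer: new XRWebGLLayer(xrSession, _glContext)
        });
        xrSession.requestAnimationFrame(OnXRFrame);
    });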
    

    Note: I'm only showing one of my HMD eye views as the spectator view; to show both, you would need to store a spectator framebuffer per eye and blit them into the canvas side by side.
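
    If you do want both eyes, the side-by-side blit might look roughly like this. leftEyeBuffer, rightEyeBuffer, eyeWidth and eyeHeight are placeholder names for the per-eye framebuffers and their pixel dimensions, and the canvas is assumed to already be sized for the spectator view:

    //Rough sketch: blit each eye's framebuffer into one half of the canvas
    function DrawSpectatorStereo(eyeWidth, eyeHeight){
        //Draw to the canvas itself
        _glContext.bindFramebuffer(GL.DRAW_FRAMEBUFFER, null);

        let halfCanvas = _glContext.canvas.width / 2;
        let canvasHeight = _glContext.canvas.height;

        //Left eye -> left half of the canvas
        _glContext.bindFramebuffer(GL.READ_FRAMEBUFFER, leftEyeBuffer);
        _glContext.blitFramebuffer(
            0, 0, eyeWidth, eyeHeight,
            0, 0, halfCanvas, canvasHeight,
            GL.COLOR_BUFFER_BIT, GL.NEAREST);

        //Right eye -> right half of the canvas
        _glContext.bindFramebuffer(GL.READ_FRAMEBUFFER, rightEyeBuffer);
        _glContext.blitFramebuffer(
            0, 0, eyeWidth, eyeHeight,
            halfCanvas, 0, _glContext.canvas.width, canvasHeight,
            GL.COLOR_BUFFER_BIT, GL.NEAREST);
    }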

    I hope this saves future googlers some pain.