c++ · opengl · framebuffer · openxr

OpenGL: Render Framebuffer into Framebuffer


I am currently working on a C++ program using OpenXR with OpenGL, and I've managed to render everything I want in VR.

I have one framebuffer per eye, plus three more for layers (crosshair/menu) that are shown on different button presses (in VR this is handled via xrEndFrame).

The problem is that I also want to mirror everything onto the computer screen.

Most of it was simple: I just used glBlitNamedFramebuffer to blit one of the eye framebuffers onto the default framebuffer (0) so that it gets displayed.
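For reference, the mirror blit can look roughly like this (names such as `eyeFbo`, `eyeWidth`, and `winWidth` are placeholders for whatever handles and sizes your renderer actually uses; this needs an existing GL 4.5+ context):

```cpp
// Blit the left-eye framebuffer onto the window's default framebuffer.
glBlitNamedFramebuffer(
    eyeFbo,                       // source: one eye's framebuffer (placeholder name)
    0,                            // destination: default framebuffer (the window)
    0, 0, eyeWidth, eyeHeight,    // source rectangle
    0, 0, winWidth, winHeight,    // destination rectangle
    GL_COLOR_BUFFER_BIT,          // copy only the color attachment
    GL_LINEAR);                   // linear filter, since the sizes usually differ
```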

But I can't use this method for the layers, since the blit is just 2D on the screen, and I need to position each layer properly, for example placing the crosshair where I am looking in VR.

I can draw a quad at the position I want, but I don't know how to copy the contents of a framebuffer onto that quad. I thought glReadPixels might help, but that seemed rather inefficient.


Solution

  • So I wasn't able to blit one framebuffer directly into another, but at least for the OpenXR case there is a solution.

    I created the layers as usual, but instead of submitting them to OpenXR via xrEndFrame, I took the color texture from the layer's swapchain

    colorTexture = reinterpret_cast<const XrSwapchainImageOpenGLKHR*>(hudLayer.m_swapchainImages[hudLayer.swapchain_][swapchainImageIndex])->image;
    

    and rendered them onto quads within the first framebuffer. (I could reuse the layers' positions/orientations for the quads as well.)
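    Rendering the swapchain texture onto such a quad might look roughly like this. This is only a sketch: `quadProgram`, `quadVao`, `eyeFbo`, and the `mvp` matrix (built from the layer's pose) are assumed helpers, not part of the original code, and a simple textured-quad shader with `uTex`/`uMvp` uniforms is presumed to exist:

    ```cpp
    // Render the layer's swapchain color texture as a quad into the eye framebuffer.
    glBindFramebuffer(GL_FRAMEBUFFER, eyeFbo);          // target: eye framebuffer (placeholder)
    glEnable(GL_BLEND);                                 // HUD layers are usually alpha-blended
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glUseProgram(quadProgram);                          // assumed textured-quad shader
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, colorTexture);         // the swapchain image from above
    glUniform1i(glGetUniformLocation(quadProgram, "uTex"), 0);
    // Position the quad using the same pose the layer would have had in xrEndFrame.
    glUniformMatrix4fv(glGetUniformLocation(quadProgram, "uMvp"), 1, GL_FALSE, mvp);

    glBindVertexArray(quadVao);                         // assumed quad geometry
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glBindVertexArray(0);
    glDisable(GL_BLEND);
    ```

    The key point is that a framebuffer's color attachment (here, the OpenXR swapchain image) is just a texture, so it can be sampled like any other texture when drawing the quad, with no glReadPixels round-trip to the CPU.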