Background - I'm working on a project that's a bit like After Effects or Photoshop: you can manipulate various visual layers, which are ultimately composited together. I am not satisfied with the blend modes available directly in WebGL, so I have been implementing new blend modes using shaders and a multipass rendering scheme. Basically, I render each layer into a framebuffer, and then immediately use that framebuffer's texture as a source when rendering the next layer.
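The multipass scheme described above might be sketched as a "ping-pong" loop between two framebuffer/texture pairs, so each pass can sample the previous pass's output. This is only an illustration, not my actual code - the names `pairA`, `pairB`, and `drawLayer` are placeholders:

```javascript
// Ping-pong multipass sketch. `pairA`/`pairB` are assumed to be
// { fbo, tex } objects where `tex` is the color attachment of `fbo`,
// and `drawLayer` is a placeholder for the per-layer draw routine.
function renderLayers(gl, layers, pairA, pairB, drawLayer) {
  let src = pairA; // holds the composite so far
  let dst = pairB; // target for the current pass
  for (const layer of layers) {
    gl.bindFramebuffer(gl.FRAMEBUFFER, dst.fbo); // render into destination
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, src.tex);      // previous composite as input
    drawLayer(layer);                            // issue this layer's draw call
    [src, dst] = [dst, src];                     // swap roles for the next pass
  }
  return src.tex; // texture holding the final composite
}
```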
My problem - since fragment shaders run in parallel, the layers aren't rendered sequentially. The browser appears to render all the layers in parallel and composite them afterwards, so the source texture holding the previous layer's output is still empty when a given layer's shader executes. As a result, there is no way to composite in data from the prior layer. Binding and rendering do work in a broad sense: I can preserve the composite texture after everything is rendered and use it on a subsequent frame, but that's not what I'm looking for.
My question - how can I force WebGL to rasterize a given framebuffer so that I can immediately use it as a texture?
I think the only two APIs available to you are flush() and finish(). From what I remember, they don't work exactly as expected in a browser, but you can give them a try.
The WebGLRenderingContext.flush() method of the WebGL API empties different buffer commands, causing all commands to be executed as quickly as possible.
The WebGLRenderingContext.finish() method of the WebGL API blocks execution until all previously called commands are finished.
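Putting those two calls between passes might look like the sketch below. This is a guess at how it would slot into the asker's loop - `srcPair`, `dstPair`, and `drawLayer` are assumed names, not real API:

```javascript
// Sketch: render one layer, then insert an explicit sync point before
// sampling its output. `srcPair`/`dstPair` are assumed { fbo, tex } objects
// and `drawLayer` is a placeholder for the per-layer draw routine.
function renderPassWithSync(gl, srcPair, dstPair, layer, drawLayer) {
  gl.bindFramebuffer(gl.FRAMEBUFFER, srcPair.fbo);
  drawLayer(layer);            // render this layer into its framebuffer
  gl.flush();                  // submit the queued commands immediately
  gl.finish();                 // block until they have finished executing
  gl.bindFramebuffer(gl.FRAMEBUFFER, dstPair.fbo);
  gl.bindTexture(gl.TEXTURE_2D, srcPair.tex); // sample the finished result
}
```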
Edit: if those don't do what you expect, you can probably do a minimal readPixels call, which will block until the pipeline has completed.
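A minimal sketch of that trick: reading back even a single pixel from the framebuffer forces the driver to finish all rendering into it first, since readPixels is a blocking call. (The helper name is mine, not a WebGL API.)

```javascript
// Force completion of all rendering into `fbo` by reading one pixel back.
// readPixels cannot return until every prior draw into the framebuffer
// has finished, so this acts as a heavyweight sync point.
function forceComplete(gl, fbo) {
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  const px = new Uint8Array(4); // one RGBA pixel is enough
  gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, px);
  return px;
}
```

Note that this stalls the whole pipeline, so it is expensive if done once per layer per frame.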