Tags: javascript, safari, webgl, antialiasing, glreadpixels

antialias=false breaks readPixels in Safari on macOS


I use "readPixels()" for reading pixel color on webgl-canvas. I cannot read pixel when antialise set to false for getContext option. I can read with below line on Safari/MacOS, Chrome, Firefox and Safari/iOS.

const gl = canvas.getContext('webgl', { preserveDrawingBuffer: true, antialias: true });

But with the line below I cannot read pixels on Safari/macOS, although it still works on Chrome, Firefox, and Safari/iOS:

const gl = canvas.getContext('webgl', { preserveDrawingBuffer: true, antialias: false });

Edit: the console output is Uint8Array [0, 0, 0, 0], even though the pixel actually has color.
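
For context, the read itself is a call like the following (a minimal sketch; x and y stand in for the coordinates of the pixel I check):

const pixel = new Uint8Array(4);
gl.readPixels(x, y, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixel);
console.log(pixel); // Uint8Array [0, 0, 0, 0] with antialias=false on Safari/macOS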

Is this a problem in Safari/macOS, or do I need an extra option for Safari/macOS? Can I read pixel colors on Safari/macOS with antialias=false?

Sorry for my poor English, thank you.


Solution

  • Post a minimal repro in a snippet if you want help debugging.

    This works for me in Safari on both Mac and iPhone:

    function test(antialias) {
      // Fresh canvas and context for each run so the two
      // antialias settings don't share state.
      const gl = document.createElement('canvas').getContext('webgl', {
        preserveDrawingBuffer: true,
        antialias,
      });
      // Clear to a known non-black color.
      gl.clearColor(1, 0.5, 0.25, 1);
      gl.clear(gl.COLOR_BUFFER_BIT);
      // Read one pixel back; expect roughly [255, 128, 64, 255].
      const pixel = new Uint8Array(4);
      gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixel);
      // Show which attributes the browser actually granted.
      log(JSON.stringify(gl.getContextAttributes(), null, 2));
      log('pixel:', pixel);
    }
    
    // Append each log message to the page as a <pre> element.
    function log(...args) {
      const elem = document.createElement('pre');
      elem.textContent = args.join(' ');
      document.body.appendChild(elem);
    }
    
    test(false);
    test(true);
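
    If Safari/macOS still returns zeros with antialias: false in your real app, one workaround worth trying (an assumption on my part, not something this snippet proves necessary) is to render into a framebuffer you create yourself and read while it is bound, so readPixels never touches the possibly multisampled default framebuffer:

    // Hypothetical sketch: draw into an explicit texture-backed
    // framebuffer and read the pixel from it instead of the canvas.
    function readViaFramebuffer(gl, x, y, drawScene) {
      const tex = gl.createTexture();
      gl.bindTexture(gl.TEXTURE_2D, tex);
      gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA,
                    gl.drawingBufferWidth, gl.drawingBufferHeight,
                    0, gl.RGBA, gl.UNSIGNED_BYTE, null);
      const fb = gl.createFramebuffer();
      gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
      gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                              gl.TEXTURE_2D, tex, 0);
      drawScene(gl); // caller renders while the framebuffer is bound
      const pixel = new Uint8Array(4);
      gl.readPixels(x, y, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixel);
      gl.bindFramebuffer(gl.FRAMEBUFFER, null); // back to the canvas
      return pixel;
    }

    Texture attachments in WebGL1 are never multisampled, so this path is independent of the antialias context attribute.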