I tried to create a 16-bit unsigned integer (unsigned short) 2D texture in WebGL, but the parameters, although they look valid to me, only lead to error 1282. Here's what I've tried:
let tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(
  gl.TEXTURE_2D,    // target
  0,                // mip level
  gl.RGBA16UI,      // internal format
  8,                // width
  8,                // height
  0,                // border
  gl.RGBA_INTEGER,  // format
  gl.UNSIGNED_INT,  // type
  null              // no data, just allocate
);
console.log(gl.getError()); // prints 1282
I've tried this in an empty p5.js sketch (p5.js is only used to set up WebGL and for its other features; I'm fairly sure the error comes from these parameters rather than from any earlier code).
I've looked at the official documentation, which shows that all of these are valid enums in their corresponding positions, except for gl.RGBA_INTEGER (I had tried gl.RGBA, but that also just gave error 1282). gl.RGBA_INTEGER comes from a source that I and many others have found to be generally reliable: https://webgl2fundamentals.org/webgl/lessons/webgl-data-textures.html and https://webgl2fundamentals.org/webgl/lessons/webgl-readpixels.html. Their table has the row

RGBA16UI   yes   RGBA_INTEGER, UNSIGNED_INT   RGBA_INTEGER, UNSIGNED_SHORT

And yes, I've tried both gl.UNSIGNED_INT and gl.UNSIGNED_SHORT, each with both gl.RGBA_INTEGER and gl.RGBA.
I've also tried this in reasonably recent versions of both Chrome and Firefox, and checked that the format isn't locked behind something like an extension. These are plain 16-bit unsigned integer (ushort) textures, not floating-point textures, and they appear to be supported; I haven't found anything saying they need an extension.
Basically, what parameters or changes do I need in order to fix error 1282?
The following combination, taken from the table on webgl2fundamentals, works fine for me in Chrome under Linux. RGBA16UI stores 16 bits per channel, so texImage2D expects UNSIGNED_SHORT data; RGBA_INTEGER/UNSIGNED_INT is not a valid format/type combination for that internal format, which is why your call returns 1282 (INVALID_OPERATION).
const gl = canvas.getContext('webgl2');
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(
  gl.TEXTURE_2D,      // target
  0,                  // mip level
  gl.RGBA16UI,        // internal format
  8,                  // width
  8,                  // height
  0,                  // border
  gl.RGBA_INTEGER,    // format
  gl.UNSIGNED_SHORT,  // type: 16-bit channels need UNSIGNED_SHORT
  null                // no data, just allocate
);
console.log(gl.getError()); // 0 (NO_ERROR)
<canvas id="canvas"></canvas>
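If it helps, here is a minimal follow-up sketch (continuing from the snippet above, with tex still bound) showing how you might upload actual 16-bit data and set the sampler state that integer textures require; the Uint16Array contents here are just placeholder values.

// Upload real data: 8x8 texels, 4 channels of 16-bit unsigned values each.
const data = new Uint16Array(8 * 8 * 4);
data.fill(65535); // placeholder: fill every channel with the maximum value

gl.texImage2D(
  gl.TEXTURE_2D,
  0,
  gl.RGBA16UI,
  8,
  8,
  0,
  gl.RGBA_INTEGER,
  gl.UNSIGNED_SHORT,
  data
);

// Integer textures are not filterable, so NEAREST filtering is required
// (the default MIN_FILTER would leave the texture incomplete).
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

console.log(gl.getError()); // 0 (NO_ERROR)

In the fragment shader (GLSL ES 3.00) an unsigned integer texture is sampled through a usampler2D rather than a sampler2D, e.g. uniform highp usampler2D u_tex; and uvec4 texel = texture(u_tex, uv);.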