glsl, webgl

WebGL: How to Use Integer Attributes in GLSL


Is it possible to use integer attributes in WebGL vertex shaders? My vertex data contains integers and I want to define the vertex attributes like this:

attribute vec3 POSITION;
attribute int INDEX[8];
attribute float WEIGHT[8];

If this is possible, how would I pass the mixed vertex data to the shader from JavaScript using gl.bufferData?

If this is not possible, what is the best way to achieve the same results (pass indices from CPU to shader and use them as integers in the shader)?


Solution

  • WebGL does not support integer attributes. Apart from some exceptions listed in the spec (https://www.khronos.org/registry/webgl/specs/1.0/), WebGL uses the same GLSL feature set as OpenGL ES 2.0.

    The spec document for the OpenGL ES Shading Language 1.00 (which is the GLSL version used for ES 2.0) says on page 30, in section "4.3.3 Attribute":

    The attribute qualifier can be used only with the data types float, vec2, vec3, vec4, mat2, mat3, and mat4. Attribute variables cannot be declared as arrays or structures.

    The best you can do is probably to use floats that have no fractional part. The range of values that can be represented exactly is more limited than for an integer of the same storage size: for example, a 16-bit float (half float) has 11 bits of precision and can exactly represent the integers from 0 to 2048. In the shader, converting such a float back to an integer is just a cast, as the sketches below show.
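    As an illustration, here is a minimal sketch of the shader side. Since attribute arrays are not allowed either, eight indices and weights would have to be split across several vec4 attributes (two vec4s per set); the sketch uses four of each to stay short, and the bones palette and viewProjection uniform are assumptions, not part of the question:

    attribute vec3 POSITION;
    attribute vec4 INDEX;   // four bone indices, stored as whole-number floats
    attribute vec4 WEIGHT;  // four matching weights

    uniform mat4 bones[32];      // hypothetical bone palette
    uniform mat4 viewProjection; // hypothetical camera matrix

    void main() {
        // Cast each float index back to an int before indexing the array.
        mat4 skin = bones[int(INDEX.x)] * WEIGHT.x
                  + bones[int(INDEX.y)] * WEIGHT.y
                  + bones[int(INDEX.z)] * WEIGHT.z
                  + bones[int(INDEX.w)] * WEIGHT.w;
        gl_Position = viewProjection * skin * vec4(POSITION, 1.0);
    }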
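    On the JavaScript side, the mixed per-vertex data can be interleaved into a single Float32Array and uploaded with one gl.bufferData call; gl and the linked program are assumed to exist already:

    // 3 position + 4 index + 4 weight floats per vertex = 11 floats = 44 bytes.
    var data = new Float32Array([
    //   x    y    z    i0 i1 i2 i3   w0   w1   w2   w3
        0.0, 0.0, 0.0,  0, 1, 0, 0,  0.5, 0.5, 0.0, 0.0
        // ...one row like this per vertex
    ]);

    var buffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    gl.bufferData(gl.ARRAY_BUFFER, data, gl.STATIC_DRAW);

    var stride = 11 * 4; // bytes from the start of one vertex to the next
    var posLoc = gl.getAttribLocation(program, 'POSITION');
    var idxLoc = gl.getAttribLocation(program, 'INDEX');
    var wgtLoc = gl.getAttribLocation(program, 'WEIGHT');

    gl.enableVertexAttribArray(posLoc);
    gl.vertexAttribPointer(posLoc, 3, gl.FLOAT, false, stride, 0);
    gl.enableVertexAttribArray(idxLoc);
    gl.vertexAttribPointer(idxLoc, 4, gl.FLOAT, false, stride, 3 * 4);
    gl.enableVertexAttribArray(wgtLoc);
    gl.vertexAttribPointer(wgtLoc, 4, gl.FLOAT, false, stride, 7 * 4);

    If buffer size matters, the indices could instead be stored as unsigned bytes and described with gl.UNSIGNED_BYTE (with normalized set to false); WebGL then converts them to the corresponding float values before the shader sees them.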