WebGPU Text Rendering

Tags: javascript, text-rendering, webgpu


How can I render text using pure WebGPU without third party libraries if possible?

I'm still learning WebGPU and don't know much yet. I did find the solution below, but it uses third-party libraries:

/* Canvas initialization */

const canvas = document.getElementById("canvas") as HTMLCanvasElement;

/* WebGPU initialization */

const adapter: GPUAdapter = await navigator.gpu.requestAdapter() as GPUAdapter;
const device: GPUDevice = await adapter.requestDevice() as GPUDevice;

/* Font initialization */

const fontParser = new FontParser(device, "fonts/RobotoRegular.ttf");

/* Text block to render */

const textBlock = new TextBlock(device, "Hello World", fontParser, {
    color: [0.6, 0.5, 0.6, 1.0],
    spacing: 2,
    width: 2,
    size: 4,
    isWinding: false,
});

/* 1) Initialization */

const renderer = new Renderer(device, canvas, textBlock,
    projectionMatrix, viewMatrix, /* color */ [0.2, 0.0, 0.0, 1.0]);

/* 2) Preparing data */

const perFrameData = renderer.prepare();

/* 3) Rendering */

renderer.render(perFrameData);

I got it from here: https://is.muni.cz/th/nl4qv/Bachelor_s_Thesis.pdf


Solution

  • WebGPU doesn't render text directly.

    Common ways to render text with the GPU, regardless of API, are

    1. Put the text in a texture, then render that texture as a single quad (a minimal sketch appears after these notes).

    2. Put glyphs in a texture atlas (a texture with lots of different images in it). Render quads, setting the texture coordinates to select the glyphs you want to see. How you make the glyph atlas is up to you: you can pre-create it offline, create it at runtime with the 2D canvas API, or write font-rendering code in JavaScript that reads a font file and renders out the glyphs.

      Part of this depends on the use case. Systems like the OS and the browser need to be able to display all of Unicode in multiple fonts and sizes, as well as emoji, so they usually keep a cache of textures, render new glyphs into them as needed, and discard unused glyphs.

    3. Generate data for SDFs (Signed Distance Fields) or similar, usually from a font file, and use that data to procedurally generate the output of a fragment shader for each glyph quad (example).

    3 is way beyond what can be explained easily and is also relatively uncommon. 2 and 1 are the most common, in that order, and 2 is the technique the browser uses to draw the text on this page, at least as of 2023.

    Also, be aware that rendering text can be very hard if you want to be inclusive and support all languages. If you only care about ASCII or some other small subset, then it's relatively easy.
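
    For contrast, approach 1 can be as small as drawing the whole string into a 2D canvas and uploading that canvas as one texture. A minimal sketch, assuming you already have a device (the makeTextTexture name and the font choices are just placeholders):

    // Approach 1 (sketch): rasterize a whole string with the 2D canvas API,
    // then upload the canvas as a single texture to draw on one quad.
    function makeTextTexture(device, text) {
      const ctx = document.createElement('canvas').getContext('2d');
      ctx.font = '64px sans-serif';
      ctx.canvas.width = Math.ceil(ctx.measureText(text).width);
      ctx.canvas.height = 80;          // room for one 64px line
      ctx.font = '64px sans-serif';    // resizing a canvas resets its 2D state
      ctx.textBaseline = 'middle';
      ctx.fillStyle = 'white';
      ctx.fillText(text, 0, ctx.canvas.height / 2);

      const texture = device.createTexture({
        format: 'rgba8unorm',
        size: [ctx.canvas.width, ctx.canvas.height],
        usage: GPUTextureUsage.TEXTURE_BINDING |
               GPUTextureUsage.COPY_DST |
               GPUTextureUsage.RENDER_ATTACHMENT,
      });
      device.queue.copyExternalImageToTexture(
        { source: ctx.canvas },
        { texture, premultipliedAlpha: true },
        [ctx.canvas.width, ctx.canvas.height],
      );
      return texture;
    }

    You'd then draw a single quad sampling that texture. The runnable snippet below demonstrates approach 2, the glyph atlas: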

    <canvas></canvas>
      
    <script type="module">
    // WebGPU Simple Textured Quad - Import Canvas
    // from https://webgpufundamentals.org/webgpu/webgpu-simple-textured-quad-import-canvas.html
    
    
    import {mat4} from 'https://webgpufundamentals.org/3rdparty/wgpu-matrix.module.js';
    
    const glyphWidth = 32;
    const glyphHeight = 40;
    const glyphsAcrossTexture = 16;
    function generateGlyphTextureAtlas() {
      const ctx = document.createElement('canvas').getContext('2d');
      ctx.canvas.width = 512;
      ctx.canvas.height = 256;
    
      let x = 0;
      let y = 0;
      ctx.font = '32px monospace';
      ctx.textBaseline = 'middle';
      ctx.textAlign = 'center';
      ctx.fillStyle = 'white';
      for (let c = 33; c < 128; ++c) {
        ctx.fillText(String.fromCodePoint(c), x + glyphWidth / 2, y + glyphHeight / 2);
        x += glyphWidth;
        if (x >= ctx.canvas.width) {
          x = 0;
          y += glyphHeight;
        }
      }
    
      return ctx.canvas;
    }
    
    async function main() {
      const adapter = await navigator.gpu?.requestAdapter();
      const device = await adapter?.requestDevice();
      if (!device) {
        fail('need a browser that supports WebGPU');
        return;
      }
    
      // Get a WebGPU context from the canvas and configure it
      const canvas = document.querySelector('canvas');
      const context = canvas.getContext('webgpu');
      const presentationFormat = navigator.gpu.getPreferredCanvasFormat();
      context.configure({
        device,
        format: presentationFormat,
      });
    
      const module = device.createShaderModule({
        label: 'our hardcoded textured quad shaders',
        code: `
          struct VSInput {
            @location(0) position: vec4f,
            @location(1) texcoord: vec2f,
            @location(2) color: vec4f,
          };
    
          struct VSOutput {
            @builtin(position) position: vec4f,
            @location(0) texcoord: vec2f,
            @location(1) color: vec4f,
          };
    
          struct Uniforms {
            matrix: mat4x4f,
          };
    
          @group(0) @binding(2) var<uniform> uni: Uniforms;
    
          @vertex fn vs(vin: VSInput) -> VSOutput {
            var vsOutput: VSOutput;
            vsOutput.position = uni.matrix * vin.position;
            vsOutput.texcoord = vin.texcoord;
            vsOutput.color = vin.color;
            return vsOutput;
          }
    
          @group(0) @binding(0) var ourSampler: sampler;
          @group(0) @binding(1) var ourTexture: texture_2d<f32>;
    
          @fragment fn fs(fsInput: VSOutput) -> @location(0) vec4f {
            return textureSample(ourTexture, ourSampler, fsInput.texcoord) * fsInput.color;
          }
        `,
      });
    
      const glyphCanvas = generateGlyphTextureAtlas();
      // so we can see it
      document.body.appendChild(glyphCanvas);
      glyphCanvas.style.backgroundColor = '#222';
    
      const maxGlyphs = 100;
      const floatsPerVertex = 2 + 2 + 4; // 2(pos) + 2(texcoord) + 4(color)
      const vertexSize = floatsPerVertex * 4; // 4 bytes each float
      const vertsPerGlyph = 6;
      const vertexBufferSize = maxGlyphs * vertsPerGlyph * vertexSize;
      const vertexBuffer = device.createBuffer({
        label: 'vertices',
        size: vertexBufferSize,
        usage: GPUBufferUsage.VERTEX | GPUBufferUsage.COPY_DST,
      });
      const indexBuffer = device.createBuffer({
        label: 'indices',
        size: maxGlyphs * vertsPerGlyph * 4,
        usage: GPUBufferUsage.INDEX | GPUBufferUsage.COPY_DST,
      });
      // pre fill index buffer with quad indices
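      // each glyph is one quad: 4 vertices, drawn as triangles (0,1,2) and (2,1,3)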
      {
        const indices = [];
        for (let i = 0; i < maxGlyphs; ++i) {
          const ndx = i * 4;
          indices.push(ndx, ndx + 1, ndx + 2, ndx + 2, ndx + 1, ndx + 3);
        }
        device.queue.writeBuffer(indexBuffer, 0, new Uint32Array(indices));
      }
    
      function generateGlyphVerticesForText(s, colors = [[1, 1, 1, 1]]) {
        const vertexData = new Float32Array(maxGlyphs * floatsPerVertex * vertsPerGlyph);
        const glyphUVWidth = glyphWidth / glyphCanvas.width;
        const glyphUVHeight = glyphHeight / glyphCanvas.height;
        let offset = 0;
        let x0 = 0;
        let x1 = 1;
        let y0 = 0;
        let y1 = 1;
        let width = 0;
    
        const addVertex = (x, y, u, v, r, g, b, a) => {
          vertexData[offset++] = x;
          vertexData[offset++] = y;
          vertexData[offset++] = u;
          vertexData[offset++] = v;
          vertexData[offset++] = r;
          vertexData[offset++] = g;
          vertexData[offset++] = b;
          vertexData[offset++] = a;
        };
    
        const spacing = 0.55;
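        // each quad is 1 unit wide, so an advance of 0.55 makes neighboring quads overlap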
        let colorNdx = 0;
        for (let i = 0; i < s.length; ++i) {
          // convert char code to texcoords for glyph texture
          const c = s.charCodeAt(i);
          if (c >= 33) {
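            // the atlas is a row-major grid, 16 glyphs across, starting at '!' (code 33)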
            const cNdx = c - 33;
            const glyphX = cNdx % glyphsAcrossTexture;
            const glyphY = Math.floor(cNdx / glyphsAcrossTexture);
            const u0 = (glyphX * glyphWidth) / glyphCanvas.width;
            const v1 = (glyphY * glyphHeight) / glyphCanvas.height;
            const u1 = u0 + glyphUVWidth;
            const v0 = v1 + glyphUVHeight;
            width = Math.max(x1, width);
    
            addVertex(x0, y0, u0, v0, ...colors[colorNdx]);
            addVertex(x1, y0, u1, v0, ...colors[colorNdx]);
            addVertex(x0, y1, u0, v1, ...colors[colorNdx]);
            addVertex(x1, y1, u1, v1, ...colors[colorNdx]);
          } else {
            colorNdx  = (colorNdx + 1) % colors.length;
            if (c === 10) {
              x0 = 0;
              x1 = 1;
              y0 = y0 - 1;
              y1 = y0 + 1;
              continue;
            }
          }
          x0 = x0 + spacing;
          x1 = x0 + 1;
        }
    
        return {
          vertexData,
          numGlyphs: offset / floatsPerVertex / 4,  // 4 vertices per glyph
          width,
          height: y1,
        };
      }
    
      const { vertexData, numGlyphs, width, height } = generateGlyphVerticesForText(
        'Hello\nworld!\nText in\nWebGPU!', [
           [1, 1, 0, 1],
           [0, 1, 1, 1],
           [1, 0, 1, 1],
           [1, 0, 0, 1],
           [0, .5, 1, 1],
         ]);
      device.queue.writeBuffer(vertexBuffer, 0, vertexData);
    
      const pipeline = device.createRenderPipeline({
        label: 'hardcoded textured quad pipeline',
        layout: 'auto',
        vertex: {
          module,
          entryPoint: 'vs',
          buffers: [
            {
              arrayStride: vertexSize,
              attributes: [
                {shaderLocation: 0, offset:  0, format: 'float32x2'},  // position
                {shaderLocation: 1, offset:  8, format: 'float32x2'},  // texcoord
                {shaderLocation: 2, offset: 16, format: 'float32x4'},  // color
              ],
            },
          ],
        },
        fragment: {
          module,
          entryPoint: 'fs',
          targets: [
            {
              format: presentationFormat,
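              // premultiplied-alpha blending (src factor 'one'): the atlas is
              // uploaded with premultipliedAlpha: true in copySourceToTexture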
              blend: {
                color: {
                  srcFactor: 'one',
                  dstFactor: 'one-minus-src-alpha',
                  operation: 'add',
                },
                alpha: {
                  srcFactor: 'one',
                  dstFactor: 'one-minus-src-alpha',
                  operation: 'add',
                },
              },
            },
          ],
        },
      });
    
      function copySourceToTexture(device, texture, source, {flipY} = {}) {
        device.queue.copyExternalImageToTexture(
          { source, flipY, },
          { texture, premultipliedAlpha: true },
          { width: source.width, height: source.height },
        );
      }
    
      function createTextureFromSource(device, source, options = {}) {
        const texture = device.createTexture({
          format: 'rgba8unorm',
          size: [source.width, source.height],
          usage: GPUTextureUsage.TEXTURE_BINDING |
                 GPUTextureUsage.COPY_DST |
                 GPUTextureUsage.RENDER_ATTACHMENT,
        });
        copySourceToTexture(device, texture, source, options);
        return texture;
      }
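
      // note: these helpers ignore the {mips: true} option passed below;
      // no mip levels are created, which contributes to the blocky result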
    
      const texture = createTextureFromSource(device, glyphCanvas, {mips: true});
      const sampler = device.createSampler({
         minFilter: 'linear',
         magFilter: 'linear',
      });
    
      // create a buffer for the uniform values
      const uniformBufferSize =
        16 * 4; // matrix is 16 32bit floats (4bytes each)
      const uniformBuffer = device.createBuffer({
        label: 'uniforms for quad',
        size: uniformBufferSize,
        usage: GPUBufferUsage.UNIFORM | GPUBufferUsage.COPY_DST,
      });
    
      // create a typedarray to hold the values for the uniforms in JavaScript
      const kMatrixOffset = 0;
      const uniformValues = new Float32Array(uniformBufferSize / 4);
      const matrix = uniformValues.subarray(kMatrixOffset, 16);
    
      const bindGroup = device.createBindGroup({
        layout: pipeline.getBindGroupLayout(0),
        entries: [
          { binding: 0, resource: sampler },
          { binding: 1, resource: texture.createView() },
          { binding: 2, resource: { buffer: uniformBuffer }},
        ],
      });
    
      const renderPassDescriptor = {
        label: 'our basic canvas renderPass',
        colorAttachments: [
          {
            // view: <- to be filled out when we render
            clearValue: [0.3, 0.3, 0.3, 1],
            loadOp: 'clear',
            storeOp: 'store',
          },
        ],
      };
    
      function render(time) {
        time *= 0.001;
    
        const fov = 60 * Math.PI / 180;  // 60 degrees in radians
        const aspect = canvas.clientWidth / canvas.clientHeight;
        const zNear  = 0.001;
        const zFar   = 50;
        const projectionMatrix = mat4.perspective(fov, aspect, zNear, zFar);
    
        const cameraPosition = [0, 0, 5];
        const up = [0, 1, 0];
        const target = [0, 0, 0];
        const viewMatrix = mat4.lookAt(cameraPosition, target, up);
        const viewProjectionMatrix = mat4.multiply(projectionMatrix, viewMatrix);
    
        // Get the current texture from the canvas context and
        // set it as the texture to render to.
        renderPassDescriptor.colorAttachments[0].view =
            context.getCurrentTexture().createView();
    
        const encoder = device.createCommandEncoder({
          label: 'render quad encoder',
        });
        const pass = encoder.beginRenderPass(renderPassDescriptor);
        pass.setPipeline(pipeline);
    
        mat4.rotateY(viewProjectionMatrix, time, matrix);
        mat4.translate(matrix, [-width / 2, -height / 2, 0], matrix);
    
        // copy the values from JavaScript to the GPU
        device.queue.writeBuffer(uniformBuffer, 0, uniformValues);
    
        pass.setBindGroup(0, bindGroup);
        pass.setVertexBuffer(0, vertexBuffer);
        pass.setIndexBuffer(indexBuffer, 'uint32');
        pass.drawIndexed(numGlyphs * 6);
    
        pass.end();
    
        const commandBuffer = encoder.finish();
        device.queue.submit([commandBuffer]);
    
        requestAnimationFrame(render);
      }
      requestAnimationFrame(render);
    }
    
    function fail(msg) {
      // eslint-disable-next-line no-alert
      alert(msg);
    }
    
    main();
      
    </script>

    It's low-res and blocky because:

    1. We made small glyphs.
    2. We didn't generate mipmaps (see this article), so even the linear filtering set on the sampler can't stop the glyphs from aliasing when they're minified.
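
    Fixing the mipmap part means allocating the texture with a mip chain and then filling each level. Allocating is the easy part; here's a sketch (source stands for the glyph canvas, and filling levels above 0 still takes a series of render passes, which the mipmap article covers):

    // allocate a full mip chain for the atlas texture;
    // levels > 0 still have to be rendered from the level above
    const mipLevelCount = 1 + Math.floor(Math.log2(Math.max(source.width, source.height)));
    const texture = device.createTexture({
      format: 'rgba8unorm',
      mipLevelCount,
      size: [source.width, source.height],
      usage: GPUTextureUsage.TEXTURE_BINDING |
             GPUTextureUsage.COPY_DST |
             GPUTextureUsage.RENDER_ATTACHMENT,
    });

    You'd also add mipmapFilter: 'linear' to the sampler so adjacent levels get blended.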

    There are more things you'll need to handle, including all the stuff mentioned in the article linked above about how text is hard.

    Note that, arguably, if you can, you should use HTML to render text in the browser; it's the easiest way. In other words, consider whether you actually need the text in 3D. For example, text in games often lives in the stats around the edges or in the chat area of an online game, and all of that is much easier to do in HTML. As an example, all the text in this game could have been HTML, using techniques like the ones on this page, because all of it sits on top of the 3D.
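
    As a sketch, overlaying HTML text is just absolutely positioned elements stacked above the canvas (the label contents here are made up):

    // put an HTML label above the WebGPU canvas instead of rendering it in 3D
    const container = canvas.parentElement;
    container.style.position = 'relative';   // make this the positioning context
    const label = document.createElement('div');
    label.textContent = 'Score: 42';
    label.style.cssText = 'position: absolute; left: 1em; top: 1em; color: white';
    container.appendChild(label);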

    Disclosure: I contributed to the site linked for the SDF example above, to the site that hosts the mipmaps article, and to three.js, so according to Stack Overflow, putting those links here counts as self-promotion.