Tags: graphics, 3d, 2d, smalltalk, bitblit

Doing something like bitblit on the GPU


I was watching some talks by Dan Ingalls in which he described how they were able to do nearly real-time 2D graphics back in the 1970s using a technique called bitblit.

This was all done in software, drawing directly to the monitor. Is there any reason that techniques like this can't be used on modern GPU hardware?

Is this the way it's done on modern GPUs?

I have a high-level understanding of the 3D rendering pipeline that's used even for 2D graphics, but couldn't some of these old techniques be given a large boost with all that power on a GPU?
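For context, the core of a bitblit is just a rectangular block copy through a framebuffer in memory, optionally combined with a bitwise raster operation. The following is a minimal sketch in C of what such a software blit looks like (my illustration, not from the talks); it uses 32-bit pixels for simplicity, whereas the original hardware of that era worked on 1-bit-deep bitmaps, and all names are illustrative.

    #include <stddef.h>
    #include <stdint.h>

    /* Copy a w x h rectangle of 32-bit pixels from src to dst.
     * Both buffers are row-major; the pitch arguments give the
     * number of pixels per row in each buffer. */
    void bitblt(uint32_t *dst, size_t dst_pitch,
                const uint32_t *src, size_t src_pitch,
                size_t w, size_t h)
    {
        for (size_t y = 0; y < h; ++y) {
            for (size_t x = 0; x < w; ++x) {
                /* A straight copy; classic bitblit also offered raster
                 * ops here, e.g. dst ^= src for XOR-drawn cursors. */
                dst[y * dst_pitch + x] = src[y * src_pitch + x];
            }
        }
    }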


Solution

  • Bitblitting circuitry has been a vital component of any kind of graphics chip (be it a dumb framebuffer interface or a GPU with computational capabilities) for the past 20 years.

    One can find blitting operations for inter-framebuffer transfers in even the most modern 3D rendering APIs.

    OpenGL has glBlitFramebuffer, Vulkan has vkCmdBlitImage (which operates on VkImageBlit regions), Metal has the blit command encoder (MTLBlitCommandEncoder), and Direct3D has an equivalent too, somewhere in its surface interfaces, though I don't know the exact name for it. (See the OpenGL sketch after this list.)

    "but couldn't some of these old techniques be given a large boost with all that power on a GPU?"

    These techniques are implemented as hardwired circuitry these days; there is essentially no room left to make them more efficient.
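As a concrete illustration of the framebuffer blit mentioned above, here is a minimal OpenGL sketch (my example, not part of the original answer). It assumes srcFbo and dstFbo are framebuffer objects that have already been created with compatible color attachments, and that an OpenGL 3.0+ context and function loader are in place.

    #include <glad/glad.h>  /* or any other OpenGL function loader */

    /* Copy the full w x h color rectangle from one framebuffer object
     * to another on the GPU -- the modern equivalent of a 2D bitblit. */
    void blit_framebuffer(GLuint srcFbo, GLuint dstFbo, int w, int h)
    {
        glBindFramebuffer(GL_READ_FRAMEBUFFER, srcFbo);
        glBindFramebuffer(GL_DRAW_FRAMEBUFFER, dstFbo);

        /* Identical source/destination rectangles with GL_NEAREST make
         * this a straight pixel copy of the color attachment. */
        glBlitFramebuffer(0, 0, w, h,
                          0, 0, w, h,
                          GL_COLOR_BUFFER_BIT,
                          GL_NEAREST);
    }

The Vulkan and Metal calls named above do the same job in their respective APIs.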