I'm a noob at WebGL. I've read in several posts about ND-buffers and G-buffers as if they were a strategic choice for WebGL development.
How are ND-buffers and G-buffers related to rendering pipelines? Are ND-buffers used only in forward rendering and G-buffers only in deferred rendering?
A JavaScript code example showing how to implement both would help me understand the difference.
G-Buffers are just a set of buffers generally used in deferred rendering.
Wikipedia gives a good example of the kind of data often found in a g-buffer:

- Diffuse color info
- World space or screen space normals
- Depth buffer / Z-buffer

The combination of those 3 buffers is referred to as a "g-buffer".
After generating those 3 buffers from geometry and material data, you run a shader that combines them to produce the final image.
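To make that concrete, here's a minimal sketch of the setup, assuming WebGL2 (which has multiple render targets built in via `gl.drawBuffers`; in WebGL1 you'd need the `WEBGL_draw_buffers` extension). Names like `gl`, `width`, and `height` are assumed to come from your own boilerplate, and `u_diffuse` is a hypothetical material uniform:

```js
// Create a g-buffer: one framebuffer with diffuse + normal color
// attachments and a depth texture. WebGL2 assumed throughout.
const gbuffer = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, gbuffer);

// Helper: create a texture sized to the canvas to use as a render target.
function createTarget(internalFormat, format, type) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, internalFormat, width, height, 0,
                format, type, null);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  return tex;
}

// RGBA8 for both color targets; normals get encoded into [0, 1] so no
// float-renderable extension is needed.
const diffuseTex = createTarget(gl.RGBA8, gl.RGBA, gl.UNSIGNED_BYTE);
const normalTex  = createTarget(gl.RGBA8, gl.RGBA, gl.UNSIGNED_BYTE);
const depthTex   = createTarget(gl.DEPTH_COMPONENT24, gl.DEPTH_COMPONENT,
                                gl.UNSIGNED_INT);

gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                        gl.TEXTURE_2D, diffuseTex, 0);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT1,
                        gl.TEXTURE_2D, normalTex, 0);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT,
                        gl.TEXTURE_2D, depthTex, 0);

// The geometry pass writes to both color attachments at once.
gl.drawBuffers([gl.COLOR_ATTACHMENT0, gl.COLOR_ATTACHMENT1]);

// Geometry-pass fragment shader (GLSL ES 3.00): one output per attachment.
const geometryFS = `#version 300 es
precision highp float;
in vec3 v_normal;                          // passed from the vertex shader
uniform vec4 u_diffuse;                    // hypothetical material color
layout(location = 0) out vec4 outDiffuse;
layout(location = 1) out vec4 outNormal;
void main() {
  outDiffuse = u_diffuse;
  // Encode the [-1, 1] normal into the [0, 1] range of an RGBA8 target.
  outNormal = vec4(normalize(v_normal) * 0.5 + 0.5, 1.0);
}`;
```

A second pass then binds `diffuseTex`, `normalTex`, and `depthTex` as ordinary textures and draws a full-screen quad to do the lighting.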
What actually goes into a g-buffer is up to the particular engine/renderer. For example, one of Unity3D's deferred renderers contains diffuse color, occlusion, specular color, roughness, normal, depth, stencil, emission, lighting, lightmap, and reflection probes.
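Whatever the layout, the combine pass just samples those buffers back. A sketch of a minimal combine-pass fragment shader for the three-buffer layout above, drawn over a full-screen quad (`u_lightDir` and `v_uv` are hypothetical names from an assumed quad pass):

```js
// Combine (lighting) pass: read the g-buffer textures and shade with a
// single directional light as a minimal example.
const combineFS = `#version 300 es
precision highp float;
uniform sampler2D u_diffuse;
uniform sampler2D u_normal;
uniform sampler2D u_depth;   // available for position reconstruction, fog, etc.
uniform vec3 u_lightDir;     // hypothetical: normalized, same space as the normals
in vec2 v_uv;
out vec4 outColor;
void main() {
  vec4 diffuse = texture(u_diffuse, v_uv);
  // Decode the [0, 1] encoded normal back to [-1, 1].
  vec3 normal = normalize(texture(u_normal, v_uv).xyz * 2.0 - 1.0);
  float light = max(dot(normal, u_lightDir), 0.0);
  outColor = vec4(diffuse.rgb * light, diffuse.a);
}`;
```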
"ND buffer" just stands for "normal + depth buffer", which makes it a subset of what's usually found in a typical g-buffer.
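So an ND buffer setup is the same mechanism with fewer attachments; a sketch, reusing the hypothetical `createTarget` helper from the first snippet:

```js
// An ND buffer is just the normal + depth subset: one color attachment
// for encoded normals plus a depth texture, no material targets.
const ndBuffer = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, ndBuffer);

const ndNormalTex = createTarget(gl.RGBA8, gl.RGBA, gl.UNSIGNED_BYTE);
const ndDepthTex  = createTarget(gl.DEPTH_COMPONENT24, gl.DEPTH_COMPONENT,
                                 gl.UNSIGNED_INT);

gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                        gl.TEXTURE_2D, ndNormalTex, 0);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT,
                        gl.TEXTURE_2D, ndDepthTex, 0);
```

A pre-pass like this isn't tied to deferred rendering; forward renderers commonly produce one too, for effects such as SSAO or screen-space outlines, and then sample `ndNormalTex`/`ndDepthTex` in a later pass.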
As for a full sample, that's arguably too big for SO, but there's an article about deferred rendering in WebGL on MDN.