Most resources on the internet claim that after performing calculations in High Dynamic Range (HDR), where values can be arbitrarily large (within the limits of the type representation), the result must be transformed to Standard Dynamic Range (SDR), where values lie in the range [0, 1], because that is the range the display device can represent.
Usually a tone-mapper is used, such as an ACES-style filmic tone-mapper, which is especially popular in PBR rendering. The reasoning is that the values would get clamped to [0, 1] anyway, and anything above 1 would simply be clipped, producing results that are neither realistic nor pleasing... or would they?
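For illustration, this is roughly what such a tone-mapper looks like (a minimal sketch of the widely used ACES filmic fit by Krzysztof Narkowicz, applied per channel; not taken from any particular engine):

```cpp
#include <algorithm>

// ACES filmic approximation (Narkowicz fit): compresses arbitrarily large HDR values
// into [0, 1] so the result survives the display's clamp.
float aces_filmic_approx(float x)
{
    const float a = 2.51f, b = 0.03f, c = 2.43f, d = 0.59f, e = 0.14f;
    float mapped = (x * (a * x + b)) / (x * (c * x + d) + e);
    return std::clamp(mapped, 0.0f, 1.0f); // anything still above 1 ends up clipped anyway
}
```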
I recently stumbled across the following NVIDIA article: https://developer.nvidia.com/rendering-game-hdr-display. It seems that by using NVIDIA- and Windows-specific APIs it is possible to output HDR values directly and take advantage of monitors that can display a much wider range of color and brightness.
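From what I can tell, on the Windows side this boils down to something like the following DXGI calls (a minimal sketch based on my reading, not code from the article; it assumes a flip-model swap chain with an R10G10B10A2_UNORM back buffer and omits error handling):

```cpp
#include <dxgi1_6.h>

// Sketch only: ask DXGI to interpret the back buffer as HDR10 (BT.2020 primaries + PQ transfer).
void try_enable_hdr10(IDXGISwapChain3* swapChain)
{
    const DXGI_COLOR_SPACE_TYPE hdr10 = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P709;

    UINT support = 0;
    swapChain->CheckColorSpaceSupport(hdr10, &support);
    if (support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT)
    {
        // From here on, the back buffer carries PQ-encoded values instead of [0, 1] SDR.
        swapChain->SetColorSpace1(hdr10);
    }
}
```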
I am wondering why this isn't really talked about. As far as I know, even large game engines such as Unity and Unreal don't use this technique and instead use a tone-mapper to output an SDR image.
Why is this the case? Is HDR output still in the phase of tech demos? Are there no consumer-grade displays capable of displaying it (despite many being advertised as "HDR monitors")? Or did I understand everything completely wrong and there is no such thing as HDR output and it's all just an April Fools' joke?
Edit: I would also appreciate some information about a cross-platform approach to using HDR output, if possible, especially in OpenGL.
Is HDR output still in the phase of tech demos?
What?? No, it is not. UHD Blu-rays and AAA games are natively HDR now, including proper 10-bit textures with the PQ transfer function for games.
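For reference, the PQ transfer mentioned here is SMPTE ST 2084; encoding absolute luminance into the 10-bit signal is just this curve (a minimal sketch, not taken from any shipping title):

```cpp
#include <cmath>

// SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> non-linear [0, 1] signal.
double pq_encode(double nits)
{
    const double m1 = 2610.0 / 16384.0;         // 0.1593017578125
    const double m2 = 2523.0 / 4096.0 * 128.0;  // 78.84375
    const double c1 = 3424.0 / 4096.0;          // 0.8359375
    const double c2 = 2413.0 / 4096.0 * 32.0;   // 18.8515625
    const double c3 = 2392.0 / 4096.0 * 32.0;   // 18.6875

    double y  = std::fmin(std::fmax(nits / 10000.0, 0.0), 1.0); // PQ covers 0..10000 nits
    double yp = std::pow(y, m1);
    return std::pow((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
}
```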
Are there no consumer-grade displays capable of displaying it?
The LG C9, CX, and C1 are such displays; after calibration with their internal 3D LUT they are close to reference quality.
Or did I understand everything completely wrong and there is no such thing as HDR output and it's all just an April Fools' joke?
HDR will look different on different displays; the standard mandates that. Mapping the incoming signal to each display's own capabilities is exactly what tone-mapping technology is for.
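To illustrate: every HDR display rolls the incoming signal off towards its own peak brightness, so a 4000-nit highlight lands differently on a 700-nit panel than on a 1500-nit one. A toy example of such a rolloff (extended Reinhard on luminance; real displays use their own proprietary curves):

```cpp
// Toy display-side rolloff: compresses [0, contentPeakNits] onto [0, displayPeakNits].
double display_tone_map(double nits, double contentPeakNits, double displayPeakNits)
{
    double l    = nits / displayPeakNits;             // input relative to the display peak
    double lMax = contentPeakNits / displayPeakNits;  // content peak relative to the display peak
    double out  = l * (1.0 + l / (lMax * lMax)) / (1.0 + l);
    return out * displayPeakNits;                     // content peak maps exactly onto display peak
}
```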
approach to using HDR output, if possible, especially in OpenGL.
The best way to do it is Vulkan or DirectX, but even Linux has full support now. It is mostly a driver matter: you need driver support for this, and that is the most complicated part.
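As a rough illustration of the Vulkan path (a minimal sketch; it assumes the VK_EXT_swapchain_colorspace instance extension is enabled and simply checks whether the driver and display expose an HDR10 surface format):

```cpp
#include <vulkan/vulkan.h>
#include <vector>

// Sketch only: look for an HDR10 (BT.2020 + PQ) surface format among those the driver reports.
bool find_hdr10_surface_format(VkPhysicalDevice gpu, VkSurfaceKHR surface, VkSurfaceFormatKHR* out)
{
    uint32_t count = 0;
    vkGetPhysicalDeviceSurfaceFormatsKHR(gpu, surface, &count, nullptr);
    std::vector<VkSurfaceFormatKHR> formats(count);
    vkGetPhysicalDeviceSurfaceFormatsKHR(gpu, surface, &count, formats.data());

    for (const VkSurfaceFormatKHR& f : formats)
    {
        if (f.colorSpace == VK_COLOR_SPACE_HDR10_ST2084_EXT &&
            (f.format == VK_FORMAT_A2B10G10R10_UNORM_PACK32 ||
             f.format == VK_FORMAT_R16G16B16A16_SFLOAT))
        {
            *out = f; // pass as imageFormat / imageColorSpace in VkSwapchainCreateInfoKHR
            return true;
        }
    }
    return false; // no HDR10 support: fall back to SDR and tone-map in the application
}
```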