Rendering Panda on an HDR monitor

HDR rendering techniques in games have become commonplace. To achieve realistic lighting, you set the brightness of in-game lights to realistically proportioned values, and then use a post-processing tone-mapping shader to map the result back down to the standard 0-255 sRGB range of the monitor.
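For anyone who hasn't set up such a pass before, here is a minimal sketch of what it can look like in Panda, using FilterManager and the simple Reinhard operator; the shader and the names in it are purely illustrative assumptions, not what any particular game uses:

```python
from direct.showbase.ShowBase import ShowBase
from direct.filter.FilterManager import FilterManager
from panda3d.core import Texture, Shader

TONEMAP_VERT = """
#version 120
uniform mat4 p3d_ModelViewProjectionMatrix;
attribute vec4 p3d_Vertex;
attribute vec2 p3d_MultiTexCoord0;
varying vec2 texcoord;
void main() {
    gl_Position = p3d_ModelViewProjectionMatrix * p3d_Vertex;
    texcoord = p3d_MultiTexCoord0;
}
"""

TONEMAP_FRAG = """
#version 120
uniform sampler2D tex;
varying vec2 texcoord;
void main() {
    vec3 hdr = texture2D(tex, texcoord).rgb;
    vec3 mapped = hdr / (hdr + vec3(1.0));                    // Reinhard: map [0, inf) into [0, 1)
    gl_FragColor = vec4(pow(mapped, vec3(1.0 / 2.2)), 1.0);   // gamma-encode for an SDR monitor
}
"""

base = ShowBase()

# Render the scene into an offscreen texture, then run the tonemapping shader
# over a fullscreen quad.  (For a real HDR pipeline you would also want the
# offscreen buffer to use a floating-point format; this sketch glosses over that.)
manager = FilterManager(base.win, base.cam)
scene_tex = Texture()
quad = manager.renderSceneInto(colortex=scene_tex)
quad.setShader(Shader.make(Shader.SL_GLSL, vertex=TONEMAP_VERT, fragment=TONEMAP_FRAG))
quad.setShaderInput("tex", scene_tex)

base.run()
```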

The main limitation of this technique is that the result is tonemapped down to that narrow 8-bit dynamic range before it reaches our monitor (and by extension, our eyes). The sRGB gamma curve helps a little by adding some extra range to the dark tones, but not much. However, there are now TVs on the market that are capable of much wider dynamic ranges, and can represent the brightness of specular highlights far more accurately and impressively. Though good HDR monitors capable of a truly interesting dynamic range are still rare and expensive (upwards of $1000), their price has been decreasing and will likely continue to decrease in the future.

I’ve had some luck getting Panda to work with this, so I just wanted to share my findings in case others wanted to experiment with it as well.

What you need:

  • Windows 10, with HDR mode switched on. Sadly, other operating systems just aren’t ready yet.
  • A TV or monitor supporting SMPTE 2084, often labelled HDR10. Now, there are many monitors that claim to be HDR but don’t go above 400 nits of brightness, and I don’t really think that is worth it; you really need a DisplayHDR 1000 TV or monitor to see a significant effect.
  • A game that already uses HDR rendering (panda3d-simplepbr will do), with bright lights.
  • NVIDIA Pascal GPU (Maxwell may also work). I have not tested AMD.
  • Latest Panda master.

Now, to enable HDR output in Panda (again, this requires the latest Panda master), you need to set this configuration variable:

framebuffer-float true
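If you'd rather set it from code than from Config.prc, the equivalent (just a sketch; note that it has to run before the window is opened) would be:

```python
from panda3d.core import loadPrcFileData

# Equivalent to putting "framebuffer-float true" in Config.prc.
# Must happen before ShowBase() opens the window.
loadPrcFileData("", "framebuffer-float true")
```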

When you do this, you will immediately notice two things:

  1. The scene becomes dark.
  2. The colours look washed out.

The reason for (1) is that the colours are now using absolute brightness, instead of relative brightness. Instead of the value 1.0 meaning “the brightest that the monitor can go”, it means “80 nits” (1 nit = 1 cd/m²). (Some monitors will even disable brightness settings in HDR mode.) Fortunately, the framebuffer values are no longer capped to 1.0. If your monitor goes up to 1000 nits, as a decent HDR monitor will, then you can write a value of 1000 / 80 = 12.5 to the framebuffer to get the full brightness. You will need to adjust your tonemapper to take this into account.
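Continuing the tonemapping sketch from above, the adjustment could look roughly like this; the u_scale input and the hard-coded 1000-nit peak are purely illustrative assumptions (ideally the peak would come from the display's HDR metadata, which Panda can't query yet):

```python
# Same pass as before, but the mapped result is scaled so that 1.0 lands at
# the display's peak brightness instead of at 80 nits.  The gamma encode is
# also gone, for the reason explained in the next paragraph.
TONEMAP_FRAG_HDR = """
#version 120
uniform sampler2D tex;
uniform float u_scale;   // peak_nits / 80.0
varying vec2 texcoord;
void main() {
    vec3 hdr = texture2D(tex, texcoord).rgb;
    vec3 mapped = hdr / (hdr + vec3(1.0));        // still tonemapped into 0..1
    gl_FragColor = vec4(mapped * u_scale, 1.0);   // 1.0 now means "peak brightness"
}
"""

PEAK_NITS = 1000.0  # assumption: a DisplayHDR 1000 class monitor
quad.setShader(Shader.make(Shader.SL_GLSL, vertex=TONEMAP_VERT, fragment=TONEMAP_FRAG_HDR))
quad.setShaderInput("tex", scene_tex)
quad.setShaderInput("u_scale", PEAK_NITS / 80.0)
```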

The reason for (2) is that the framebuffer output is no longer expected to be using the sRGB gamma curve, but is expected to be linear (aka scRGB). If you were using framebuffer-srgb true, simply remove it. If you had a custom gamma correction in your shader, simply remove it. If you were not using a gamma-correct pipeline to begin with, well, it’s no longer optional: set your input color textures to the sRGB format.
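For the texture part, marking a loaded colour texture as sRGB so that Panda linearizes it when sampling can look like this (a sketch; the file name is hypothetical, and F_srgb_alpha is the variant for textures with an alpha channel):

```python
from panda3d.core import Texture

# Assumes a ShowBase instance exists, so the global `loader` is available.
tex = loader.loadTexture("albedo.png")  # hypothetical texture
# The pixel data is sRGB-encoded; tell Panda so it converts to linear on sampling.
if tex.getNumComponents() == 4:
    tex.setFormat(Texture.F_srgb_alpha)
else:
    tex.setFormat(Texture.F_srgb)
```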

Here are some resources to get started with this subject:



It would be good to think about ways we can build a good Panda API around this. We will probably need an interface to communicate the color space of the framebuffer (sRGB vs. scRGB vs. BT.2020, etc.). We will also want an interface to send HDR metadata, such as reference brightness and primaries, to the monitor (right now this is only possible through proprietary AMD/NVIDIA APIs). Food for thought!


Maybe my non-gamedev knowledge will help here a little,
I have to disagree with the 1000-nit requirement for appreciating HDR10, or higher dynamic range in general. Don’t put this feature at a low priority because of that assumption.
I can verify from thousands of hours of experiments with video projectors and with the OLED panels used in some modern VR headsets that 30-bit RGB, even without real HDR10, is a clearly noticeable improvement over 24-bit, even on a 200-nit OLED or a 600-lumen video projector. Specifically with DLP, the chipset does some proprietary image processing, so even a plain 24-bit input converted to 30-bit appears to have a much higher dynamic range, despite the abysmal 400:1 to 800:1 DLP contrast ratio.
There are a few factors that make relatively dim but HDR-capable modern video projectors and display panels work, the main ones being the Helmholtz–Kohlrausch effect and the use of narrow-band RGB LED or laser sources instead of white light bulbs or backlights. The desired nits (and lumens) also depend heavily on the amount of ambient light; more than 200 nits is only desirable, and only avoids eye strain, in the presence of considerable ambient light. Some phone display panels go above 1200 nits only because that level of brightness keeps the screen readable under direct sunlight; in pitch black, anything above 50 nits is not recommended, which is also the recommended non-HDR value for movie theatre screens, with the HDR recommendation being nowhere near 1000 but rather around 200, if I recall correctly.
The brightest consumer VR headset display panel is still the original HTC Vive, and even that measures only 213 nits, which is already a bit too bright for many users; the more modern Valve Index measures 95 nits and the Oculus Rift S 74 nits. While these aren’t HDR, the point is that 213 nits is plenty bright here, let alone 1000.
HDR-ready TVs have dropped considerably in price and don’t really cost $1000 any more, and while you may not enjoy HDR to the fullest with a lot of ambient light, if you want to appreciate HDR10 to the fullest you should be controlling your ambient light anyway, even with a 1000-nit+ monitor; otherwise even specular reflections on the display will reduce the quality of your image, and antiglare filters only go so far.
TV and smartphone manufacturers make it seem like more nits is not only better but necessary, because it is the easier feature of the product for them to improve, but it is mostly just marketing, like the megapixel war (which passed the diffraction limit) and the resolution war past 1080p on 5.5-inch screens (beyond retinal resolution) were for smartphones.
In summary, peak nits doesn’t matter as much for higher dynamic range displays as is often claimed.

Thanks for the additional information. Yes, ambient lighting is very important.

In my limited experience, where the HDR effect is most impressive is in the relative brightness of bright light sources, specular highlights, and particle effects, not in just having the whole screen be generally bright. Having a screen fully lit to 200 nits in a completely dark environment (including inside a VR headset) might already be uncomfortably bright, but the ability to display good fidelity in the 0-80 nit range while at the same time showing specular highlights and other bright spots at many times that brightness is what sold me on HDR. Perhaps I could approximate that same effect on my DisplayHDR 400 monitor if I managed to completely darken my room, though unfortunately the monitor I got has a really poorly calibrated gamma curve.

There are indeed also benefits to be had from 30-bit in general, such as reduced banding, though that doesn’t really require an HDR framebuffer, and I can get that on Linux too.

Sure, HDR plus more nits is more impressive; my point is just that a lot more people will benefit from HDR, not just people with 1000-nit HDR monitors.
The 50-nit figure I gave for digital cinema projectors is from the Society of Motion Picture and Television Engineers (paper ST 431-1:2006). It’s actually 48; I just rounded it to 50. Texas Instruments (the developers of DLP) confirms this in one of their own dev documents, which further lists different nit values for different ambient light conditions:


As you can see, controlling the ambient light can make a roughly 10x dimmer image appear just as bright.

The Society of Motion Picture and Television Engineers (SMPTE) is the same organization that recommends 1000 nits for LCD HDR TVs and 530 nits for OLEDs (I’m guessing the latter is because OLED blacks are darker). Sadly I can’t find the recommended nits for HDR video projectors, but I think it was somewhere in the 200-400 nit range.
I think the above info, including the different nit requirements based on ambient light, should be convincing enough that HDR will be noticeable on okay-ish monitors and TVs as well.

Regarding 30-bit non-HDR video, I can’t tell if that will be useful. As far as I know, current-generation DLP chipsets (home cinema use, around 2000 lumens max) don’t support the HDR requirements for deep blacks, but the HDR signal they do receive still clearly has better dynamic range than SDR, and I think internally the chip takes the HDR signal and maps it differently, more like just a basic 30-bit RGB signal. But at the end of the day, even these DLP projectors take an actual HDR video signal as input. I don’t know if they would properly accept and display a non-HDR 30-bit video signal, if those even exist.