Wednesday, November 16, 2022

HDR issues

HDR is a relatively new display technology, and the one thing I've found consistent about it is how inconsistent it is. Read on as I go into detail on what I've observed with HDR recently.



HDR stands for High Dynamic Range (vs. SDR, or Standard Dynamic Range). There's all sorts of talk about more colors, or brighter displays, but what is it truly? And what is it supposed to do?

First, let's go over SDR. Your monitor, phone, TV, etc. has a screen made up of pixels, built on display tech like LCD (Liquid Crystal Display, usually with an LED backlight), OLED, AMOLED, QLED, or plasma. Pretty much all of these display types use a grid of pixels, where each pixel is made up of red (R), green (G), and blue (B) lights (sometimes with an extra white light, like in smart bulbs; other times lacking one or more of them, like old green-on-black terminal displays). To display a wide variety of colors, the R, G, and B lights are adjusted in brightness, and that mix forms the colors the human eye sees on the display.

To represent the R, G, and B values, a number is used, and this is where SDR comes in: an SDR display accepts color values from 0 to 255 per channel, i.e. 8-bit color (2^8 is 256, so each of the R, G, and B values fits in one byte, the smallest unit of memory most operating systems and CPU architectures can address). That works out to over 16 million colors!
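
As a quick sanity check on those numbers, here's the arithmetic in Python (nothing display-specific, just counting):

    # Each SDR channel is stored in one byte: 8 bits -> 2**8 = 256 possible values (0-255).
    levels_per_channel = 2 ** 8
    print(levels_per_channel)       # 256

    # The three channels (R, G, B) combine independently.
    total_colors = levels_per_channel ** 3
    print(f"{total_colors:,}")      # 16,777,216 -- the "over 16 million colors"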

However, human vision can perceive a much wider range of brightness and color than SDR covers, so with SDR there's a hard limit on the range of colors a display can produce. Consider the color white, or the RGB hexadecimal value #FFFFFF (hexadecimal is base 16, counting with digits 0 through 15, rather than base 10 like decimal; 0xFF in hexadecimal is equivalent to 255 in decimal, so 0xFFFFFF is the same as 255, 255, 255 in decimal).
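
The same conversion in a couple of lines of Python, for anyone who wants to poke at it:

    # 0xFF in hexadecimal is 255 in decimal.
    print(int("FF", 16))            # 255

    # Unpack the 24-bit value #FFFFFF into its R, G, B bytes.
    value = 0xFFFFFF
    r = (value >> 16) & 0xFF
    g = (value >> 8) & 0xFF
    b = value & 0xFF
    print(r, g, b)                  # 255 255 255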

On my phone, this white value is super bright; let's say we measure it at 5000 nits (a nit is a unit of luminance, i.e. how bright a display gets). On my smartwatch, though, the same #FFFFFF shows up at only 500 nits, because the watch saves energy. Black, or #000000, shows up as an essentially switched-off pixel with the same (lack of) brightness on both the watch and the phone. Yet thanks to the limited SDR range, my phone can't display any more colors than my watch (in SDR mode), despite being able to make the same RGB values look brighter. Below is an HTML gradient meant to demonstrate the issue with the limited color range.

On my phone (SDR): [embedded gradient demo]

On my watch: [embedded gradient demo]
Alright, it doesn't seem to come out the way I want with modern CSS. The point stands, though: with SDR, the phone can't display more colors despite being able to display brighter ones.
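
To make the point concrete, here's a rough back-of-the-envelope sketch in Python. The 500 and 5000 nit figures are just the hypothetical values from above, and gamma is ignored to keep the arithmetic simple:

    # Hypothetical peak-white brightness values from the example above.
    WATCH_PEAK_NITS = 500
    PHONE_PEAK_NITS = 5000

    # An 8-bit SDR channel has only 256 levels, no matter how bright the display is.
    LEVELS = 256

    watch_step = WATCH_PEAK_NITS / (LEVELS - 1)   # ~2 nits between adjacent shades
    phone_step = PHONE_PEAK_NITS / (LEVELS - 1)   # ~20 nits between adjacent shades

    print(f"Watch: {watch_step:.1f} nits per step")
    print(f"Phone: {phone_step:.1f} nits per step")

    # The brighter display doesn't get more shades in a black-to-white gradient;
    # it just spaces the same 256 steps further apart.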

And this is where HDR comes in.

HDR is a newer color standard for displays that expands the number of colors beyond the 16 million supported by SDR (or at least, on paper it does). VESA has developed a rating system for it, DisplayHDR, to grade how well different displays can actually make use of HDR. However, the rating is based mostly on a display's maximum brightness, not on the actual range of colors it supports (its color depth)!

Let's look back at SDR's 8-bit color format. What if we expanded each channel to 10 bits (a 2^10 range, 0 to 1023)? Now we can support over 1 billion colors! This is what HDR properly means: the ability to represent more colors. It's tied to the brightness of a display because, to actually show a visible difference between that many colors, a brighter display is necessary; otherwise, neighboring shades would look the same.
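
Here's the same counting exercise at higher bit depths (12-bit comes up again later with some panels):

    # Number of representable colors at different per-channel bit depths.
    for bits in (8, 10, 12):
        levels = 2 ** bits      # levels per channel: 256, 1024, 4096
        total = levels ** 3     # independent R, G, B combinations
        print(f"{bits}-bit: {levels} levels per channel, {total:,} colors")

    # 8-bit:  16,777,216 colors (~16.8 million)
    # 10-bit: 1,073,741,824 colors (~1.07 billion)
    # 12-bit: 68,719,476,736 colors (~68.7 billion)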

Now, onto the issues with HDR. As stated before, VESA's standard is largely centered on display brightness, with additional criteria a display has to meet to be considered VESA DisplayHDR certified, and there are multiple certification tiers. In other words, not every display is made the same, and even if one is HDR certified, that doesn't quite mean it'll properly display the wider range of colors.

A while ago I got an MSI monitor: 1440p resolution, 165 Hz refresh rate, pretty decent overall for gaming. Its HDR, though, is the worst I've seen. The colors are washed out, red looks especially bland, and the monitor doesn't even cover the SDR color space adequately with HDR on. What's going on?

Displays that support HDR still have to support SDR content, too. And it's not a simple conversion from a 0-255 range to a 0-1023 one; what should the behavior be? Should SDR colors show up as bright as HDR ones? Should they be limited to 255 out of 1023? Should some colors be made brighter than others to account for human perception?
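
As a rough illustration of the choice involved (this is not how any particular monitor or OS actually does it, just two naive policies for placing an 8-bit SDR value into a 10-bit signal):

    def sdr_to_hdr_scaled(v8: int) -> int:
        """Stretch 0-255 across 0-1023: SDR white ends up as bright as
        HDR peak white, which tends to look blown out."""
        return round(v8 * 1023 / 255)

    def sdr_to_hdr_clamped(v8: int) -> int:
        """Keep the original code values: SDR white tops out at 255 of 1023,
        so SDR content looks dim next to HDR highlights."""
        return v8

    for v in (0, 128, 255):
        print(v, "->", sdr_to_hdr_scaled(v), sdr_to_hdr_clamped(v))
    # 0 -> 0 0
    # 128 -> 514 128
    # 255 -> 1023 255

Real systems usually land somewhere in between, mapping SDR white to a fixed "reference white" level (commonly somewhere around 100-300 nits) rather than to either extreme, and which level gets picked is exactly the kind of thing that varies from display to display.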

The biggest issue with my MSI monitor is in fact not HDR content itself (although that isn't great either), but rather displaying SDR content at the same time. The washed-out colors suggest the monitor is limiting SDR brightness below its normal level, essentially faking true HDR: in SDR mode the monitor might reach 500 nits, but with HDR enabled, SDR content only gets around 100 nits while HDR content gets the full 500.

Furthermore, the monitor might even be making dark colors brighter, or applying odd filters to try to make HDR look good. In the end, it turns out the monitor doesn't even support 10-bit color in RGB format! That's the heart of the issue: the monitor essentially chops HDR down to SDR quality, rather than expanding its color range to real HDR.
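
A minimal sketch of what that loss looks like, assuming the simplest possible truncation (real hardware or drivers may dither instead):

    def quantize_10bit_to_8bit(v10: int) -> int:
        """Drop the two least significant bits of a 10-bit code value."""
        return v10 >> 2

    # Four distinct 10-bit shades collapse into one 8-bit value.
    for v in (512, 513, 514, 515):
        print(v, "->", quantize_10bit_to_8bit(v))   # all four print 128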

I got another monitor that also has an HDR switch. With HDR on, it looks quite a lot better than my MSI monitor, but it still lacks good color depth. That monitor also has an sRGB mode; sRGB is an older color standard still widely used on the web and in games and applications. Interestingly, the sRGB mode seems to show more colors than the HDR mode, although the HDR mode does look brighter. In this case, the monitor knows how to take in 10-bit (or even 12-bit) color, but it struggles to display it correctly, and once again to show SDR content properly, probably because plenty of shortcuts are still being taken with the color handling (it wasn't a super expensive monitor, after all).

And that points to a larger issue with HDR: the lack of proper, open color standards. Yes, there's the VESA certification, but it mostly expects displays to cover a wide color gamut, i.e. color ranges beyond SDR. In SDR, there are tons of established color standards, including sRGB, Adobe RGB, and formats designed for cinema. HDR standards do exist, a common one being Rec. 2100, but the issue is the proprietary formats layered on top to display colors properly and accurately, such as Dolby Vision and HDR10+. These inaccessible standards are a big part of why there's so little genuinely good HDR content outside of movies and TVs. YouTube, for example, supports HDR, but struggles to output good, consistent HDR, much like my monitors can't properly convert SDR to HDR, or even display HDR well.
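
For the curious, the core of Rec. 2100 is public: its PQ transfer function (standardized as SMPTE ST 2084) maps each code value to an absolute luminance in nits. Here's a straightforward Python transcription of the published constants; treat it as a sketch, not a reference implementation:

    # PQ (SMPTE ST 2084) EOTF: normalized signal in [0, 1] -> luminance in nits.
    M1 = 2610 / 16384           # 0.1593017578125
    M2 = 2523 / 4096 * 128      # 78.84375
    C1 = 3424 / 4096            # 0.8359375
    C2 = 2413 / 4096 * 32      # 18.8515625
    C3 = 2392 / 4096 * 32      # 18.6875

    def pq_eotf(signal: float) -> float:
        """Convert a normalized PQ code value to absolute luminance (cd/m^2)."""
        p = signal ** (1 / M2)
        return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

    print(pq_eotf(1.0))   # 10000.0 nits -- the format's ceiling
    print(pq_eotf(0.5))   # ~92 nits -- half the signal is nowhere near half the light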

So what's the conclusion? Is HDR bad? No, it just needs better adoption. Open standards similar to sRGB need to be developed further and widely adopted across devices and displays. More importantly, proper conversions from SDR to HDR need to be defined, and most importantly, certification needs to be refined to describe the true range of colors a display can show in SDR, in HDR, and with both at once. Just as USB standardized the way devices plug into each other, an HDR standard should unify all the different modes and ranges of HDR seen today, so that in the future there's one consistent standard that looks great on every HDR-capable display. And beyond that, there are still plenty of colors the human eye can see that HDR can't yet reproduce, so building the technology up to show them could enable incredible, true-to-life visuals, like a VR screen that looks more like real life than a brightly lit display.

If you have anything to say or any comments or questions, feel free to share them in the comments below.
