Understanding HDR in 10 Minutes

You can enjoy HDR content from Netflix, Amazon Prime Video, and YouTube on a 4K monitor right at your desk.

There are more and more 4K TVs, 4K monitors, 4K projectors, 4K UHD Blu-ray discs, and 4K players on the market today. If you’ve been paying any attention, you’ve almost certainly noticed something called “HDR,” which, according to many Hollywood filmmakers, is even more important than 4K itself.

So, what exactly is HDR?

HDR, or High Dynamic Range, differs from SDR, or Standard Dynamic Range. Back when the film industry was setting format standards and procedures, there were numerous limitations in equipment and the overall production environment. As a result, many aspects of the image had to be compressed or compromised when creating a video. Brightness was one glaring example. In the past, TV sets had a peak luminance of only about 100 nits (a nit is a unit of luminance equal to one candela per square meter), and it wasn’t until much later that TV sets began to reach better luminance levels (about 250 to 400 nits). However, most image-creation pipelines stuck with the old formats and continued using them.

In fact, a luminance of 100 nits accounts for only a tiny portion of the entire range of luminance that human eyes can perceive (approximately 0.001 to 20,000 nits). As technology continues to advance, we have also come to realize that we must do away with formats and standards left over from the CRT era in order to close the gap between what we physically see with our own eyes and what we see on a TV screen. This is where HDR comes in: its aim is to give videos a greater dynamic range of color and light, so that the video output can be as close to the original material as possible.

Now, 4K UHD Blu-ray discs use HDR10 as the HDR standard, which increases peak luminance to 1,000 nits, ten times that of conventional SDR TVs. Although 1,000 nits is still a long way from the full 20,000-nit range perceivable by the human eye, it is nonetheless a major step forward.
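The luminance figures above are easy to put side by side. A quick sketch (the 100-, 1,000-, and 20,000-nit values are the ones quoted in this article; they are nominal peaks, not measurements of any particular display):

```python
# Peak-luminance figures from the article, in nits (candela per square meter).
SDR_PEAK = 100       # conventional SDR TV peak
HDR10_PEAK = 1000    # HDR10 standard peak
EYE_PEAK = 20000     # approximate upper end of human perception

print(HDR10_PEAK / SDR_PEAK)   # HDR10 offers a 10x brighter peak than SDR
print(EYE_PEAK / HDR10_PEAK)   # yet the eye can perceive a 20x brighter peak still
```

Running this prints 10.0 and 20.0, which is the "ten times better, but still a long way away" point in one calculation.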


Given all the differences between them, it should be fairly easy for consumers to distinguish between SDR and HDR.

With HDR technology, an image’s bright areas become brighter and its dark areas darker, producing richer color overall. This works especially well in brightly colored parts of the image, where it minimizes distortion. For instance, when a clear sky is displayed on screen, HDR can faithfully render the sky’s blue; SDR, on the other hand, might make the sky look pale or even completely white due to lost color information. In the case of nighttime neon signs, HDR can also make the colors more saturated and the lights brighter, giving the image a more vibrant look, whereas with SDR the same image would show lower contrast, paler colors, and even a color cast.

The Difference between HDR and 4K

4K refers to the 3840×2160 display resolution, roughly 8.3 megapixels, which keeps images and videos sharp even on large screens. HDR, on the other hand, gives images and videos better contrast and color. While both HDR and 4K UHD are designed to enhance the viewing experience, they are two completely different and separate technologies.
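The resolution arithmetic is simple to verify. A short sketch (the 1080p figures are added here only for comparison; the article itself cites only the 4K numbers):

```python
# 4K UHD pixel count vs. Full HD (1080p), for comparison.
uhd_pixels = 3840 * 2160      # 4K UHD resolution
fhd_pixels = 1920 * 1080      # Full HD (1080p) resolution

print(uhd_pixels)                  # 8294400 pixels
print(uhd_pixels / 1_000_000)      # ~8.3 megapixels, as the article states
print(uhd_pixels // fhd_pixels)    # 4 -- exactly four times the pixels of 1080p
```

So "4K" quadruples the pixel count of 1080p, while HDR changes not how many pixels there are but how bright and colorful each one can be.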
