
What is HDR? The Best Image Close to Reality

BenQ
2019/06/01

In recent years, the quality of images that general audiences can view has improved significantly, and 4K TVs, 4K projectors, and 4K displays have become increasingly common in the market. The one constant among brochures and ads for these products is that they all emphasize “HDR”. It seems that HDR technology is a necessary part of all the latest TVs and display devices. But do you really know what HDR is?

Since HDR is still widely misunderstood, we would like to start from the basic concepts, leading consumers to understand not only what HDR is, but also the close relationship between image quality and the screen itself, while providing the knowledge consumers will need when choosing between HDR TVs and projectors.

Is the HDR on TVs the Same as the HDR in Photography?

Many people think of the HDR mode in their smartphone camera when they hear the term “HDR”. Even though it’s the same acronym, HDR in photography is fundamentally different from HDR on TVs, projectors, and other display devices.

HDR imaging is a special way to create a picture. When you take a photo in your camera’s HDR mode, it captures multiple pictures at different exposures. These are then composited into one picture with a wider dynamic range, thereby recovering the underexposed and overexposed parts of the image.
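
For readers curious how that compositing works, here is a minimal sketch of the idea in Python. It assumes three already-aligned exposures loaded as floating-point arrays; the mid-tone weighting scheme is purely illustrative, not the exact algorithm any particular camera uses.

```python
# Minimal sketch of multi-exposure merging, assuming three aligned
# photos of the same scene taken at different shutter speeds.
import numpy as np

def merge_exposures(images):
    """images: list of float arrays in [0, 1], all with shape (H, W, 3)."""
    merged = np.zeros_like(images[0])
    total_weight = np.zeros_like(images[0])
    for img in images:
        # Well-exposed pixels (near 0.5) get high weight; clipped
        # highlights and crushed shadows get weight near zero.
        weight = np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2))
        merged += weight * img
        total_weight += weight
    return merged / np.maximum(total_weight, 1e-6)

# Hypothetical usage: under, normal, over are the bracketed shots.
# hdr_photo = merge_exposures([under, normal, over])
```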

The HDR for displays and TVs is a new display standard, which provides more dynamic range in showing the highlights and shadows of the image. It allows the details of the images to be shown more clearly and elevates the quality of the image, rendering it closer to the original scene.

[Image comparison: HDR in photography vs. HDR on TVs]
What is HDR? Why Do We Need HDR?

It’s all about faithfully reproducing what we see in real life.

As consumers, when we are choosing among TVs, computer monitors, or other display devices, “image quality” is usually our first criterion. But the higher resolutions often discussed in the market do not necessarily mean higher image quality. The main factors that affect our perception of an image are brightness, contrast, color, and sharpness. Among them, contrast and color are the keys to how the viewer feels about the quality of the image.

HDR (High Dynamic Range) is a standard that comes with more dynamic range, allowing displays to show images more clearly, especially in the details of the highlights and shadows. It allows highlights to be shown brighter than used to be possible, while shadows can be truly dark and deep. The original image can therefore be shown more completely and faithfully, closer to what the human eye can see. HDR could thus also be called “extended dynamic range”, which more precisely expresses what this standard means.

So what is dynamic range? Dynamic range is the ratio between the brightest and darkest values that a display can show. If we want a display with more dynamic range, it not only has to be able to show bright areas brighter, it also has to be able to show dark areas darker. That is what really makes a display capable of more dynamic range, rather than just being brighter.
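
A quick back-of-the-envelope calculation makes the point. The nit values below are illustrative, not measurements of any particular panel:

```python
# Dynamic range as the ratio between the brightest and darkest
# luminance a display can produce (illustrative numbers).
def contrast_ratio(max_nits, min_nits):
    return max_nits / min_nits

print(f"SDR-era panel: {contrast_ratio(100, 0.1):.0f}:1")    # 1000:1
print(f"HDR panel:     {contrast_ratio(1000, 0.05):.0f}:1")  # 20000:1

# Raising peak brightness alone from 100 to 1000 nits, with the black
# level stuck at 0.1 nits, only yields 10000:1. Deepening the blacks
# matters just as much as boosting the highlights.
```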

[Image comparison: realistic HDR vs. bad HDR]
The Difference Between HDR and SDR

HDR can handle the vast difference between highlights and shadows, so the brighter parts of the image won’t be clipped and the detail in the shadows is preserved.

Before we explain how HDR makes image quality better, we have to understand what SDR (Standard Dynamic Range) is. SDR is the mainstream standard in the video-making world: whether it’s a film studio or production company making a film or other content, or display manufacturers setting the specs for their products, they all follow this standard. Conventional TVs are all built around the SDR standard as well.

In the past, when the TV industry was setting up its image standards and workflow, the limited capabilities of display technology and other resource constraints meant that quite a bit of information had to be compressed or abandoned along the way. This is most clearly seen in brightness. In the early days, the standard set for TVs limited maximum brightness to 100 nits (“nit” stands for candela per square meter), and this standard continued to be used in the decades that followed.

Even though better TV screens have since been made, now roughly reaching between 250 and 400 nits, the image production pipeline still worked within the old standard: the information was compressed into a range of 100 nits and then presented through the display. The compressed image is not close to reality.
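
As a rough illustration of what that compression means, the sketch below squeezes scene luminance under a 100-nit ceiling using Reinhard’s classic global tone-mapping operator. Real SDR grading pipelines are more sophisticated, but the squashing of bright values is the same in spirit:

```python
# Compressing scene luminance into a 100-nit SDR ceiling with the
# Reinhard global operator (used here purely as an illustration).
def compress_to_sdr(scene_nits, ceiling=100.0):
    l = scene_nits / ceiling
    return ceiling * l / (1.0 + l)

for nits in (50, 100, 400, 1000):
    print(f"scene {nits:>4} nits -> SDR {compress_to_sdr(nits):5.1f} nits")
# scene   50 nits -> SDR  33.3 nits
# scene  100 nits -> SDR  50.0 nits
# scene  400 nits -> SDR  80.0 nits   <- a 4x brighter highlight...
# scene 1000 nits -> SDR  90.9 nits   <- ...gains almost nothing
```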

Compared to how a traditional SDR TV shows the image, HDR provides a significantly wider range of contrast, allowing the audience to see more color and more details in the highlights and shadows, with more shades of subtle change, making the image closer to how it was captured. But beware: there is more than one HDR standard.

The Common HDR Standards: HDR 10, HLG

Currently, the mainstream HDR standards are HDR 10 and HLG. So far, HDR 10 is mainly used in UHD Blu-ray, while HLG is mainly used by broadcast TV stations such as NHK and the BBC. Therefore, when choosing a display, you should pick one that supports both HDR 10 and HLG, so you can fully experience HDR content.

HLG (Hybrid Log-Gamma) is the HDR standard for broadcast signals. It doesn’t carry metadata, so how well the image is shown is determined by the display used. HDR 10, meanwhile, contains metadata in its signal. When it’s shown on a display that supports the HDR 10 standard, the display can read the metadata and have its processing hardware reproduce the intended values, so the image is shown as it was designed to be.
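
To make that difference concrete, here is an illustrative sketch of the static metadata an HDR 10 stream carries alongside the video: mastering-display information per SMPTE ST 2086, plus content light levels. The field names below are descriptive placeholders, not any real parser’s API:

```python
# Illustrative model of HDR 10 static metadata; HLG streams carry
# no equivalent block, leaving rendition entirely to the display.
from dataclasses import dataclass

@dataclass
class Hdr10StaticMetadata:
    max_mastering_nits: float  # peak of the mastering display (ST 2086)
    min_mastering_nits: float  # black level of the mastering display
    max_cll_nits: int          # MaxCLL: brightest single pixel in the content
    max_fall_nits: int         # MaxFALL: brightest frame-average in the content

# A hypothetical grade mastered on a 1000-nit reference display:
grade = Hdr10StaticMetadata(1000.0, 0.005, 1000, 400)
```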

HDR 10 is currently the most common HDR standard; almost all HDR-capable displays can show content in HDR 10. Its brightness can reach 1,000 nits, ten times the traditional SDR TV standard. As long as a display supports this standard, it will be capable of showing an image closer to the original scene, and its maximum brightness will affect how the image is shown.
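
The two standards also differ in their transfer functions, both defined in ITU-R BT.2100. The sketch below implements them: PQ (the curve behind HDR 10) maps signal values to absolute luminance up to 10,000 nits, while HLG encodes relative scene light, which is part of why an HLG picture depends so much on the screen showing it:

```python
# PQ (SMPTE ST 2084) and HLG transfer functions per ITU-R BT.2100.
import math

def pq_eotf(signal):
    """PQ: normalized signal in [0, 1] -> absolute luminance in nits."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def hlg_oetf(light):
    """HLG: relative scene light in [0, 1] -> normalized signal in [0, 1]."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    if light <= 1 / 12:
        return math.sqrt(3 * light)
    return a * math.log(12 * light - b) + c

print(f"PQ signal 0.50 -> {pq_eotf(0.50):6.1f} nits")  # ~92 nits
print(f"PQ signal 0.75 -> {pq_eotf(0.75):6.1f} nits")  # ~983 nits
```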

High Resolution is not HDR. Contrast is the Key

If you want to make an image look sharper, the element with the greatest effect is contrast. Usually, with more contrast, the details of the image become clearer. By increasing the contrast between the brightest and darkest parts of the image, it will appear sharper to human eyes. This is why HDR often makes consumers feel that a 4K HDR screen has a clearer image and higher resolution: the effect comes from the contrast, not from viewers actually seeing more pixels.
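
A tiny numeric experiment shows the effect: stretch tonal values around mid-gray and a soft edge steepens, reading as “sharper” even though not a single pixel was added. The gain value here is arbitrary:

```python
# Contrast stretching around mid-gray; more contrast, same resolution.
import numpy as np

def stretch_contrast(img, gain=2.0):
    """img: float array in [0, 1]. Expand values around 0.5 and clip."""
    return np.clip((img - 0.5) * gain + 0.5, 0.0, 1.0)

# A soft dark-to-light ramp, standing in for a slightly blurred edge:
edge = np.linspace(0.4, 0.6, 5)
print("before:", edge)                   # [0.4  0.45 0.5  0.55 0.6 ]
print("after: ", stretch_contrast(edge)) # [0.3  0.4  0.5  0.6  0.7 ]
```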

The advantage of HDR is that it can provide better rendition of highlights and shadows, and truly good HDR content will have more contrast. For manufacturers, there are two ways to provide more contrast: one is to raise the maximum brightness; the other is to make the minimum brightness very dark. A higher maximum brightness is therefore necessary for HDR displays, but only insofar as it increases the display’s contrast. That is what makes a display truly HDR capable, providing more contrast and thus making the details of the image clearer.

Notes for Creating HDR Content

Contemporary digital cinema cameras and still cameras usually have a dynamic range higher than current HDR display standards can show, so no extra settings are needed in the imaging process for their images to be shown in the HDR standard. Even film prints from five or six decades ago can still be remastered as contemporary HDR content.

If it’s confirmed before filming that the content will be shot for HDR, the cinematographer has to be more careful with exposure, to avoid overexposing or underexposing the whole image. For example, many cinematographers place a very strong light outside the window when filming an interior scene in which they don’t want to show what’s beyond the window; with this traditional technique, the scene out the window is overexposed and appears white in an SDR image. In HDR, however, it will simply look much brighter than the rest of the image, because more contrast can be shown, so the technique makes the image lose its aesthetic value and distracts the audience from the subject. When filming for HDR, the scene should therefore be captured as close to the human eye’s viewing capabilities as possible, using proper lighting techniques, so it can take advantage of what an HDR image offers.

Aside from shooting, the color grading process also plays an important part in creating a good HDR image. Colorists have to be more careful with the details and the grades to fully recreate the true color of the image.
