As HDR technology becomes more widespread, more and more TV sets, monitors, and projectors carry the HDR badge. But the truth is that the definition of HDR can vary between display devices. Compared with TVs, the way home projectors present video to the audience is much closer to how cinema projectors work. Even though manufacturers build their projectors in different ways, the basic concept of HDR remains the same.
A common doubt among consumers facing the growing number of home theater projectors on the market is whether these devices really qualify as HDR. This article therefore starts from the projector itself. Taking the nature of projection technology and its limitations into account, we will explore whether video shown through a projector can truly be HDR, and arrive at a definition of HDR for projectors.
HDR (High Dynamic Range) is a set of standards for displays capable of showing images with a greater dynamic range, rendering the picture more clearly, especially in the highlights, the shadows, and the color reproduction. It elevates what the audience can see in the highlights and reveals deeper shadows, presenting the original footage in a fuller and more faithful way, closer to what human eyes can see.
These standards, however, were developed for displays like TV sets; they did not take the nature of projectors into account. For example, when HDR content asks a display to show pixels at a certain brightness, a TV can simply do so, but the brightness the audience perceives from a projection varies with the size of the screen, the distance from the projector, the viewing environment, and the screen material. And because of the physical differences in how a flat panel and a projector generate images, the maximum brightness of a TV set is usually higher than that of a typical projector.
Therefore, the definition of HDR for projectors, and the way we calibrate them, should be different. As mentioned, the brightness we perceive from a projected image depends on the distance between the projector and the screen and on the screen material, unlike with a TV set. And because of the way they generate the image, most projectors cannot match the brightness of a TV screen. Many manufacturers are now working to bring the full potential that the HDR standards promise, with higher image quality, to their home projectors just as they do with their cinema projectors.
Normally, two things are key to showing an HDR image: high dynamic range and wide color gamut. The higher dynamic range allows the content to have more contrast between its highlights and shadows, and therefore show more detail. The other important factor is that HDR standards have a wider gamut than the SDR standard: they can present colors in DCI-P3, or even Rec.2020, closer to the range of colors human eyes can see.
For projectors, whether SDR or HDR, the method used to present highlights and shadows determines the blacks and whites of the projected image. In recent years, projectors have become better at rendering highlights and shadows, because the underlying dynamic-range and color-gamut technologies have improved and moved closer to the HDR standards.
Dynamic Iris is a technology that lets a projector control how much light passes through its lens. Depending on what the scene needs, it can reduce the light that gets through, making the shadows of an image appear darker. When designing a projector, engineers must decide how the 0-1000 nit brightness range of HDR should be translated into brightness changes the projector can actually produce.
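A dynamic iris of this kind is typically driven by how bright each frame is overall. The following is a hypothetical sketch of that idea, not any manufacturer's actual algorithm; the scaling rule and the `min_open` floor are illustrative assumptions.

```python
# Hypothetical dynamic-iris control: pick an iris opening per frame from the
# frame's average brightness, so dark scenes are projected with less light
# and therefore deeper blacks.

def iris_factor(frame_nits, min_open=0.2):
    """Return an iris opening in [min_open, 1.0] from mean frame brightness.

    frame_nits: per-pixel brightness values in nits (HDR's 0-1000 range).
    min_open: floor so the iris never fully closes (illustrative parameter).
    """
    avg = sum(frame_nits) / len(frame_nits)
    # Assumed scaling: a frame averaging 1000 nits fully opens the iris.
    return max(min_open, min(1.0, avg / 1000.0))

dark_scene = [5.0, 10.0, 2.0, 8.0]              # mostly shadows
bright_scene = [600.0, 900.0, 1000.0, 800.0]    # mostly highlights
print(iris_factor(dark_scene))     # clamps to the floor: small opening
print(iris_factor(bright_scene))   # iris opens most of the way
```

Clamping to a floor rather than closing fully keeps very dark scenes from collapsing to pure black and losing shadow detail entirely.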
When presenting HDR with a projector, it is important to have fine control over the darker end of the brightness range. For example, while keeping the maximum brightness intact, using a technique such as Dynamic Iris to render the shadows darker creates more gradations between highlights and shadows, and more contrast, and therefore enhances the HDR experience.
Compared to TV sets or monitors, what is more challenging for projectors is that the brightness the audience perceives varies with the size of the screen the image is projected onto. Manufacturers therefore have to go through extra steps when building HDR-capable projectors. For example, when calibrating colors and EOTF curves (the EOTF, or electro-optical transfer function, defines how signal values are converted into brightness on screen), the curve for an HDR TV can be more aggressive, while the curve for a projector should be gentler, so that devices with different ways of generating images can both show the full HDR detail.
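The EOTF most HDR video uses is the PQ curve standardized as SMPTE ST 2084: it maps a normalized code value between 0 and 1 to an absolute brightness in nits, up to 10,000 nits. A minimal sketch, using the constants published in the standard:

```python
# PQ EOTF (SMPTE ST 2084): normalized code value (0..1) -> brightness in nits.
# The constants below are the ones defined in the standard.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    """Map a normalized PQ code value (0..1) to display brightness in nits."""
    p = code ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(1.0))   # full code value: the 10000-nit PQ ceiling
print(pq_eotf(0.5))   # mid code value is only around 92 nits
```

Notice how steep the curve is: half the code range corresponds to roughly 92 nits, not 5,000. This is why a projector's "gentler" calibration matters; how the long bright tail of the PQ curve is handled dominates the result on a low-brightness device.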
Defining how an HDR projector should interpret HDR content requires precise tone mapping: for example, compressing an image with a 150 nit brightness range into a 100 nit range while still maintaining the contrast and texture between highlights and shadows. It takes the experience and expertise of the engineers to find the best performance within the projector's capabilities while preserving all the details and tonal transitions of the image.
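One common family of approaches passes shadows through untouched and compresses only the highlights above a "knee" point. This sketch illustrates that idea for the 150-to-100 nit example above; the knee position and the ease-out curve are illustrative choices, not a published standard.

```python
# Hypothetical knee-based tone mapping: squeeze content mastered up to
# 150 nits into a projector that tops out at 100 nits, leaving shadows
# unchanged and rolling off only the highlights.

def tone_map(nits, knee=70.0, src_peak=150.0, dst_peak=100.0):
    """Map a brightness value (nits) from the source range into the display range.

    Below `knee` the value passes through unchanged, preserving shadow
    detail and contrast; above it, highlights are compressed smoothly.
    (knee, src_peak, dst_peak are illustrative numbers.)
    """
    if nits <= knee:
        return nits
    # Fraction of the way from the knee to the source peak (0..1).
    t = (nits - knee) / (src_peak - knee)
    # Ease-out curve: highlights compress harder near the source peak.
    t = 1.0 - (1.0 - t) ** 2
    return knee + t * (dst_peak - knee)

print(tone_map(40.0))    # shadows pass through unchanged: 40.0
print(tone_map(150.0))   # the source peak lands exactly on 100.0
```

The trade-off engineers tune is exactly where the knee sits: a low knee protects more of the midtones but crushes highlight texture sooner, while a high knee keeps highlights crisp at the cost of shifting midtone brightness.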
A true HDR experience needs a few basic elements: the display, the player, and the content all have to be HDR. Most consumers cannot tell whether they are watching HDR content unless two or more screens show the same material in different standards side by side; sometimes even professionals miss it. With projectors, even more factors affect the image quality the audience perceives: the environment, the distance from the projector to the screen, the viewing angle, and the screen material all matter. The personal preference of the audience is a consideration as well.
The closer the projector is to the screen, the brighter the image; move the projector away and the image gets darker. The gain value of the screen makes a difference too. Gain is an index of how reflective the screen is: a high-gain screen delivers a higher peak brightness to the audience's eyes, while a low-gain screen makes the image darker. But it is not a case of 'the brighter, the better'. A brighter projected image inevitably lights up the whole room, which washes out the details in the shadows.
The size of the screen also matters. Most screens on the market are between 100 and 120 inches, and the gain value usually falls between 1.0 and 2.0, sometimes higher. A high-gain screen typically looks brighter in the middle, while a low-gain screen performs better in the shadows, showing truer black. Considering the typical household environment, most home projectors are best paired with a lower-gain screen, around 1.0, which gives a wider viewing angle and better shadow detail.
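How screen size and gain interact with brightness can be estimated with a standard back-of-the-envelope formula: peak luminance in nits is roughly projector lumens times gain, divided by pi times the screen area in square meters. A sketch using the screen sizes and gains from the text; the 2000-lumen projector is an assumed figure for illustration.

```python
# Rough estimate of perceived peak brightness from projector output,
# screen size, and screen gain: nits ~= lumens * gain / (pi * area_m2).

import math

def screen_luminance(lumens, diagonal_in, gain=1.0, aspect=16/9):
    """Approximate peak luminance in nits for a screen of the given gain."""
    d_m = diagonal_in * 0.0254                  # inches -> meters
    height = d_m / math.sqrt(aspect**2 + 1)     # from the diagonal
    area = (height * aspect) * height           # width x height in m^2
    return lumens * gain / (math.pi * area)

# The same assumed 2000-lumen projector on the common screen sizes:
print(screen_luminance(2000, 100, gain=1.0))  # roughly 230 nits
print(screen_luminance(2000, 120, gain=1.0))  # larger screen, dimmer image
print(screen_luminance(2000, 100, gain=2.0))  # gain 2.0 doubles the estimate
```

The numbers make the article's point concrete: even a reasonably bright projector lands far below the peak brightness HDR content is mastered for, which is why projector-specific tone mapping and EOTF tuning are unavoidable.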
But it still depends on the use case. 3D movies, for example, are usually dimmer, so some consumers choose a higher-gain screen to brighten the image and improve the viewing experience.
So, compared to other devices, HDR projectors come closest to the experience of watching a movie in the cinema. Manufacturers must work to make the colors from a projector match how people naturally see, present the image faithfully to the director's intent, and provide the most comfortable and realistic visual experience.