Many of you may wonder why the standard gamma curve is defined as gamma 2.2. The main reason a power-law relationship exists between the output luminance and the input voltage or digital value is that the human visual system does not respond to light in a linear way. Let’s take a look at Figure 2. In the bottom row, linear intensity increases from black to white in a linear fashion. In the top row, visual encoding increases the intensity from black to white in a power-law fashion. Note that between 0.0 and 0.1 there is a big visual gap in the linear-intensity row, whereas the gap between 0.0 and 0.1 is much smaller in the visual-encoding row. Likewise, from 0.9 to 1.0 the difference in linear intensity is barely perceivable, whereas it is clearly perceivable in visual encoding. Looking across the visual-encoding row, the perceived differences between the gray patches are almost identical. This phenomenon was also found in Ebner and Fairchild’s study in 1998, where they found that using an exponent of 0.43 to convert linear intensity into lightness for neutrals provides an optimal perceptual encoding of grays. The reciprocal of 0.43 is approximately 2.33, which is quite close to gamma 2.2. Hence, a gamma value of 2.2 has become the gold standard for proper digital display calibration.
Figure 2: Visual Differences in Visual Encoding and Linear Intensity
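The relationship above can be sketched in a few lines of Python. This is a minimal illustration of the Ebner and Fairchild exponent, not production color-science code: it applies the 0.43 power to linear intensity and compares the perceptual size of an equal linear step near black versus near white.

```python
# Sketch: why equal steps in linear light are not equal steps perceptually.
# Ebner & Fairchild (1998) found lightness ~ linear_intensity ** 0.43
# for neutral grays.

def encode(linear, exponent=0.43):
    """Map a linear intensity in [0, 1] to an approximately perceptual scale."""
    return linear ** exponent

# The same 0.1-wide linear step covers a large perceptual distance near
# black, but a small one near white.
dark_step = encode(0.1) - encode(0.0)
bright_step = encode(1.0) - encode(0.9)

print(f"perceptual size of 0.0->0.1 step: {dark_step:.3f}")
print(f"perceptual size of 0.9->1.0 step: {bright_step:.3f}")

# The reciprocal of the encoding exponent is close to the standard
# display gamma of 2.2.
print(f"1 / 0.43 = {1 / 0.43:.2f}")
```

Running this shows the near-black step is several times larger perceptually than the near-white step, which is exactly the spacing difference visible between the two rows of Figure 2.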
So how does the gamma curve impact overall image quality or perception? Gamma 2.2 delivers a balanced or ‘neutral’ tone between highlights and shadows, and you can easily distinguish the grays in between. Gamma 1.0 is an interesting curve to look at: it is a 45-degree straight-line relationship between the input signal and the output luminance. This is also a ‘bypass’ scenario where no processing is done on the display end, so you will see a very bright and ‘flat’ image with almost no contrast, as shown in Figure 3.
Figure 3: Images of Gamma 1.0 (on the left) comparing to Gamma 2.2 (on the right)
Gamma 1.8 was very popular due to Mac OS. A gamma 1.8 curve produces slightly brighter images than a gamma 2.2 curve, so it is sometimes preferred. However, since Mac OS X 10.6, gamma 2.2 has become the standard gamma curve for Mac OS as well. An example of gamma 1.8 versus gamma 2.2 is illustrated in Figure 4.
Figure 4: Images of Gamma 1.8 (on the left) comparing to Gamma 2.2 (on the right)
Gamma 2.4 is widely used in the movie and TV industries due to the Rec. 709 standard. The slightly enhanced contrast brings out the saturation of the colors and stimulates viewers’ perception and preference. However, the overall brightness of the images may be lowered. An example of gamma 2.4 versus gamma 2.2 is illustrated in Figure 5.
Figure 5: Images of Gamma 2.4 (on the left) comparing to Gamma 2.2 (on the right)
Gamma 2.6 has started to gain popularity because of the DCI-P3 standard, which has been adopted by many newer digital cinema theaters. With a gamma 2.6 curve, images look noticeably darker but very saturated. This is often exactly the effect the director intends, hence DCI-P3 specifies a gamma 2.6 curve. An example of gamma 2.6 versus gamma 2.2 is illustrated in Figure 6.
Figure 6: Images of Gamma 2.6 (on the left) comparing to Gamma 2.2 (on the right)
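The comparisons above can be summarized numerically. The short sketch below uses a simple power-law display model (output luminance = signal ** gamma, with both normalized to [0, 1]) to show how the same mid-gray input is rendered under each of the gamma values discussed; it is an idealized model, not a measurement of any real display.

```python
# Sketch: output luminance for the same mid-gray input under different
# display gammas, with signal and luminance both normalized to [0, 1].

def display_output(signal, gamma):
    """Simple power-law display model: luminance = signal ** gamma."""
    return signal ** gamma

signal = 0.5  # mid-gray input value
for gamma in (1.0, 1.8, 2.2, 2.4, 2.6):
    print(f"gamma {gamma}: output luminance {display_output(signal, gamma):.3f}")

# Gamma 1.0 leaves mid-gray at 0.5 (the bright, flat 'bypass' case),
# while each higher gamma renders the same input progressively darker.
```

The printed values fall steadily as gamma rises, which is why gamma 1.8 images look slightly brighter than gamma 2.2, and gamma 2.4 and 2.6 images look darker and more contrasty.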
So what happens if a gamma curve is not smooth? If a gamma curve is not smooth, the transition from black to white does not increase in strict increments, as shown in Figure 7. The visual differences in the grayscale will then no longer be evenly perceivable, and coloration (non-neutral grays) can appear. When this phenomenon applies to a grayscale image, the image loses detail and ends up looking different from the way it is supposed to.
Figure 7: A non-smooth gamma curve can result in artifacts in the perceptual grayscale and RGB ramps (on the left). No artifacts are present in the perceptual grayscale and RGB ramps with a smooth gamma curve (on the right).
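One simple way to reason about this is to treat the gamma curve as a lookup table (LUT) and check that it increases strictly from black to white. The sketch below is an illustrative check, not how a monitor validates its own LUT: it builds a clean 256-entry gamma 2.2 table, then corrupts one entry to mimic the kind of dip that produces the artifacts in Figure 7.

```python
# Sketch: a rough smoothness check for a gamma lookup table (LUT).
# A well-behaved gamma LUT should increase strictly from black to white;
# a dip or flat spot shows up as banding or lost grayscale detail.

def lut_is_monotonic(lut):
    """Return True if every step from black to white strictly increases."""
    return all(b > a for a, b in zip(lut, lut[1:]))

# A clean gamma 2.2 LUT with 256 entries...
good_lut = [(i / 255) ** 2.2 for i in range(256)]

# ...and a copy with one corrupted entry as a sketched defect:
# the value at index 128 dips back down to the value at index 120.
bad_lut = list(good_lut)
bad_lut[128] = bad_lut[120]

print(lut_is_monotonic(good_lut))  # True
print(lut_is_monotonic(bad_lut))   # False
```

On a real display, the same defect appearing in only one of the R, G, or B channel curves is what produces the coloration (non-neutral grays) mentioned above.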
Figure 8: Select Preferred Gamma on Monitor OSD
Figure 9: Select Preferred Gamma with BenQ Palette Master Element Software
Now that we have some basic idea of how the gamma curve impacts image quality and how important a good gamma curve is, how do we adjust it? There are usually two ways. The first is straightforward: look for the OSD on your monitor. On a professional color-management monitor, such as a BenQ SW series or PD series monitor, there is a ‘Custom Mode’ in the ‘Color Mode’ option. In ‘Custom Mode’ you can select different settings, such as color gamuts or gamma curves. Under the ‘Gamma’ option, you can select factory-calibrated gamma curves ranging from 1.6 to 2.6 in 0.2 intervals.
Another way of adjusting the gamma curve on an SW series monitor is to use BenQ’s proprietary calibration software, Palette Master Element. With an external calibrator connected, such as an X-Rite i1Display Pro, i1 Pro 2, or a Datacolor Spyder5, you can simply select the gamma curve you would like in Advanced mode. Follow the instructions on the screen, and after completing the calibration your gamma curve is ready to use in Calibration 1, 2, or 3. It is as simple as that, and it is the preferred method for adjusting your gamma curve if you have one of our SW series monitors.
We talked about the definition of the gamma curve in the earlier paragraphs of this article, and its important link to the human visual response. We also looked at how the gamma curve can affect image quality and perception, along with some typical gamma values and their applications. Lastly, we learned the importance of maintaining a smooth gamma curve and how to select different gamma curves on BenQ SW/PD monitors. With Palette Master Element software and an external calibrator, you can also calibrate the gamma curve on your SW monitor.