What is HDR and how does it improve the gaming or video experience?

Video games and movies are two of the most popular forms of entertainment, and High Dynamic Range (HDR) is a technology that can drastically improve the gaming and viewing experience. HDR lets an image show a greater range of colors and brightness, preserve detail, and look closer to the real world. As a result, the image appears brighter, sharper, and richer than traditional digital images.

The human eye is the basis of the sense of sight: it works by transforming the light energy it receives into electrical signals that it sends to the brain. Its ability to adapt to almost any lighting conditions while preserving color and contrast is extraordinary. No man-made machine has ever matched it, but technologies such as HDR attempt to come close to what human vision achieves.

Before we start, it must be said that marketing departments have managed to slap this label on just about everything, as they usually do with any concept that resonates with consumers. Today you will see HDR "supported" in all kinds of products, both devices that capture it (cameras, camcorders, smartphones) and devices that display the content (screens such as televisions and computer monitors).

All that glitters is not gold, and implementations vary greatly, from products that promise HDR but offer nothing beyond SDR (standard dynamic range) to those that genuinely improve the experience in popular tasks such as gaming and video. Here is what you need to know to tell them apart.

What is HDR

HDR, which translates as high dynamic range, is a technology that aims to recreate digital images (still or moving) as close as possible to what the human eye would see in the real world. Although it may seem novel, attempts at high dynamic range photography date back to 1850, using multiple exposures, the only technique the pioneers of the era could use given their enormous technical limitations.

As early as the 1930s and 40s, improvements were made on the same initial idea: several layers of exposed film combined into a single final image that reflected the changes in light perceptible to the human eye. The era of computer graphics that began in the 1980s was a turning point, and the processing power of today's computers has allowed the technique's full potential to unfold.

Medical research into the workings of the human eye, new techniques such as tone mapping, wider color gamuts, higher peak brightness, darker blacks for shadows, and more balanced contrast ratios have all made great strides in this area, making digital images appear realistic and accurate, approaching what the human eye would see in the real world.

HDR in monitors and televisions

In terms of display screens, there are several HDR video formats you should be aware of, with three main ones: HDR10, HDR10+, and Dolby Vision. These media profiles apply both to how the content is recorded or rendered and to the ability to display it on supporting screens, which are the focus of this article.

Although they all aim for the same thing, displaying more realistic images, they have different requirements, specifications, and properties, and their results differ considerably. They also differ in licensing costs, transfer function, metadata, and compatibility, but as a baseline they all require at least a 10-bit panel and tone mapping. Here are the essential criteria for the different profiles, which ultimately define whether or not a display is HDR compliant:

Bit depth

In general, computer monitors, laptop screens, televisions, and the rest, including those of smartphones, use 8-bit color, which allows them to display 16.7 million colors. HDR displays increase the depth to 10 or 12 bits, allowing them to display 1.07 billion and 68.7 billion colors, respectively. For a display to qualify as HDR10 or HDR10+ it must support a color depth of 10 bits, while Dolby Vision supports up to 12 bits. We are not aware of any consumer displays with 12-bit panels, so 10 bits is the practical reference today.
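
If you are curious where those figures come from, a quick back-of-the-envelope calculation reproduces them: each of the three RGB channels can take 2^bits distinct values.

```python
# Number of displayable colors as a function of per-channel bit depth.
def color_count(bits_per_channel: int) -> int:
    # Three channels (R, G, B), each with 2**bits possible values.
    return (2 ** bits_per_channel) ** 3

for bits in (8, 10, 12):
    print(f"{bits}-bit panel: {color_count(bits):,} colors")

# 8-bit panel: 16,777,216 colors       (~16.7 million, typical SDR)
# 10-bit panel: 1,073,741,824 colors   (~1.07 billion, HDR10/HDR10+)
# 12-bit panel: 68,719,476,736 colors  (~68.7 billion, Dolby Vision)
```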

Maximum brightness

This is the peak luminance that an HDR display must be able to reach. To display high dynamic range images, a screen needs higher brightness levels than a normal SDR (standard dynamic range) display. Maximum brightness is measured in cd/m² (nits), and for HDR it must be at least 400 cd/m². Anything below that simply cannot deliver high dynamic range images.

Maximum black brightness

To get close to reality, in addition to high peak luminance for bright image areas, displays must also be able to render dark areas with very deep blacks. That is where this parameter comes into play. Typical values are below 0.4 cd/m², and the HDR formats themselves impose no requirement here. However, the DisplayHDR standard proposed by VESA, which we discuss later, does set specific values and considers any display that can render blacks below 0.0005 cd/m² to be True Black.

Tone mapping

Content created with HDR, such as movies or games, has much higher brightness values than a consumer display can show. Some sequences in a movie will easily exceed 1,000 cd/m², so how is that handled on a less capable display? This is where another technique in current use comes in: tone mapping, which uses algorithms to reduce the brightness of the content to match the maximum the screen can display. Values such as contrast suffer in this scenario, although such displays still show more and better detail than SDR.
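
The exact curve varies by format, device, and manufacturer, but as an illustration, here is a minimal sketch of a classic global operator (extended Reinhard), assuming hypothetical content mastered at 1,000 cd/m² shown on a 400 cd/m² display; real HDR pipelines use more elaborate, often per-scene curves.

```python
# Minimal global tone-mapping sketch (extended Reinhard operator).
# This only illustrates the idea of compressing out-of-range highlights.

def reinhard_tonemap(lum_in: float, content_peak: float = 1000.0,
                     display_peak: float = 400.0) -> float:
    """Map a scene luminance (cd/m²) into what the display can show."""
    l = lum_in / display_peak                 # normalize to display range
    l_white = content_peak / display_peak     # value that maps to full white
    # Extended Reinhard: nearly linear in the shadows, rolls off highlights
    # so that content_peak lands exactly on display_peak.
    l_out = l * (1.0 + l / l_white ** 2) / (1.0 + l)
    return min(l_out, 1.0) * display_peak

for lum in (10, 100, 500, 1000):
    print(f"{lum:>4} cd/m² in content -> {reinhard_tonemap(lum):6.1f} cd/m² on screen")
```

Note how the shadows are almost untouched (10 becomes about 9.8 cd/m²) while the highlights are compressed hard (1,000 lands exactly on the 400 cd/m² peak).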

Metadata

Content created with HDR stores information in metadata, which playback devices use to decode the content correctly. The problem is that not all formats use the same type of metadata. HDR10 uses static metadata, meaning that the settings applied to how the content is displayed are the same from start to finish. HDR10+ and Dolby Vision use dynamic metadata, meaning that the displayed images can be adjusted on the fly, using different brightness ranges in different scenes, or even for each frame of a video.
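
To make the difference concrete, here is an illustrative sketch (the structures and function names are hypothetical, not any real decoder's API) of how a display might pick its tone-mapping target with static versus dynamic metadata.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StaticMetadata:
    """Roughly HDR10-style: one block of values for the entire title."""
    max_cll: int    # brightest pixel anywhere in the stream (cd/m²)
    max_fall: int   # brightest frame-average light level (cd/m²)

@dataclass
class SceneMetadata:
    """Roughly HDR10+/Dolby Vision-style: values that change per scene."""
    scene_peak: int  # peak luminance of the current scene (cd/m²)

def tonemap_target(static: StaticMetadata,
                   scene: Optional[SceneMetadata]) -> int:
    # With dynamic metadata the display can tone-map each scene tightly;
    # with only static metadata it must assume the whole-title worst case.
    return scene.scene_peak if scene else static.max_cll

static = StaticMetadata(max_cll=4000, max_fall=1000)
print(tonemap_target(static, None))                           # 4000: one coarse curve
print(tonemap_target(static, SceneMetadata(scene_peak=200)))  # 200: per-scene curve
```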

HLG

HLG stands for Hybrid Log-Gamma, an HDR standard that allows content distributors such as television broadcasters to offer content in both SDR and HDR using a single broadcast signal. When that broadcast arrives at the TV or monitor, the content is displayed in one mode or the other, depending on the capabilities of the display.
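
For the curious, the shape of the HLG transfer curve (the OETF defined in ITU-R BT.2100) is what makes this dual compatibility possible: the lower half is a square root, close to a conventional SDR gamma, while the upper half is logarithmic to encode the HDR highlights. A small sketch:

```python
import math

# HLG opto-electrical transfer function (OETF) from ITU-R BT.2100.
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e in [0, 1] to the HLG signal value."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)     # SDR-friendly square-root segment
    return A * math.log(12.0 * e - B) + C  # logarithmic HDR segment

for e in (0.0, 1 / 12, 0.25, 0.5, 1.0):
    print(f"scene light {e:.3f} -> signal {hlg_oetf(e):.3f}")
```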

DisplayHDR, the VESA standard

DisplayHDR is the standard proposed by VESA, the international association of display electronics manufacturers, to handle HDR on display screens. It is related to the formats we saw previously, but it is a different thing: what VESA does is certify a minimum performance level that guarantees quality in the delivery of high dynamic range content and differentiates certified displays from typical SDR (standard dynamic range) ones.

The association includes all major manufacturers of displays, PCs, and related components such as graphics chips, as well as operating system vendors such as Microsoft, so it is a good reference when purchasing devices and protects users from the misleading advertising that is never lacking in technology marketing. The association has defined the DisplayHDR certifications by levels, the main ones of which are described below.

DisplayHDR 400

This is the absolute minimum to obtain dynamic range certification and matches the HDR10 format in requiring a brightness of 400 nits; anything below that luminance level is not HDR. As for color gamut, it calls for the standard red-green-blue (sRGB) space, with a maximum black level luminance of 0.4 cd/m². This is the basic level in monitors and the most commonly used in notebook displays.

DisplayHDR 600

The next level requires a minimum of 600 nits of brightness. In addition, it lowers the maximum black level luminance to 0.1 cd/m², which is especially noticeable in contrast. It adds a wider color gamut (WCG), extending the color spectrum, and uses local dimming zones that divide the monitor's backlight and illuminate each zone differently depending on the image displayed on the screen.

DisplayHDR 1000

For those looking for the most realistic and spectacular images, this level raises brightness to 1,000 nits and lowers the black level luminance to just 0.05 cd/m². Like the level above, it includes local dimming, but with many more zones of variable brightness. These are the most important and recognizable levels of the standard, but there are intermediate levels such as DisplayHDR 500 and higher ones such as DisplayHDR 1400, with more brightness and lower black level luminance.

DisplayHDR “True Black”

These are special versions of the standard that allow up to 100 times deeper black levels and a correspondingly greater dynamic range. The most advanced is DisplayHDR True Black 600, which requires a minimum of 600 nits of brightness while lowering the black level luminance to just 0.0005 cd/m², which is especially noticeable in contrast. It also includes a wider color gamut (WCG) and local dimming zones that divide the monitor's backlight and illuminate each zone differently depending on the image displayed on the screen.
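
Using only the figures quoted above, a quick calculation shows how strongly the black level drives the resulting contrast ratio (peak luminance divided by black level):

```python
# Contrast ratio = peak luminance / black level, using the figures quoted
# in the tier descriptions above (both in cd/m²).
tiers = {
    "DisplayHDR 400":            (400,  0.4),
    "DisplayHDR 600":            (600,  0.1),
    "DisplayHDR 1000":           (1000, 0.05),
    "DisplayHDR True Black 600": (600,  0.0005),
}

for name, (peak, black) in tiers.items():
    print(f"{name}: {peak}/{black} -> {peak / black:,.0f}:1 contrast")

# DisplayHDR 400: 1,000:1, DisplayHDR 600: 6,000:1,
# DisplayHDR 1000: 20,000:1, True Black 600: 1,200,000:1
```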

HDR in games or movies

Beyond standards and technical features, the most interesting part for users is the ability to use HDR in games and video, or in professional tasks such as photo editing and illustration.

In games, high dynamic range rendering has been used in development for two decades; one of the first showcases was the Lost Coast expansion for Half-Life 2. Today, virtually any current 3D game makes use of this feature during creation, although HDR output is not always available when playing. There are not as many HDR-enabled games as we would like, but there are more and more. Most provide only an on/off switch, while the more advanced ones offer additional settings to adjust the processing to the capabilities of the screen.

As an approximation (not true HDR, which only the game developer and, obviously, the display manufacturer can provide), Microsoft released an Auto HDR feature in Windows. Roughly speaking, it is the same technology found in the Xbox Series consoles: it improves the quality of games that were not developed with HDR, provided they are based on DirectX 11 or DirectX 12.

Potentially more than 1,000 games will be supported, but we repeat, it is not real HDR. From what we have tested so far, Auto HDR results are mixed: some games do not change at all, while others are transformed quite a bit. The outcome depends on how many high-precision shaders and render targets are used before any tone mapping is applied. The ideal is true HDR, which is getting closer and closer thanks to the quality of today's monitors and the power of dedicated graphics cards.

Regarding video, in addition to a monitor or TV with HDR support, the content must have been encoded for this technology, and you will need a playback device that supports one of its standards, such as a modern Blu-ray player (the main exponent) or a streaming dongle from Amazon, Apple, or Google.

Among streaming services, Disney+, Netflix, and Prime Video offer content encoded in one of the HDR formats, but you will have to activate it in their menus (some tiers require an extra fee) and have a good Internet connection. To watch it on a PC, the graphics chip and monitor must support HDCP 2.2 and HEVC codecs must be installed. On PCs, Netflix only supports 4K HDR streaming in Edge and its dedicated Windows app, while Disney+ and Prime Video work in most browsers.

How to enable HDR in Windows

Windows 10 and Windows 11 support high dynamic range content, and in the most recent versions of the system you will see it labeled Windows HD Color. The objective is, as discussed, to support enhanced brightness and color compared with traditional standard dynamic range (SDR) content.

It must be said that Windows really only supports up to the HDR10 format (10-bit panels) and still has a long way to go, but Microsoft is working on it, as Auto HDR shows. The operating system can display high dynamic range content in photography, video, and games as long as the hardware (graphics card) and monitor allow it. Enabling it is very simple: open Settings > System > Display, select the HDR-capable screen, and turn on the HDR toggle (labeled Use HDR, or found under Windows HD Color settings, depending on the version). If your hardware supports it, experiment with it.

Hopefully you will find this overview useful for a technology that is gaining ground in our electronic devices and that you will see massively promoted (not always as real as it promises). The human eye is unattainable, but HDR is the closest thing to the real world we can get in digital imaging.
