What Is HDR For Monitors And Is It Worth It?

Should you get an HDR monitor or is it not worth it? This guide will help you make the best possible buying decision.

The term ‘HDR’ has long been a buzzword for high-end TVs, but the standard is now becoming increasingly common among new monitors as well.

What Does HDR Do?

[Image: SDR vs HDR comparison]

Having a high-resolution PC monitor with a high-quality panel boasting an excellent contrast ratio, high brightness, and a wide color gamut doesn’t guarantee that all of your games and other software will be able to take full advantage of it.

This is where HDR comes in: it uses metadata to ensure the correct reproduction of colors and brightness, among other things. HDR monitors and TVs recognize the HDR signal and display the image the way the content creator intended.
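
To give a rough idea of what this metadata contains, here’s a minimal sketch of the static metadata an HDR10 stream carries (the mastering display information defined by SMPTE ST 2086, plus the MaxCLL/MaxFALL content light levels). The field names are illustrative, not any real API:

```python
from dataclasses import dataclass

# Illustrative sketch of HDR10 static metadata (SMPTE ST 2086 + content light levels).
# Field names are hypothetical; actual decoders define their own structures.
@dataclass
class HDR10StaticMetadata:
    red_primary: tuple[float, float]    # chromaticity, e.g. (0.708, 0.292) for BT.2020 red
    green_primary: tuple[float, float]  # e.g. (0.170, 0.797)
    blue_primary: tuple[float, float]   # e.g. (0.131, 0.046)
    white_point: tuple[float, float]    # e.g. (0.3127, 0.3290) for D65
    max_mastering_luminance: float      # nits the content was mastered at, e.g. 1000.0
    min_mastering_luminance: float      # e.g. 0.005 nits
    max_cll: int   # MaxCLL: brightest single pixel in the entire content, in nits
    max_fall: int  # MaxFALL: brightest frame-average light level, in nits
```

The display’s tone mapping uses these values to decide how to fit the content’s brightness range into its own capabilities.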


HDR Formats: HDR10 vs Dolby Vision

There are many different HDR formats, so not every display labeled as ‘HDR’ will deliver the same viewing experience.

Dolby Vision is a more expensive and demanding form of HDR. It requires the display to be capable of at least 4,000-nit peak brightness and 12-bit color depth.

Additionally, Dolby Vision requires a license fee, whereas HDR10 does not – which is one of the reasons why PC and console content creators, as well as display manufacturers, mostly opt for the free and open HDR10 standard.

Unlike HDR10, which uses static metadata, Dolby Vision implements dynamic metadata, which allows for scene-by-scene brightness adjustment and an overall more engaging viewing experience.
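
Here’s a simplified, purely illustrative sketch of why that matters (no real decoder works exactly like this): with one static peak value for the whole movie, even dim scenes get compressed to accommodate the brightest scene, while per-scene metadata leaves scenes that already fit the display untouched:

```python
# Simplified, hypothetical illustration of static vs. dynamic metadata.
# A 600-nit display has to compress content mastered brighter than its peak.

DISPLAY_PEAK = 600.0  # nits

def tone_map(pixel_nits: float, scene_peak: float) -> float:
    """Naive linear compression of a pixel's luminance to fit the display."""
    if scene_peak <= DISPLAY_PEAK:
        return pixel_nits  # scene already fits the display; leave it alone
    return pixel_nits * DISPLAY_PEAK / scene_peak

scenes = [("dim interior", 350.0), ("sunlit exterior", 2000.0)]

# Static metadata (HDR10): one peak value for the entire movie, so even
# the dim scene gets compressed to make room for the brightest scene.
movie_peak = max(peak for _, peak in scenes)
for name, _ in scenes:
    print(f"{name} (static):  {tone_map(300.0, movie_peak):.0f} nits")  # 90 nits

# Dynamic metadata (Dolby Vision, HDR10+): per-scene peaks, so the dim
# scene is shown as mastered and only the bright scene is compressed.
for name, peak in scenes:
    print(f"{name} (dynamic): {tone_map(300.0, peak):.0f} nits")  # 300, then 90 nits
```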

Samsung addresses this trade-off with its HDR10+ format, which is both dynamic and royalty-free and is slowly spreading across content and displays. Samsung is also working on the HDR10+ Gaming standard, which we should start seeing on new displays and in video games in 2022.

Other HDR formats include Advanced HDR by Technicolor and HLG (Hybrid Log-Gamma) by the BBC and NHK. However, here we’ll focus on the open HDR10 standard.


DisplayHDR Standards by VESA

In December 2017, VESA (Video Electronics Standards Association) defined new HDR standards. The DisplayHDR certifications are divided into five tiers (DisplayHDR) for LED-backlit monitors plus three tiers (DisplayHDR True Black) for OLED displays, depending on the level of quality.

This way, you will know a bit more about the monitor’s HDR capabilities instead of relying on the ‘HDR-capable/compatible’ and similar labels used by certain monitor manufacturers.

Moreover, you can download the official DisplayHDR Test tool and verify the specified color gamut, peak brightness, and contrast yourself.

Note that monitors that don’t have any of the below-mentioned certifications but are labeled as ‘HDR-ready’ can merely accept the HDR10 signal and essentially ‘emulate’ HDR picture quality via software manipulation. This is referred to as pseudo or ‘fake’ HDR, and it can safely be ignored most of the time.

As you can see in the image below, the entry-level DisplayHDR 400 certification is the only one that doesn’t require the display to have a wide color gamut or local dimming, which is why it doesn’t mean much.

In comparison to a regular non-HDR monitor, an HDR400-certified monitor only has higher peak brightness and the ability to accept the HDR signal. So, the HDR picture won’t have improved colors or contrast, just higher peak luminance, which in most cases results in a washed-out image.

Some HDR400 monitors do have a wider color gamut, so they will offer at least slightly better HDR image quality. In short, seeing that a monitor has DisplayHDR 400 certification isn’t enough; you will have to look at its color gamut as well.

We have a list of all HDR monitors where we’ve divided the displays into several groups. You can filter them by peak brightness, local dimming, etc.

[Image: DisplayHDR certification tiers]

Local Dimming

For notable HDR picture quality, an LED-backlit monitor needs some form of local dimming, which is required for the DisplayHDR 500 certification and above.

A standard LED monitor uses global dimming (no local dimming), meaning that when the picture needs to be darker, the entire screen will get dimmed.

Monitors with local dimming can dim just the parts of the screen that need to be dark without affecting the bright parts, effectively increasing the display’s contrast ratio.
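
Here’s a back-of-the-envelope sketch of that effect, using hypothetical numbers and an idealized backlight (no blooming between zones):

```python
# Back-of-the-envelope illustration of why local dimming raises effective contrast.
# Numbers are hypothetical and assume an idealized backlight with no blooming.

NATIVE_CONTRAST = 1000  # typical IPS panel: white is 1000x brighter than black per zone
PEAK_WHITE = 600.0      # nits with the backlight at 100%

def black_level(backlight_fraction: float) -> float:
    """A zone's black level scales with its backlight intensity."""
    return (PEAK_WHITE * backlight_fraction) / NATIVE_CONTRAST

# Global dimming: the backlight stays at 100% to keep the bright parts bright.
global_black = black_level(1.0)   # 0.6 nits

# Local dimming: zones showing dark content drop their backlight to, say, 5%.
local_black = black_level(0.05)   # 0.03 nits

print(f"Global dimming contrast: {PEAK_WHITE / global_black:.0f}:1")  # 1000:1
print(f"Local dimming contrast:  {PEAK_WHITE / local_black:.0f}:1")   # 20000:1
```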

There are two types of local dimming displays: edge-lit and direct-lit with full-array local dimming (FALD).

FALD displays have numerous individually controllable dimming zones spread across the entire backlight, which can significantly improve the picture quality.

However, a FALD implementation is quite expensive, and it’s only available in high-end monitors, such as the ASUS PG27UQ and the Acer Predator X35.

Although full-array local dimming isn’t required on paper for the ‘true’ HDR viewing experience (according to the UHD Alliance and VESA), it is crucial for the HDR picture quality to really stand out.

Edge-lit displays with local dimming have fewer dimming zones, placed along either the left/right or top/bottom edges of the screen, but they can still provide a better contrast ratio than standard displays with global dimming. They are also a lot cheaper to make than FALD displays.

OLED monitors and TVs have self-emissive pixels, so each pixel can individually turn off for true blacks and an infinite contrast ratio, resulting in an incredible HDR viewing experience, especially in a dark room. However, OLEDs can’t get quite as bright as high-end LED displays, and they carry a risk of image burn-in.

HDR Gaming

[Image: SDR vs HDR comparison]

There are plenty of HDR-compatible titles on both consoles and PC.

However, when it comes to HDR PC gaming, there are still many difficulties, as most of the software isn’t quite HDR-ready.

For instance, Windows 10 forces HDR on everything once it’s enabled, making non-HDR content unpleasant to look at, to say the least. So, you’d need to manually enable and disable HDR every time, depending on what you’re watching (the Win + Alt + B shortcut added by the Xbox Game Bar makes this toggle quicker).

FreeSync Premium Pro & G-SYNC Ultimate

When buying a new gaming monitor, most people will opt for a display with a variable refresh rate (VRR) technology, which is branded as ‘FreeSync’ and ‘FreeSync Premium’ by AMD or ‘G-SYNC Compatible’ and ‘G-SYNC’ by NVIDIA.

Not all FreeSync/G-SYNC monitors with HDR can simultaneously run both VRR and HDR, though.

For the best HDR gaming experience, look for gaming monitors with AMD FreeSync Premium Pro or NVIDIA G-SYNC Ultimate, which allow HDR and VRR to run at the same time without adding any perceptible input lag.

There are exceptions, though, such as the Aorus AD27QD and the Philips 436M6VBPAB, which have the regular first-gen FreeSync technology but can run VRR and HDR at the same time.

In compatible games, FreeSync Premium Pro also ensures an optimal color gamut and proper tone mapping between the game and the display.

Conclusion

As you can see, there are plenty of things that have an impact on whether an HDR monitor is worth it.

Some HDR monitors have awful HDR picture quality but are still worth the money because their other specifications are good.

In contrast, other HDR monitors might offer brilliant HDR picture quality but suffer from other panel-related flaws.

So, always check for monitor reviews of the displays you’re interested in to get the information you need.
