The term ‘HDR’ has been commonplace in the world of high-end TVs for a while, but now the standard is becoming increasingly popular among new monitors as well.
So, should you care or is it just another passing fad? Well, the answer lies somewhere in between, at least for now.
What Does HDR Do?
Having a high-resolution PC monitor with first-class panel quality, an excellent contrast ratio, and accurate color reproduction doesn’t mean that all of your games and other software will be able to take full advantage of it.
In fact, apart from professional applications for color-critical work, most regular software cannot properly utilize the display’s extended color gamut unless the monitor emulates the narrower color space the content was made for.
This is where HDR comes in: it uses metadata to ensure the correct reproduction of colors, brightness, and contrast. HDR monitors and TVs recognize the HDR signal and display the image the way the creator of the content intended.
HDR Formats: HDR10 vs Dolby Vision
There are many different formats of HDR, so just getting any display labeled as HDR won’t give you the same viewing experience.
Dolby Vision is a more expensive and demanding form of HDR. It requires the display to be capable of at least 4,000-nit peak brightness and 12-bit color depth. Additionally, Dolby Vision carries a license fee, whereas HDR10 does not – which is one of the reasons why PC and console content creators, as well as display manufacturers, opted for the free and open HDR10 standard.
Unlike HDR10 with its static metadata, Dolby Vision offers dynamic metadata, which allows for scene-by-scene brightness adjustment and an overall more engaging viewing experience. Samsung and Amazon Video aim to address this with their HDR10+ format, which is both dynamic and royalty-free.
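The static/dynamic difference is easiest to see in the metadata itself. HDR10’s static metadata (the SMPTE ST 2086 mastering-display values plus MaxCLL and MaxFALL) is signaled once and applies to the entire stream, while dynamic formats can update brightness information per scene. A simplified sketch (the field names loosely follow the spec; the values are made up purely for illustration):

```python
# HDR10: one static metadata block describes the whole stream.
hdr10_static = {
    "mastering_display": {
        "max_luminance_nits": 1000,   # peak of the mastering display
        "min_luminance_nits": 0.005,  # black level of the mastering display
    },
    "max_cll": 800,    # Maximum Content Light Level (brightest single pixel)
    "max_fall": 400,   # Maximum Frame-Average Light Level
}

# Dynamic-metadata formats (Dolby Vision, HDR10+) can instead carry
# per-scene brightness info, so the display can tone-map a dim interior
# scene and a bright outdoor scene differently, rather than applying
# one worst-case setting to everything.
dynamic_per_scene = [
    {"scene": 1, "max_cll": 200},   # dim interior scene
    {"scene": 2, "max_cll": 950},   # bright outdoor scene
]
```

With static metadata, the monitor has to tone-map the dim scene as if the 800-nit worst case could appear in it, which is why dynamic metadata tends to look better scene by scene.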
Other HDR formats include Advanced HDR by Technicolor and HLG (Hybrid Log-Gamma), developed by the BBC and NHK. However, here we’ll focus on the open HDR10 standard, which both professional and gaming monitor manufacturers, as well as PC and console video games, have chosen to work with, at least for now.
Ultra HD Premium Standard
According to the UHD Alliance, a display needs to meet the following specifications for the best HDR viewing experience:
- At least a 1,000-nit peak brightness and 0.05-nit or less black level – or at least 20,000:1 Contrast Ratio (For LCD)
- At least a 540-nit peak brightness and 0.0005-nit or less black level – or at least 1,080,000:1 Contrast Ratio (For OLED)
- 4K Ultra HD Resolution: 3840×2160
- 10-bit color support covering at least 90% DCI-P3 color space (125% sRGB, 117% Adobe RGB)
- HDMI version 2.0
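The contrast-ratio figures in the list above follow directly from dividing peak brightness by black level, as this quick sketch shows (the helper function name is ours):

```python
def contrast_ratio(peak_nits, black_nits):
    """Contrast ratio is simply peak luminance divided by black level."""
    return peak_nits / black_nits

# LCD tier: 1,000-nit peak with a 0.05-nit black level
print(contrast_ratio(1000, 0.05))    # -> 20000.0, i.e. 20,000:1

# OLED tier: 540-nit peak with a 0.0005-nit black level
print(contrast_ratio(540, 0.0005))   # -> 1080000.0, i.e. 1,080,000:1
```

This is why OLED gets away with a lower peak brightness requirement: its near-zero black level yields a far higher contrast ratio anyway.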
Most displays, whether TVs or monitors, don’t meet all of these requirements and instead offer only limited HDR support. To ensure you’re getting a ‘true’ HDR display, look for the Ultra HD Premium logo (pictured below), which guarantees that the screen has been certified by the UHD Alliance.
DisplayHDR Standards by VESA
In December 2017, VESA (Video Electronics Standards Association) defined new HDR standards. The DisplayHDR certifications are divided into five tiers (DisplayHDR) for LED monitors plus two tiers (DisplayHDR True Black) for OLED displays, depending on the level of quality.
This way, you will know precisely what the HDR spec includes in terms of performance quality instead of relying on the ‘HDR-capable/compatible’ and similar labels by certain monitor manufacturers.
Moreover, you’ll be able to download the exclusive DisplayHDR software and perform tests for the specified color gamut, peak brightness, and contrast yourself.
Note that monitors that don’t have any of the below-mentioned certifications but are labeled as ‘HDR-ready’ can merely accept the HDR10 signal and emulate HDR picture quality via software manipulation. This is referred to as pseudo or ‘fake’ HDR, and it can essentially be ignored.
As you can see in the image below, the entry-level DisplayHDR 400 certification is the only one that doesn’t require the display to have a wide color gamut or local dimming, which is why it doesn’t carry much weight.
In comparison to a regular non-HDR monitor, an HDR400-certified monitor has only a higher peak brightness and the ability to accept the HDR signal. So, the HDR picture won’t have improved colors or contrast, just higher peak luminance, which in most cases results in a washed-out image.
Some HDR400 monitors do have a wider color gamut, so they will offer at least slightly better HDR image quality. In short, seeing that a monitor has DisplayHDR 400 certification isn’t enough; you will have to look at its color gamut as well.
We have a list of all HDR monitors where we’ve divided the displays into three groups: those with true HDR, limited HDR, or just software-emulated HDR.
For notable HDR picture quality, an LED-backlit monitor needs some form of local dimming, which is required for the DisplayHDR 500 certification and up.
A standard LED monitor uses global dimming (no local dimming), meaning that when the picture needs to be darker, the entire screen will get dimmed.
Monitors with localized dimming can just dim the parts of the screen that need to be dark without affecting the bright parts, thus effectively increasing the contrast ratio of the display.
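The global vs. local dimming difference can be sketched in a few lines: with global dimming, a single backlight level has to serve the whole frame, while local dimming picks a level per zone. This is a toy model for illustration only, not how any specific backlight controller actually works:

```python
# Toy model: per-zone pixel luminances (0 = black, 1 = full white).
zone_pixels = [
    [0.02, 0.05, 0.03],  # dark zone (e.g. a night sky)
    [0.90, 1.00, 0.95],  # bright zone (e.g. the moon)
]

# Global dimming: one backlight level for the whole panel, driven by
# the brightest pixel anywhere -- so the dark zone's blacks get
# washed out by a backlight running at full power.
global_level = max(max(zone) for zone in zone_pixels)

# Local dimming: each zone's backlight follows its own brightest
# pixel, so the dark zone's backlight can drop and deepen the blacks
# without dimming the bright zone.
local_levels = [max(zone) for zone in zone_pixels]

print(global_level)   # -> 1.0
print(local_levels)   # -> [0.05, 1.0]
```

The more zones a panel has, the finer this per-zone control gets, which is why FALD displays (below) outperform edge-lit ones.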
There are two types of local dimming displays: edge-lit and direct-lit with full-array local dimming (FALD).
FALD displays have numerous individually controllable dimming zones, which can significantly improve the picture quality. However, FALD implementation is quite expensive, and it’s only available in high-end monitors such as the ASUS PG27UQ and the Acer Predator X35.
Although full-array local dimming isn’t required on paper for the ‘true’ HDR viewing experience (according to UHD Alliance and VESA), it is crucial for the HDR picture quality to really stand out.
Edge-lit displays with local dimming have fewer dimming zones, but they can still provide a better contrast ratio in comparison to the standard displays with global dimming. These displays are also a lot cheaper to make than the FALD ones.
HDR Games
There are already many HDR-compatible titles on both consoles and PC.
Some of the popular PC games that support HDR include Shadow Warrior 2, Deus Ex: Mankind Divided, Resident Evil 7, Paragon, Mass Effect: Andromeda, Obduction, and Hitman (2016).
You can track newly supported titles in this HDR PC games list. While some new games are made with HDR in mind, others will provide HDR support via a patch update.
However, when it comes to HDR PC gaming, there are still many difficulties, as most of the software isn’t quite HDR-ready.
For instance, once HDR is enabled, Windows 10 forces it on everything, making non-HDR content unpleasant to look at, to say the least. So, you’d need to manually enable and disable HDR every time, depending on what you’re watching.
FreeSync 2 HDR & G-SYNC Ultimate
When buying a new gaming monitor, most people will opt for a display with a variable refresh rate (VRR) technology, which is branded as FreeSync by AMD and G-SYNC by NVIDIA.
Not all FreeSync/G-SYNC monitors with HDR can simultaneously run both VRR and HDR, though.
For the best HDR gaming experience, look for gaming monitors with AMD FreeSync 2 or NVIDIA G-SYNC Ultimate, which allow for HDR and VRR to run at the same time without any perceptible input lag added.
As you can see, there are plenty of things that have an impact on whether an HDR monitor is worth it.
Some HDR monitors offer awful HDR picture quality but are worth the money regardless, thanks to their otherwise strong specifications.
In contrast, other HDR monitors might offer brilliant HDR picture quality but suffer from other panel-related flaws.
So, always check for monitor reviews of the displays you’re interested in to get the information you need.