Which type or standard of HDR is best for a monitor?

There is a variety of HDR formats, and this diversity can be seen as both an advantage and a disadvantage: while some people appreciate the ability to fine-tune their experience, others find the choice overwhelming.

Today, I aim to explore the most popular HDR formats, examine their differences, and determine which HDR is best for a monitor.

Which HDR standard is better for a screen

Basic HDR Standards

When discussing the primary HDR standards, four types typically come to the forefront: HDR10, HDR10+, HLG, and Dolby Vision. Most brands focus on these standards, so our first step is to examine how HDR varies between them; this will help us work out which standard is superior.

1. HDR10

The HDR10 standard, introduced in 2015, continues to be widely used today, largely because it is royalty-free and requires no special licensing. Video content mastered for HDR10 must comply with the UHDTV recommendation ITU-R BT.2020, and because the standard is based on the PQ EOTF transfer function, it is not compatible with SDR monitors. The “10” in HDR10 refers to the 10 bits per color channel used for encoding, which offers a far broader range of shades than the 8 bits per channel used in SDR and thus produces richer, more nuanced visuals.
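To make that difference concrete, here is a quick back-of-the-envelope calculation (a tiny Python sketch, nothing HDR-specific): it simply counts how many shades per channel and how many total colors 8-bit and 10-bit encoding can represent.

```python
# Levels per color channel and total representable colors for 8-bit vs 10-bit.

for bits, label in [(8, "SDR (8-bit)"), (10, "HDR10 (10-bit)")]:
    levels = 2 ** bits        # shades per channel
    colors = levels ** 3      # all R x G x B combinations
    print(f"{label}: {levels} levels per channel, {colors:,} colors in total")

# SDR (8-bit): 256 levels per channel, 16,777,216 colors in total
# HDR10 (10-bit): 1024 levels per channel, 1,073,741,824 colors in total
```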

HDR10 operates with a single layer of video content, relying on static metadata—Mastering Display Color Volume and Content Light Level Information.

The first describes the mastering display: its color primaries, white point, and maximum and minimum luminance. The second conveys the peak brightness of the content (MaxCLL) and its maximum frame-average brightness (MaxFALL). Because this metadata is static, set once for the entire program rather than scene by scene, it is up to content creators to grade the material carefully; otherwise, very bright or very dark scenes may look distorted on displays that cannot reproduce the full mastered range.
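To give a feel for what this static metadata actually carries, here is a rough sketch. The structure loosely follows the mastering display description (SMPTE ST 2086) and the content light level values (MaxCLL/MaxFALL); the numbers themselves are made up purely for illustration.

```python
# Illustrative only: roughly what the two static metadata blocks describe.
# The values are invented for the example; real masters carry their own numbers.

hdr10_static_metadata = {
    "mastering_display_color_volume": {      # SMPTE ST 2086
        "red_primary":   (0.708, 0.292),     # x, y chromaticities (BT.2020 red)
        "green_primary": (0.170, 0.797),
        "blue_primary":  (0.131, 0.046),
        "white_point":   (0.3127, 0.3290),   # D65 white
        "max_luminance_nits": 1000.0,        # peak of the mastering display
        "min_luminance_nits": 0.005,
    },
    "content_light_level": {
        "max_cll_nits":  1000,   # brightest single pixel anywhere in the program
        "max_fall_nits": 400,    # highest frame-average brightness
    },
}
```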

Many streaming platforms, including Netflix, Amazon Prime Video, and YouTube, default to HDR10 for their HDR content, even though they support various HDR standards. HDR10 was later followed by HDR10+, which improves the viewing experience while adhering to the same content recommendations; like its predecessor, it does not support SDR displays.

The key difference with HDR10+ lies in its use of dynamic metadata, which allows brightness and tone to be adjusted scene by scene rather than once for the whole program. This noticeably improves the picture, preserving shade detail and keeping brightness appropriate in both light and dark scenes. HDR10+ also leaves room to fine-tune these adjustments, further extending the advantages of the format.
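A deliberately simplified sketch may help show why that matters. This is not the real HDR10+ algorithm, just a naive linear scaler with invented numbers, but it captures the idea: with static metadata a dark scene is squeezed against the brightest value in the whole film, while dynamic metadata lets each scene be mapped against its own peak.

```python
# Conceptual illustration only, not the actual HDR10+ tone mapping.

DISPLAY_PEAK = 600.0  # nits this hypothetical display can actually reach

def simple_tone_map(pixel_nits: float, scene_peak_nits: float) -> float:
    """Naive linear tone map: squeeze the scene's range into the display's range."""
    scale = min(1.0, DISPLAY_PEAK / scene_peak_nits)
    return pixel_nits * scale

# Static metadata: a dark 120-nit pixel is scaled against the film's 4000-nit peak.
print(simple_tone_map(120.0, 4000.0))   # -> 18.0 nits, shadow detail gets crushed

# Dynamic metadata: the same scene reports its own 150-nit peak instead.
print(simple_tone_map(120.0, 150.0))    # -> 120.0 nits, the scene is left intact
```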

HDR10+ is designed to be backward compatible with HDR10: if a display supports HDR10 but not HDR10+, content mastered for HDR10+ can still be shown using the simpler HDR10 standard. The format is used mainly by the companies that took part in its development, above all Samsung and Panasonic.

2. Dolby Vision

The most complex, and arguably the most capable, HDR standard is Dolby Vision, developed by Dolby. It is unique in using two layers within a single video file: a fundamental Base Layer (BL) and an additional Enhancement Layer (EL). In practice this is not so easy to pull off, since two layers not only complicate the preparation and playback of content but also significantly increase the size of the final video file.

The Dolby Vision standard is therefore split into five main profiles: 4, 5, 7, 8 (with variants such as 8.1 and 8.4), and 9. The profiles differ in relatively minor details of how the layers are encoded and delivered.

It’s important to note that none of these profiles is compatible with Standard Dynamic Range (SDR), as they all rely on the PQ EOTF: profiles 5, 8, and 9 carry it in the base layer, while 4 and 7 carry it in the enhancement layer.
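For the curious, the PQ curve itself is published (SMPTE ST 2084 / ITU-R BT.2100), and a few lines of Python are enough to sketch how a code value is turned into absolute brightness; the constants below come straight from the specification.

```python
# A minimal sketch of the PQ EOTF: it maps a normalized 0..1 code value
# to absolute luminance in nits (cd/m2). Constants per SMPTE ST 2084.

def pq_eotf(signal: float) -> float:
    m1 = 2610 / 16384          # about 0.1593
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875

    p = signal ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000.0 * y         # the PQ curve tops out at 10,000 nits

print(round(pq_eotf(0.508), 1))   # about 100 nits, roughly SDR reference white
print(round(pq_eotf(1.0), 1))     # 10000.0 nits
```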

All of the profiles use dynamic metadata, just like HDR10+, so colors and highlights should be conveyed as the designers and artists intended. The difference from HDR10+ is that Dolby Vision is a closed, licensed format built on proprietary technology which, according to Dolby, makes better use of that dynamic metadata. Many film studios today produce films and TV series with Dolby Vision in mind.

3. HLG

Developed by the British broadcaster BBC together with Japan's NHK, the Hybrid Log-Gamma (HLG) standard was released in 2015. It is usually compared with HDR10, since the two appeared at almost the same time and both work with a single layer of video content.

However, HLG stands out in that it supports video compliant with the BT.2020 specification without relying on metadata at all. Instead, HLG uses a hybrid log-gamma transfer curve, defined in ITU-R BT.2100, to achieve its results.

Its main feature is that the curve partially follows both the SDR gamma curve and an HDR curve. This lets HLG-encoded content play back acceptably on displays that support HDR10, HDR10+, or Dolby Vision, as well as on BT.2020-compatible SDR displays.
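The hybrid curve itself is compact enough to show here as a small sketch of the camera-side (OETF) formula from ITU-R BT.2100: below roughly 1/12 of peak it behaves like a conventional square-root/gamma curve (the SDR-friendly part), and above that it switches to a logarithmic segment that carries the extra highlight range.

```python
import math

# A minimal sketch of the HLG OETF from ITU-R BT.2100: normalized scene light
# (0..1) in, HLG signal value (0..1) out.

def hlg_oetf(e: float) -> float:
    a = 0.17883277
    b = 1 - 4 * a                        # 0.28466892
    c = 0.5 - a * math.log(4 * a)        # about 0.55991073
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # gamma-like, SDR-compatible segment
    return a * math.log(12 * e - b) + c  # logarithmic highlight segment

print(round(hlg_oetf(1 / 12), 3))  # 0.5, the point where the two segments meet
print(round(hlg_oetf(1.0), 3))     # about 1.0, full-scale highlights
```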

However, this approach comes with certain limitations, so the degree of realism achievable with HDR10+ cannot quite be matched. There can be slight color distortion, especially in bright scenes and in scenes with many mirror-like surfaces and reflections.

Other standards

If you look for information about other standards, you will find either that they are little used or that some of these “other formats” are simply new names for familiar ones. For example, LG HDR Pro refers to LG's range of TVs that support HDR10, although the name still turns up in lists of supposedly additional formats. Incidentally, the name has since been changed to HLG, and yes, that is the same format I talked about above.

HDR4000 is the name of a generation of Samsung TVs that use HDR technology; there were also 1000, 2000, and 3000 versions. QHDR is likewise ordinary HDR10, but for QLED screens from the same Samsung, and there are also Q HDR Elite and Q HDR EliteMax, further illustrating the industry's habit of rebranding.

Technicolor has Advanced HDR, so if you want to know which is better, HDR or Advanced HDR, know that the difference is small. Meanwhile, brands such as China's TCL offer their own variant, dubbed TCL HDR Pro Gamma. I could go on like this all day: practically every brand has its own name for the technology, and that name can change from one product line to the next.

There are also lesser-known variants, such as PQ10, which is essentially HDR10 without the metadata. Personally, I haven't encountered TVs or monitors advertised with it. Generally, the mainstream focus is on HDR10 and HDR10+; for clarity on exactly which standards a given model supports, it's best to check the manufacturer's official website when choosing a TV or monitor to connect to a PC.

Differences between standards

It's clear that the primary differences among the standards lie in how HDR is implemented and in the companies behind each format: Samsung, Dolby, the BBC, and NHK have each introduced their own approaches to processing content and driving the screen.

Technically, the differences come down to the use or absence of metadata (and whether it is static or dynamic), the transfer function and color standard adopted, and the number of layers used to carry the image.

Conclusion

As you can see, although HDR technology is a real step towards photographic accuracy on a TV or monitor, it comes with complications. In the article where we looked at the technology as a whole, I already noted that there are many “buts” that can keep you from the picture quality a manufacturer promises. On top of that, the implementation of HDR varies greatly from format to format, which becomes obvious on closer examination.

As for which standard is best for a TV or monitor, it is quite possible to give a clear answer: Dolby Vision, because it delivers the best picture. However, its limitations mean the other options should not be dismissed entirely.

I understand that this overview of HDR technology may not make choosing a TV or gaming monitor much simpler. Nonetheless, I hope these concise explanations will ease your search somewhat. There is definitely no single best option here; each format has its advantages, disadvantages, and “highlights”.
