High dynamic range (HDR) comes in three main formats: HDR10, HDR10+ and Dolby Vision. Each of the three has its own advantages.

When buying a new HDR-capable display, if your requirements are not demanding you don't need to worry much about which format it supports: on a compatible display, any of them delivers a better visual experience than an ordinary display. If you care more about image quality and can't decide which HDR format suits you, the differences, strengths and weaknesses of the three are laid out below; I hope that by the end you will have found the HDR format that fits you.

HDR10 · open standard
HDR10+ · royalty-free standard
Dolby Vision · proprietary, licensed standard

Bit depth
HDR10 · 10bit
HDR10+ · 10bit
Dolby Vision · 12bit

On a computer, bit depth is the number of bits used to describe each pixel. A display can show color because the color data is recorded in units called "bits"; when that data is arranged in a certain order and sent to the panel, it forms a digital image. The traditional SDR (Standard Dynamic Range) standard is usually 8bit, which corresponds to about 16.7 million displayable colors. In HDR the minimum color bit depth is 10bit, or about 1.07 billion colors. The higher the bit depth, the more color detail can be shown, the more realistic the image, and the more natural the color transitions. Among the three HDR formats, Dolby Vision's color depth reaches 12bit, while HDR10 and HDR10+ are both 10bit. Formats of up to 32bit also exist, but they barely circulate in the consumer market. Although Dolby Vision has the highest color depth, most displays on the market today only support up to 10bit; as panel technology iterates, displays supporting 12bit should reach the market before long.
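Those color counts follow directly from the bit depth: with three color channels, the number of displayable colors is (2^bits)³. A quick sanity check in Python:

```python
def displayable_colors(bits_per_channel: int, channels: int = 3) -> int:
    # Each channel can take 2**bits values; the total is that count raised to the number of channels.
    return (2 ** bits_per_channel) ** channels

print(f"{displayable_colors(8):,}")   # 16,777,216    -> "16.7 million" (8bit SDR)
print(f"{displayable_colors(10):,}")  # 1,073,741,824 -> "1.07 billion" (10bit HDR)
print(f"{displayable_colors(12):,}")  # 68,719,476,736 (12bit Dolby Vision)
```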

Winner: Dolby Vision

Peak brightness (mastering range)
HDR10 · 1,000-4,000 cd/m²
HDR10+ · 1,000-4,000 cd/m²
Dolby Vision · standard 4,000 cd/m², up to 10,000 cd/m²
cd/m²: candela per square meter, the luminous intensity per square meter of the display (1 nit = 1 cd/m²).

At present, most Dolby Vision content on the market is mastered at 4,000 cd/m², while the mastering brightness of HDR10 and HDR10+ content hovers between 1,000 and 4,000 cd/m² depending on the producer. All three formats can describe images of up to 10,000 cd/m², but no consumer display on the market reaches that level yet, so on this point the difference between the three is not very large.
Winner: Dolby Vision
Compared with the other two formats, Dolby Vision is more unified and standardized, but it costs slightly more. HDR10/HDR10+ content can also be mastered at 4,000 cd/m², and in terms of actual experience the gap is not very large.
Tone mapping
HDR10 · PQ mapping
HDR10+ · PQ mapping
Dolby Vision · Dolby chip + PQ mapping
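The "PQ" referred to above is the Perceptual Quantizer transfer function (SMPTE ST 2084) that all three formats build on: it maps an encoded 10/12-bit signal value to an absolute luminance between 0 and 10,000 cd/m². A minimal Python sketch of the PQ EOTF, using the constants from the standard:

```python
def pq_eotf(signal: float) -> float:
    """Map a normalized PQ code value (0.0-1.0) to absolute luminance in cd/m² (nits)."""
    # Constants defined by SMPTE ST 2084.
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32

    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(round(pq_eotf(1.0)))   # 10000: the top of the signal range is 10,000 nits
print(round(pq_eotf(0.75)))  # roughly 1,000 nits
```

This is why the formats themselves can describe far brighter highlights than any consumer display can currently reproduce.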

For an image, dynamic range is the ratio of the image's maximum brightness to its minimum brightness, and a high-dynamic-range image is simply one where this ratio is large. Such images usually use more than 8 bits per channel (an ordinary grayscale image uses 8 bits), while a conventional display only has 8 bits of gray levels, so the values of a high-dynamic-range image have to be converted before they can be shown. In addition, the brightness distribution of these images is very uneven: only a few pixels are really bright, so simply normalizing the image (mapping the maximum value to 255 and the minimum to 0) leaves most of the picture dark and crushes its detail. Tone mapping was born to solve this problem.
Suppose a monitor has a maximum brightness of 1,400 cd/m²: how does it handle movie highlights of up to 4,000 cd/m²? In other words, when the display's peak brightness is lower than the content's, how should that content be shown?

The traditional method is clipping. If the display's peak brightness is only 1,400 nits and the content's peak brightness is 4,000 nits, the display simply clips everything above 1,400 nits, which means no detail at all remains in the highlights and the color transitions in those areas become extremely coarse.
The other method is tone mapping. On a 1,400-nit display, the highlights from 1,400 nits up to 4,000 nits are remapped to below 1,400 nits. Although the picture becomes slightly dimmer even in highlights below the display's peak, compared with clipping, tone mapping preserves far more detail in the bright areas.
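To make the difference concrete, here is a minimal sketch (not the actual PQ or Dolby Vision algorithm) that contrasts hard clipping with a simple roll-off curve; the 1,000-nit knee and the linear compression above it are illustrative assumptions:

```python
def clip(nits: float, display_peak: float = 1400.0) -> float:
    # Hard clipping: everything brighter than the display peak is simply lost.
    return min(nits, display_peak)

def tone_map(nits: float, display_peak: float = 1400.0,
             content_peak: float = 4000.0, knee: float = 1000.0) -> float:
    # Below the knee, brightness passes through unchanged.
    if nits <= knee:
        return nits
    # Above the knee, compress [knee, content_peak] into [knee, display_peak],
    # so even 4,000-nit highlights keep some gradation on a 1,400-nit panel.
    t = (nits - knee) / (content_peak - knee)
    return knee + t * (display_peak - knee)

for n in (500, 1200, 2500, 4000):
    print(f"{n:>5} nits -> clipped {clip(n):6.0f}, tone mapped {tone_map(n):6.1f}")
```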
Displays that support Dolby Vision generally carry a dedicated Dolby chip, which applies the tone mapping based on the display's own peak brightness; some manufacturers use Dolby's license to implement the same function in software. For HDR10/HDR10+, the implementation varies from manufacturer to manufacturer, which leads to uneven results.
Winner: Dolby Vision
Metadata

Metadata is "data about data" and can be understood as a data directory: one metadata record usually describes several specific pieces of information, and a movie, or even a single frame, carries multiple metadata records. The three HDR formats differ in how this metadata is collected and processed. HDR10 only uses static metadata: it records the peak brightness of the whole title once, and that single value determines the dynamic range for the entire film. Dolby Vision and HDR10+ use dynamic metadata, which lets the processor apply tone mapping frame by frame or scene by scene.
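As a rough illustration of what that means in practice (the field names below are hypothetical and far simpler than the real metadata the standards define), with static metadata every frame is tone mapped against one title-wide peak, while dynamic metadata lets each scene be mapped against its own peak:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class StaticMetadata:        # HDR10 style: one record for the whole title
    max_content_nits: float

@dataclass
class SceneMetadata:         # HDR10+/Dolby Vision style: one record per scene
    start_frame: int
    max_scene_nits: float

def content_peak_for_frame(frame: int, static: StaticMetadata,
                           scenes: Optional[List[SceneMetadata]] = None) -> float:
    """Peak brightness the tone mapper should assume for this frame."""
    if scenes:
        # Dynamic metadata: use the peak of the scene this frame belongs to,
        # so a dark scene is not compressed as if it contained 4,000-nit highlights.
        current = max((s for s in scenes if s.start_frame <= frame),
                      key=lambda s: s.start_frame)
        return current.max_scene_nits
    # Static metadata: every frame is mapped against the same title-wide peak.
    return static.max_content_nits

scenes = [SceneMetadata(0, 300.0), SceneMetadata(2400, 4000.0)]
print(content_peak_for_frame(100, StaticMetadata(4000.0)))          # 4000.0
print(content_peak_for_frame(100, StaticMetadata(4000.0), scenes))  # 300.0
```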
Winner: Dolby Vision and HDR10+, which can adapt the tone mapping to scenes with different lighting.
Application
TV

With the iteration of TV display technology and rising consumer expectations, most high-end TVs now support Dolby Vision or HDR10+ in addition to HDR10. Note that if a TV does not contain a Dolby chip, Dolby Vision will not take effect even when the external source provides it. Most TVs from major brands such as Sony, LG, Hisense and TCL support HDR10 and Dolby Vision. Samsung chose instead to invest heavily in the royalty-free HDR10+, and its TVs and phones do not support Dolby Vision. A few brands, such as Panasonic, make TVs that support all three HDR formats at once.
Cell phone

Although phones with true HDR screens are still rare, most mainstream brands' phones support one or two HDR formats. Apple has supported Dolby Vision since the iPhone 8, but only the iPhone X, XS, XS Max and the latest 11 Pro have genuine HDR screens.
Game

HDR was originally meant to give viewers sitting in front of a TV a better viewing experience, but with the development of video games it has spread to consoles as well. The PS4/PS4 Pro, Xbox One and Sony's newer PS5 all support HDR10, while Dolby Vision is only available on certain Xbox models. The Dolby Vision mode used on game consoles also differs from the normal one: it uses low-latency processing so that turning on HDR adds no extra input lag.
Winner: HDR10
Summary
Judging from the parameters and applications above, Dolby Vision is undoubtedly the first choice for users chasing the highest visual quality, but you get what you pay for: displays with Dolby Vision also cost more. In terms of applications, Dolby Vision is mainly found in high-end TVs and cinemas, so experiencing its difference from the other two formats requires more demanding hardware; it usually shows its advantage on a good home TV. In everyday use, on an ordinary display, the visual experience of the three standards is almost identical.