There is an ongoing format war among the TV display industry players: it revolves around HDR, or High Dynamic Range, the video optimization technology they enable inside their products.
When you watch HDR-enabled content on an HDR-capable display, you perceive not only a boost in contrast but also an expanded color palette, with one to two orders of magnitude more available colors compared with SDR (Standard Dynamic Range).
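To put that "one to two orders of magnitude" claim in numbers, here is a quick, illustrative sketch computing how many distinct colors each standard per-channel bit depth can represent (the bit depths are the commonly cited values for SDR, HDR10, and Dolby Vision):

```python
# Rough color-count comparison across per-channel bit depths.
BIT_DEPTHS = {
    "SDR (8-bit)": 8,
    "HDR10 (10-bit)": 10,
    "Dolby Vision (12-bit)": 12,
}

for name, bits in BIT_DEPTHS.items():
    levels = 2 ** bits      # shades per color channel (R, G, or B)
    colors = levels ** 3    # total R x G x B combinations
    print(f"{name}: {levels} levels per channel, {colors:,} colors")
```

Running this shows roughly 16.8 million colors for 8-bit SDR versus about 1.07 billion for 10-bit HDR10 and about 68.7 billion for 12-bit Dolby Vision, which is where the "one to two magnitudes" figure comes from.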
In short, HDR video stands out with exceptional rendering quality when compared against standard SDR displays.
The aim of HDR-enabled content is to preserve image quality as perceived by human vision across the whole video pipeline: capture, editing and post-processing, streaming, and display.
If you have already seen an HDR video on a suitable TV, then you know what I am talking about: the quality increase HDR brings is easily noticeable by anyone!
The two main HDR formats fighting for a place in your home are HDR10 and Dolby Vision. HDR10 is promoted by Samsung and Sony, among other manufacturers, and there is no licensing fee to use it. On the other side of the ring we find Dolby Vision, a proprietary HDR format that requires manufacturers to pay license fees in order to include it in their products.
HDR Video sources
The Amazon and Netflix video streaming services are already delivering HDR-enhanced 4K video content: when their streaming apps detect that your display supports an HDR format, they send you either an HDR10 or a Dolby Vision enhanced video stream.
YouTube is also jumping on the HDR bandwagon, as announced by Robert Kyncl, its chief business officer, at CES 2016.
In the following paragraphs we will outline some of each HDR format's strengths and weaknesses.
Dolby Vision HDR format
The Dolby Vision format calls for 12 bits of information per color channel, further enhancing contrast and the color palette compared with HDR10.
As usual with Dolby products, the whole Dolby Vision HDR process is covered by their license.
The TV display manufacturer needs to provide fine-tuning information specific to their hardware's capabilities so the Dolby Vision processor can enhance the video in a tailored way.
In addition, the content source also needs to be mastered through the Dolby Vision algorithms. I can easily imagine that these tightly coupled processing and display stages, both controlled by the Dolby Vision process, help a lot with the resulting video quality.
HDR10 High Dynamic Range format
The HDR10 format uses 10 bits of color depth per channel, which enables the dramatic enhancement that HDR calls for.
Also, HDR10 does not require fine-tuning information specific to the display it runs on. This fact, along with the absence of a license fee, may help the TV display industry include HDR10 support in their HDR-capable products right away.
On the other hand, the lower bit depth and the missing fine-tuning for each TV display's hardware mean that an HDR10-enabled TV will show a less impressive HDR video output.
Which HDR format should I consider for my purchases?
From the technical side, there may be a winner in this war, but as usual, the decision also factors in political and practical considerations.
If you are choosing a media player, then you may consider a TV box supporting both HDR standards, as is the case with Amlogic S912-based media players.
On the TV display side, it seems that sets supporting Dolby Vision can also support HDR10, even through future firmware updates, since HDR10 is technically a "downgrade": hardware prepared for Dolby Vision is already capable of HDR10. This is the case with some VIZIO Dolby Vision TV sets.
You can also find TV sets that support both HDR formats from the start, as is the case with the LG and Philips brands.