What’s the difference between VISIONPLUS, Dolby Vision and HDR10 video?
There are some major differences, and VISIONPLUS HDR offers several benefits that we are going to explain:
- +Deep Blacks
- +Realistic Highlights
- +More Details
Today’s HDR TVs are not really HDR but SDR. Why?
For a video to be HDR it has to be Rec. 2020 compliant, which means meeting several requirements detailed below:
- HEVC (H.265) video codec
- 10 to 12-bit color depth
- BT.2020 matrix and primaries
- PQ ST. 2084 or HLG (Hybrid Log-Gamma) transfer function
- MaxCLL / MaxFALL and Mastering Display (min/max luminance) metadata
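As a sketch of what the last metadata item encodes, MaxCLL and MaxFALL can be derived from a clip’s per-pixel light levels. The helper names and the toy frame data below are our own illustration, not any standard API:

```python
# Illustrative sketch (not a production tool): how the MaxCLL and MaxFALL
# metadata values relate to a video's pixel light levels. Frames are
# modeled here as plain lists of per-pixel luminance in nits.

def max_cll(frames):
    """MaxCLL: the brightest single pixel across all frames, in nits."""
    return max(max(frame) for frame in frames)

def max_fall(frames):
    """MaxFALL: the highest frame-average light level, in nits."""
    return max(sum(frame) / len(frame) for frame in frames)

# Toy 3-frame clip, 4 "pixels" per frame:
clip = [
    [100, 200, 150, 50],
    [900, 100, 100, 100],   # one bright specular highlight
    [400, 400, 400, 400],   # uniformly bright frame
]
print(max_cll(clip))    # -> 900  (brightest pixel anywhere)
print(max_fall(clip))   # -> 400.0 (brightest frame average)
```

Note how a single bright highlight drives MaxCLL, while a uniformly bright frame drives MaxFALL; the two numbers tell a display very different things about the content.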
HDR video is primarily a GRADING that must be transferred to an extended color range and the PQ SMPTE ST. 2084 gamma curve with BT.2020 primaries for our HDR standard.
This is how HDR10 and Dolby Vision work as well.
The video can be graded to any peak luminance up to the maximum of PQ SMPTE ST. 2084, which is 10,000 nits.
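That PQ curve can be checked numerically. Below is a minimal Python sketch of the ST. 2084 inverse EOTF (absolute nits to normalized 0..1 signal), using the constants from the spec; the function name `pq_signal` is just illustrative:

```python
# Sketch of the SMPTE ST. 2084 (PQ) inverse EOTF: absolute luminance in
# nits -> normalized 0..1 signal value. Constants come from the spec.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_signal(nits):
    y = max(nits, 0.0) / 10000.0   # normalize to the 10,000-nit format peak
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

for peak in (100, 1000, 4000, 10000):
    print(peak, round(pq_signal(peak), 3))
# 100 -> 0.508, 1000 -> 0.752, 4000 -> 0.903, 10000 -> 1.0
```

The printed values show how the curve allocates signal range: a 1,000-nit grade tops out at roughly 75% of the PQ signal, and only a full 10,000-nit grade reaches 1.0.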
Currently the best high-end TVs, for example Samsung’s 2019 8K HDR line, can reach a real peak of 4,000 nits at certain window percentages on their panels.
Commercial HDR Blu-rays therefore vary in their grading and metadata:
- HDR10: 1,000 to 10,000 nits peak with static metadata.
- Dolby Vision HDR: peak of 4,000 up to 10,000 nits with dynamic metadata.
The coming generation of ST. 2094 dynamic metadata is what adds scene-by-scene dynamic range and color adjustment to HDR10+ and Dolby Vision.
As you can see, the movies people can buy are limited by their grading, which means they are not “pure” HDR movies.
The worst-case scenario is HDR10 movies with only a 1,000-nit peak grading, which sits fairly close to the SDR standard, since the average frame light level (MaxFALL) would be around the 300-nit mark. (SDR peaks at 100 nits.)
In the future, when you can buy TVs with higher real peak nits, these films will be outdated.
The movie studios may then release another version of the movie (or not) with a higher-peak grading, which means more revenue for the studios and some annoyance for movie lovers.
A current HDR TV is actually SDR (Standard Dynamic Range) on steroids:
- The native HDR container peaks at 10,000 nits.
- The films on sale today peak at 1,000–4,000 nits, as explained above.
- The most top-of-the-range HDR TVs you can buy today reach a real peak of about 2,000 nits.
So what happens when we play a Dolby Vision movie graded at 4,000 nits on a TV whose panel is technologically limited to 1,000 nits?
Basically, the TV’s software (a dedicated chip in the case of Dolby Vision) compresses the video’s luminance to fit the panel’s actual limit. This is called TONE MAPPING.
This is the most reasonable scenario for HDR10 movies, since most peak at 1,000 nits, and a good Ultra HD HDR TV today can actually reach that at some window percentages.
So you get the “real thing” at the output, unless your panel does less than 1,000 nits, in which case you can still watch the HDR content, it just won’t pop as much.
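Tone mapping itself can be sketched in a few lines. The curve below is a deliberately simple illustration (a linear pass-through below a knee, plus an exponential roll-off above it), not Dolby’s or any TV vendor’s actual algorithm; `knee_ratio` is an assumed parameter:

```python
import math

# Toy tone-mapping curve (an illustration, not any vendor's algorithm):
# luminance below a knee passes through unchanged; everything above is
# softly compressed so the output never exceeds the panel's peak.

def tone_map(nits, panel_peak, knee_ratio=0.75):
    knee = panel_peak * knee_ratio
    if nits <= knee:
        return nits                       # shadows/midtones untouched
    headroom = panel_peak - knee
    over = nits - knee
    # exponential roll-off: asymptotically approaches panel_peak
    return knee + headroom * (1 - math.exp(-over / headroom))

# A 4,000-nit-graded highlight on a 1,000-nit panel:
print(tone_map(4000, 1000))   # compressed to just under 1,000 nits
print(tone_map(500, 1000))    # -> 500, unchanged below the knee
```

The key property is visible in the two calls: mid-range content is reproduced exactly, while out-of-range highlights are squeezed into the panel’s headroom instead of simply clipping.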
Today, with VISIONPLUSHDR-X, you can play back native HDR without the restrictions imposed by the companies/movie studios.
*Our Actual Grading shows Higher Range than usual HDR video*
*Advanced Parametric Color Transform BT. 2020 PQ HDR*
*Dynamic HDR HDR10+ / Dolby Vision Tonemapping*
An explanation of HDR grading nits and the PQ format:
There appears to be a lot of confusion about HDR “nits”. Some people confuse the term nits with the panel’s brightness capabilities in regard to the HDR content. This is simply wrong: the nits of a grade describe the content, not the display.
The PQ HDR format uses a far wider gamma curve than the limited 100-nit SDR one: PQ reaches up to 10,000 nits instead.
With this fact understood, the content grading should always be close to the standard’s maximum range to be seen in its full glory.
Studio UHD Blu-rays are mostly graded at 1,000 nits, and some at 4,000. People still don’t have 4,000-nit panels, but those higher-nit grades simply look better than the weaker 1,000-nit ones. With this second fact understood, it means you don’t need a panel capable of 1,000, 2,000, 4,000 or 10,000 nits to enjoy HDR content. The panel will just show what it can, but the HDR tone map will always be higher quality when the grade is close to the maximum format spec (10,000).
When colorists grade up to 1,000 nits, it isn’t because most TVs can’t show more than 1,000; it’s simply easier to grade that way.
1,000-nit HDR content mostly lacks proper midtones and highlights, because it sits so low in the PQ gamma standard.
Let’s say you grade SDR content: what happens if you do it at 10 nits instead of 100? The video will just look pretty weak everywhere. The same concept applies to PQ HDR: 1,000 nits will be dramatically weaker than 10,000, but that doesn’t mean either grade can’t be watched anywhere.
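The analogy can be put in numbers. Assuming a pure 2.4 power law for SDR and the ST. 2084 constants for PQ (both helper functions below are our own sketch, not a standard API), we can ask what fraction of the signal range an under-graded peak actually uses:

```python
# The analogy in numbers (a sketch): what fraction of the signal range
# does an under-graded peak actually use? SDR is modeled as a pure 2.4
# power law; PQ uses the ST. 2084 constants.

def sdr_signal(nits, display_peak=100.0, gamma=2.4):
    return (nits / display_peak) ** (1 / gamma)

def pq_signal(nits):
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    yp = (nits / 10000.0) ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

print(round(sdr_signal(10), 3))   # 10-nit SDR grade: ~0.383 of the range
print(round(pq_signal(1000), 3))  # 1,000-nit PQ grade: ~0.752 of the range
```

In both cases the under-graded content occupies only part of the available signal range, which is exactly why it looks weak compared to a full-range grade.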
This is the reason we always try to go further than what the studios are doing: it is the right way to do it.
We work hard to generate a tone map able to show a 6,000-nit-and-up PQ grade properly on any HDR TV on the market.
Why HDR is being used as pure marketing
TV and movie companies are turning HDR into a marketing scheme. People think they are buying “HDR TVs”, when in fact any panel, even a computer monitor, can play an HDR video.
The main idea of HDR is ancient in photo editing.
They limit the PQ (HDR format) so they can keep selling TVs as “new” over the long term, giving each HDR release only a modest top grading.
TVs only need enough nits to meet the luminance requirement of the HDR grade; HDR is really based on contrast, gamut, range and deep blacks, plus the highlights, which is the part where a TV’s nits can provide greater luminance.
The extra luminance that a brighter TV provides for an HDR video would be just as noticeable in an SDR video. Is that clear?
The “HDR TV” does not exist. What exists is a different format; so-called HDR TVs can handle both the PQ HDR and SDR gamma curves:
1- ST. 2084 for HDR content
2- BT. 1886 for SDR content
What does exist is the HDR video standard, which, with the help of HDR software processing, can be “decoded” for your panel.
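The difference between those two curves can be sketched by evaluating both at the same code value. BT. 1886 is simplified here to a pure 2.4 power law with zero black level, and PQ follows ST. 2084; both helper functions are illustrative:

```python
# A sketch of why the same signal value means very different light levels
# under the two curves (BT. 1886 simplified to a pure 2.4 power law with
# zero black level; PQ per ST. 2084).

def bt1886_nits(signal, white_nits=100.0):
    return white_nits * signal ** 2.4

def pq_nits(signal):
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    sp = signal ** (1 / m2)
    return 10000.0 * (max(sp - c1, 0.0) / (c2 - c3 * sp)) ** (1 / m1)

v = 0.5  # the same mid-range code value...
print(round(bt1886_nits(v), 1))  # ~19 nits on a 100-nit SDR display
print(round(pq_nits(v), 1))      # ~92 nits under PQ
```

This is the sense in which an “HDR TV” is just a display that knows which of the two decodings to apply: the signal itself is only numbers until a gamma curve gives it absolute light levels.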