Distinct HDR "standards" - High-Def Digest Forums
#1 · 01-12-2018, 07:57 AM
jgslima (Junior Member, Thread Starter)
Join Date: May 2009 | Posts: 22

Distinct HDR "standards"

I'm getting confused, and a little afraid, with all these distinct HDR implementations (HDR10, Dolby Vision, HDR10+, HLG, and possibly others).

Hope someone can help clarify this.


My first question is: when authoring a UHD Blu-ray, does the disc distributor have to choose only one of the implementations to put on the disc? Or can a single movie have both HDR10 and DV? If a movie can contain multiple, how does the player choose the one to be used?


Also, I would really like to know what happens when you watch a source with an HDR implementation not compatible with your equipment.

For instance, in my particular case, my whole chain supports HDR but not Dolby Vision or HDR10+. Therefore, if I purchase a movie that has DV:
  • will my UHD Blu-ray player at least be able to read the disc and send some signal to the chain? Will it remove (actually, ignore) the DV metadata before sending it? If so, will I lose WCG as well?
  • if in the future I have a player that actually supports DV (but the receiver is not yet upgraded), what will happen? Should the player, through HDMI handshaking, see that the receiver is not capable, and then remove the DV metadata before sending it?
  • finally, if my player and receiver are upgraded, but my projector is not, what will happen?
#2 · 01-13-2018, 03:56 AM
Shadow of Death (Senior Member)
Join Date: Jun 2011 | Posts: 1,582

I dunno about the other standards, but I know that titles with Dolby Vision will fall back to standard HDR (on TVs that support it, of course) if the set or player doesn't support Dolby Vision. So one would think the other standards could coexist on a disc the same way.
#3 · 01-13-2018, 01:53 PM
mattedscreen (Senior Member)
Join Date: Oct 2010 | Posts: 2,939

As far as I'm aware, standard HDR10 or "HDR" is built into virtually every disc, with very few exceptions. It's a baseline spec so that people, regardless of setup, will be able to view the disc. So if you're not DV or HDR10+ ready, you'll at the very least get an HDR experience.
#4 · 01-13-2018, 03:17 PM
eNoize (Senior Member)
Join Date: Nov 2006 | Posts: 5,605

The new format wars are indeed somewhat confusing and needlessly frustrating, and yeah, there are four HDR standards in the works. Of them, the SL-HDR1 format is the least known and arguably most confusing because it sends an HDR signal by piggy-backing on an already available SDR video stream. The HLG (Hybrid Log-Gamma) format, created by the BBC and NHK, is for live broadcasts in 4K, delivering higher dynamic range but in the traditional Rec.709 color space, despite being a 10-bit signal. These two are arguably the ones you need to worry about least, since neither is taking part in the format war: their implementation is only concerned with broadcast television.

In the simplest terms, the difference between HDR10 and Dolby Vision is that the former sends static metadata in 10-bit color depth graded at up to 1,000 nits, while the latter sends dynamic metadata capable of up to 12-bit depth at 10,000 nits, although the Ultra HD disc spec right now maxes out at 4,000 nits and most encodes are mastered at only 1,000 nits. Both currently use the SMPTE ST 2086 mastering metadata standard, but HDR10 is also limited to that, while DV promises the SMPTE ST 2094 dynamic metadata standard in the future, which at 12-bit translates to roughly 68 billion colors. Without getting too technical, static metadata essentially means someone inputs specific HDR grading values once for the whole film and simply flicks a switch, but dynamic metadata means those values are changed and adjusted on a scene-by-scene basis. That's where HDR10+ comes in, promising dynamic metadata in 10-bit color depth at 1,000 nits while also claiming to be ready for the SMPTE ST 2094 standard. And despite all that, Dolby Vision still offers more and is ultimately more future-proof, probably in the next six or so years when displays can actually do up to 12-bit and 4,000 nits.
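To make the static-vs-dynamic distinction and the bit-depth numbers above concrete, here is a rough Python sketch. The data structures are purely illustrative (not any real SMPTE or disc format), but the color-count math is just (2^bits)^3 for the three RGB channels, which is where the "68 billion colors" figure for 12-bit comes from:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical structures, just to illustrate the difference described
# above -- not a real disc or SMPTE format.

@dataclass
class StaticHDRMetadata:
    """HDR10-style: one set of values covers the entire film."""
    max_luminance_nits: int        # mastering display peak
    max_content_light_level: int   # brightest pixel in the program (MaxCLL)
    max_frame_avg_light_level: int # brightest frame average (MaxFALL)

@dataclass
class DynamicHDRMetadata:
    """DV/HDR10+-style: a fresh set of values per scene."""
    scenes: List[StaticHDRMetadata] = field(default_factory=list)

def distinct_colors(bits_per_channel: int) -> int:
    """Number of representable RGB colors at a given bit depth."""
    return (2 ** bits_per_channel) ** 3

print(f"10-bit: {distinct_colors(10):,} colors")  # 1,073,741,824 (~1.07 billion)
print(f"12-bit: {distinct_colors(12):,} colors")  # 68,719,476,736 (~68.7 billion)
```

So 10-bit already gives over a billion colors; the jump to 12-bit is what gets you into the tens of billions quoted for DV.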

On Ultra HD disc, the choice to use HDR10, HDR10+ or Dolby Vision is a decision made entirely by the studio, based on whether or not they want to pay the extra licensing fee. Dolby charges per disc, player and display, whereas HDR10 and HDR10+ are open, royalty-free formats, which introduces a host of other problems, such as studios implementing them differently from one another rather than abiding by a set mastering and grading standard like DV's. All UHD discs come with HDR10, so if neither player nor display is HDR10+ or DV compatible, you can still enjoy the movie in HDR; the HDR10+ signal piggy-backs on that base. Dolby Vision is likewise another layer of metadata atop the already existing HDR signal, meaning that at this early stage of the technology, unfortunately, DV is constrained by the limitations of HDR10, such as 10-bit depth and, at best, 1,000 nits.

This means the difference between the formats is arguably small, offering very subtle improvements in the color grading and very slightly silkier blacks, depending on the movie, of course. And this is also the reason why many have said they don't see any real significant difference. It's all understandable, since the most that current displays can do at the moment is around 700-800 nits, except for the expensive, high-end Sony A1E or Z9D (I think, not sure) and the LG W7P, and they're limited to 10-bit color depth. Then, we also have to take screen size and seating distance into consideration, because to better appreciate a 4K HDR presentation, be it in HDR10+ or Dolby Vision, you should reasonably be sitting closer to the display. With a 65" screen, which I believe is currently the most popular size, you should ideally be seated around 5 feet away, and a smaller TV means sitting even closer, but no one really does that. In the HD SDR world, you could sit up to 8 feet away from the same size screen, but the average distance of most consumers is still about 9 feet. To me, this explains why viewers believe 'Batman Begins,' 'Iron Man (German Import),' 'Terminator 2' and 'Men in Black' look excellent on Ultra HD. They're sitting too far away to notice the image is not that great.
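The seating distances quoted above line up roughly with the common "one pixel per arcminute" rule of thumb for 20/20 vision. A small sketch, assuming a 16:9 panel (the function name and exact tolerance are mine, not from any standard):

```python
import math

def max_viewing_distance_ft(diagonal_in: float, horizontal_px: int,
                            aspect: float = 16 / 9) -> float:
    """Farthest distance (in feet) at which a single pixel still
    subtends about 1 arcminute -- the usual 20/20-acuity rule of thumb.
    Sit farther than this and individual pixels blur together, so the
    extra resolution stops being visible."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pixel_in = width_in / horizontal_px                      # pixel pitch
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12              # inches -> feet

# For a 65" screen (illustrative; results are approximate):
print(f"4K:    {max_viewing_distance_ft(65, 3840):.1f} ft")  # ~4.2 ft
print(f"1080p: {max_viewing_distance_ft(65, 1920):.1f} ft")  # ~8.5 ft
```

Those outputs match the post's numbers: about 5 feet for 4K and about 8 feet for HD on a 65" set, so at the typical 9-foot couch distance the resolution bump is largely wasted.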

But to answer the rest of your questions: the player decides which format to read based on its own capabilities and those of the TV. Since all UHD discs come with the HDR10 metadata, the player will automatically default to that format without doing anything to the color space, because this signal is always the base signal. With HDR10+ and Dolby Vision acting as another layer of metadata on top of that one, a player capable of reading that information will automatically switch to it. The LG UP970 favors the DV signal and will output in that format without the option of turning it off, but if your display is only compatible with HDR10, then the player will go with that signal. Personally, I don't care for the LG because it also forces upscaling of Blu-ray discs to 4K, and in my opinion, this could ruin an assessment of the HD SDR presentation's quality. The Oppo 203, on the other hand, gives viewers the option of turning off the DV signal as well as the upscaling to 4K, which is fantastic for making comparisons. In either case, you don't lose the WCG in the UHD, since all formats and TVs are limited to 10-bit depth. If you upgrade your display with DV compatibility but your receiver is only HDR10, the player will ignore the second layer of metadata and only send the HDR10 signal. In this case, you'll want to use both HDMI outputs of the player: "Audio Only" to the receiver and video directly to the TV.
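The fallback behavior described here can be sketched as a tiny decision function. This is a hypothetical model, not any real player firmware: every disc carries the HDR10 base layer, and a richer format is chosen only when the disc, the player, and the display all support it:

```python
# Hypothetical sketch of the fallback described above: every UHD disc
# carries an HDR10 base layer; the optional DV/HDR10+ metadata layers
# are used only when the whole chain (player AND display) supports them.

PREFERENCE = ["Dolby Vision", "HDR10+", "HDR10"]  # richest format first

def choose_output(disc_layers: set, player_caps: set, display_caps: set) -> str:
    """Return the format the player would output for this chain."""
    for fmt in PREFERENCE:
        if fmt in disc_layers and fmt in player_caps and fmt in display_caps:
            return fmt
    # No common HDR format: tone-map down to SDR (some players may
    # instead refuse to play the disc, as noted later in the thread).
    return "SDR"

# A DV disc and DV-capable player, but an HDR10-only display,
# falls back to the HDR10 base layer:
print(choose_output({"HDR10", "Dolby Vision"},
                    {"HDR10", "Dolby Vision"},
                    {"HDR10"}))  # -> HDR10
```

Swapping in a DV-capable display makes the same function return "Dolby Vision", which mirrors how the extra metadata layer is simply ignored until every link in the chain can read it.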

I am currently running three players -- five if you also count the PS4 (for 1080p reviews) and the Xbox One X -- to a TV and a projector, both of which are 4K HDR10. One runs directly to the TV for Dolby Vision, with audio to the receiver. Another is connected to the receiver, which then runs to the projector. When watching a UHD disc with Dolby Vision HDR through the projector, the player automatically defaults to the HDR10 base metadata, and like I mentioned above, I don't lose the HDR or WCG. It is still a beautiful picture. On the other hand, if you're running an HD SDR projector, you do lose the UHD and WCG, or the player will tell you it's not compatible and might refuse to play the disc. In fact, when using the LG player, the picture looks dull and flat, absolutely horrible. And that's the beauty of the Oppo player: you can shut off the HDR signal, matching the max nits of the projector, and still enjoy the higher resolution with slightly improved colors, even though it's still limited to the Rec.709 color space.

Apologies for the incredibly lengthy explanation, but I hope that helps.
__________________
M. Enois Duarte
High-Def Digest Contributor
Hi-Def Collection
Unfollow me at twitter: @MEnoisDuarte

Movies are so rarely great art that if we cannot appreciate great trash
we have very little reason to be interested in them.

~ Pauline Kael

#5 · 01-13-2018, 05:58 PM
jgslima (Junior Member, Thread Starter)
Join Date: May 2009 | Posts: 22

That's great, Enois. I appreciate the time you dedicated to this.

Quote (Originally Posted by eNoize):
all UHD discs come with the HDR10 metadata
This is a very, very important point. With this being true, I think we can assume that every display (and maybe every receiver) that ever supports some kind of HDR will support HDR10.

For me this demystifies the claim that there is no standard at all, and that a supposed lack of standards will kill 4K. I feel a little more protected as a consumer knowing this.

This reminds me of when Dolby TrueHD and DTS-HD Master Audio were created, and it was defined that discs carrying them should also have a legacy Dolby Digital or DTS track.
#6 · 01-13-2018, 10:17 PM
eNoize (Senior Member)
Join Date: Nov 2006 | Posts: 5,605

Yeah, that's one way of thinking about it; it's also similar to Dolby Atmos and DTS:X containing a Dolby TrueHD or DTS-HD MA core for older receivers.