A simple fact: all HDR grading is done on static displays, without any metadata or tone mapping, dynamic or otherwise.
(HDR grading displays simply clip at their peak luminance: no roll-off, no tone mapping, no nothing!)
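To make "clip" concrete, here is a minimal sketch (illustrative only, with an assumed 1000 nit panel; not any vendor's actual pipeline):

```python
def grading_display_output(scene_nits: float, display_peak_nits: float = 1000.0) -> float:
    """Hard clip: anything above the panel's peak is simply shown at the peak."""
    return min(scene_nits, display_peak_nits)

# A 4000 nit highlight on a 1000 nit mastering monitor is displayed at 1000 nits.
print(grading_display_output(4000.0))  # -> 1000.0
```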
So, if your home TV matches or betters the peak luminance of the grading/mastering display, there is no need for any metadata, and no tone mapping or roll-off should be applied.
And as a lot of HDR material is graded to a 1000 nit peak, there are a number of HDR TVs that can match or better this.
In simple terms, HDR10 should provide the best possible match to the original artistic intent, as there is no need for any metadata-based manipulation on playback at all.
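Put as a sketch, the playback logic being argued for looks like this (assumed function names and a placeholder curve, purely to illustrate the argument, not any TV's actual firmware):

```python
def playback_nits(scene_nits: float, tv_peak_nits: float, mastering_peak_nits: float) -> float:
    if tv_peak_nits >= mastering_peak_nits:
        # TV matches/betters the mastering display: behave exactly like the
        # grading monitor did, i.e. clip only, no metadata, no roll-off.
        return min(scene_nits, tv_peak_nits)
    # Only when the TV is dimmer than the mastering display is some form of
    # tone mapping unavoidable; a simple linear scale stands in here purely
    # as a placeholder for whatever curve the set actually applies.
    return scene_nits * (tv_peak_nits / mastering_peak_nits)

# A 1000 nit graded highlight on a 1500 nit TV should pass through untouched.
print(playback_nits(1000.0, tv_peak_nits=1500.0, mastering_peak_nits=1000.0))  # -> 1000.0
```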
If the implementations of Dolby Vision and HDR10+ are correct, they should also match the HDR10 playback, and in turn match the original artistic intent (as well as possible given the limitations of the specific TV).
Anyone fancy placing bets on the likelihood of this being the actual reality of how your TV implements HDR?
Mob Boss at Light Illusion