Like Dolby Vision, HDR10+ uses “dynamic metadata” that’s encoded into scenes ahead of time, as opposed to the static metadata in HDR10, which applies one setting to an entire film. That’ll allow an HDR10+ TV to adjust brightness on a “scene-by-scene or even frame-by-frame basis,” Samsung says. For instance, with HDR10, dark scenes in a generally bright movie may look “significantly darker than was originally envisioned by the director,” it adds. The new tech will adjust for that on the fly, making films look closer to what their creators intended.
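To see why per-scene metadata matters, here is a minimal sketch of the static-vs-dynamic difference. This is not the actual HDR10+ algorithm (the real standard, SMPTE ST 2094-40, uses perceptual Bezier tone-mapping curves); the linear scaling, nit values, and scene data below are all hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical illustration: static metadata forces one tone-mapping
# curve for the whole film, while dynamic metadata picks a curve per
# scene. All values are made up for the example.

DISPLAY_PEAK_NITS = 500  # assumed peak brightness of the viewer's TV

def tone_map(pixel_nits: float, scene_max_nits: float) -> float:
    """Compress luminance into the display's range (simple linear
    scale for illustration; real HDR tone mapping is nonlinear)."""
    if scene_max_nits <= DISPLAY_PEAK_NITS:
        return pixel_nits  # the scene already fits; leave it alone
    return pixel_nits * DISPLAY_PEAK_NITS / scene_max_nits

# A mostly bright movie with one dark scene.
scenes = [
    {"name": "daylight", "max_nits": 4000, "sample_pixel": 100},
    {"name": "night",    "max_nits": 120,  "sample_pixel": 100},
]

# Static (HDR10-style) metadata only knows the film-wide maximum.
film_max = max(s["max_nits"] for s in scenes)  # 4000 nits

for s in scenes:
    static_out = tone_map(s["sample_pixel"], film_max)        # one curve
    dynamic_out = tone_map(s["sample_pixel"], s["max_nits"])  # per scene
    print(f'{s["name"]}: static={static_out:.1f} dynamic={dynamic_out:.1f}')
```

In this toy model, the night scene's 100-nit pixel gets crushed to 12.5 nits under the film-wide curve (because the 4,000-nit daylight scene dictates the compression), but survives untouched when the metadata describes each scene individually — exactly the “darker than the director intended” problem Samsung describes.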
That sounds very similar to how Dolby describes its tech, and as with Dolby Vision, HDR10+ metadata will have to be baked into content before TVs can decode it. As such, Samsung has teamed with Colorfront to incorporate HDR10+ mastering into its “Transkoder” systems used by post-production houses. It also worked with MulticoreWare to integrate HDR10+ into the newish x265 encoder for High Efficiency Video Coding (HEVC), the compression standard used by Ultra HD Blu-ray, Netflix and satellite and terrestrial broadcasters.
While Dolby charges royalties for its tech, HDR10+ is an open standard, so it can be adopted by other TV manufacturers for free. However, HDR10+ still lacks some of Dolby Vision’s features, in particular its greater 12-bit color depth and maximum 10,000-nit brightness — both features aimed at future TVs. Dolby Vision also works with the older HDMI 1.4a standard, while HDR10 requires HDMI 2.0. Dolby Vision is backward compatible with HDR10, but it’s not clear if it will work with the new standard.
Samsung has avoided Dolby Vision, but most other manufacturers, including Sony and TCL, have opted in. Dolby also has deals with Warner Bros., MGM, Universal and other studios to use its encoding tech. At the same time, Amazon, one of the main consumer streaming companies, has committed to start streaming HDR10+ content globally “later this year,” it said. In other words, this format rivalry probably isn’t going away anytime soon.