HDR? Even more brightness (10x higher brightness levels than current LDR TVs)? @o@
But most TVs are already excessively bright, especially if you watch your media in a darkened room.
Do you *really* want to experience being blinded by intensely bright light, just like in real life?
7000 nits is now the absolute minimum a TV should have to pass as "HDR-ready".
They did a similar thing with "HDR" audio: new high-end DACs with over 130 dB of dynamic range. But to really tell whether there's any difference between these new DACs and a standard DAC, you'd have to turn the volume up to "insane" levels (130 dB SPL, the threshold of pain) and permanently damage your hearing in the process.
110 dB is more than enough even for the most sensitive (or damaged) ears, because no sane person would turn the volume up above that.
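To put rough numbers on that, here's a back-of-the-envelope sketch (purely illustrative; the 110 dB peak listening level and the ideal-converter formula are assumptions, not measurements of any particular DAC):

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal N-bit PCM converter."""
    return 6.02 * bits + 1.76

peak_spl = 110.0  # assumed "sane" peak listening level, in dB SPL
for bits in (16, 20, 24):
    dr = dynamic_range_db(bits)
    noise_floor = peak_spl - dr  # where the quantization noise ends up
    print(f"{bits}-bit: ~{dr:.0f} dB range -> noise floor at {noise_floor:+.0f} dB SPL")

# 16-bit: ~98 dB  -> noise floor around +12 dB SPL
# 20-bit: ~122 dB -> noise floor around -12 dB SPL (already below the hearing threshold)
# 24-bit: ~146 dB -> noise floor around -36 dB SPL
# To actually hear what a 130 dB DAC gives you over a ~110 dB one, the peaks
# would have to sit near 130 dB SPL -- the threshold of pain.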
The good thing about these new HDR TVs is the potentially lower power consumption at minimum brightness / contrast settings (and maybe longer backlight lifetimes because of it).
What most people need now is very low black levels (nearly infinite contrast, like a CRT) *and* efficient screen polarizers (just as important as the black levels, if not more so), so that black stays pitch-black at any room brightness level and doesn't look grey or washed out like it does on a CRT.
8-bit color [with dithering] is also more than enough for most people (no banding at all).
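Here's a quick, purely illustrative numpy sketch of why the "[with dithering]" part matters: quantize a shallow ramp to 8 bits with and without ~1 LSB of dither, blur both slightly (a crude stand-in for the eye averaging neighbouring pixels) and compare the leftover error (the ramp values and the 16-pixel blur are arbitrary choices, not any standard test):

import numpy as np

rng = np.random.default_rng(0)
# A very shallow ramp spanning only ~6 distinct 8-bit codes across a row:
# exactly the kind of gradient where banding becomes obvious.
ramp = np.linspace(0.100, 0.120, 1920)

def quantize8(x):
    return np.round(x * 255) / 255

def blur(x, k=16):
    return np.convolve(x, np.ones(k) / k, mode="same")

banded = quantize8(ramp)  # straight quantization: hard steps
dithered = quantize8(ramp + rng.uniform(-0.5, 0.5, ramp.size) / 255)  # +/- 0.5 LSB noise first

for name, img in (("banded", banded), ("dithered", dithered)):
    err = np.abs(blur(img) - blur(ramp)).mean() * 255
    print(f"{name:8s} mean abs error after averaging: {err:.3f} LSB")

# The banded ramp keeps a steady offset inside each band (that's the visible
# step), while the dithered one trades the steps for fine noise that averages
# back toward the true ramp.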
What we need is not more bits (12-bit). How about improving the quality of streams / updating the standard to finally make full use of the bits that are already available (4:4:4 chroma instead of 4:2:0)?
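For reference, the raw sample-count arithmetic behind that 4:4:4 vs 4:2:0 remark (3840x2160 is just an example resolution; nothing here is specific to any particular standard):

# Per-frame sample counts, before any compression.
w, h = 3840, 2160
luma = w * h                                   # Y is always full resolution

samples_444 = luma + 2 * w * h                 # Cb and Cr at full resolution
samples_420 = luma + 2 * (w // 2) * (h // 2)   # Cb and Cr halved in both axes

print(f"4:4:4: {samples_444 / luma:.1f} samples per pixel")  # 3.0
print(f"4:2:0: {samples_420 / luma:.1f} samples per pixel")  # 1.5

# 4:2:0 discards 3/4 of the colour resolution before the encoder even starts,
# which is why coloured text and sharp UI edges look soft no matter how many
# bits per sample the stream advertises.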
+1 for higher bitrates for streamed content. When 1080i broadcasts look like 480i, it's not even funny. It's also sad that the 480p SD version of the same stream, once post-processed by the TV's enhancements, looks better than the 720p (HD) one.
What I also want to see are *affordable* TVs with REAL refresh rates higher than 60 Hz (e.g. 100 Hz or 120 Hz) and an option to enable enhancements even in PC mode.
Most (all?) TVs neither support the sRGB colorspace nor offer an option to switch to it in 'Game' or 'TV' mode.
About fixed-function hardware video acceleration (DXVA):
HW accelerated playback is unreliable (too many issues, especially with web browsers, high-framerate videos and AMD drivers). Many things break after a GPU driver update, and the fixed-function hardware quickly becomes outdated as new additions to the spec are released and more demanding levels / higher bitrates / framerates come into use.
There's also the issue of fixed-quality (read: lower-quality) scaling and color reproduction when using DXVA.
And finally, HW acceleration often breaks for good once the GPU / drivers reach legacy status.
The only advantage hardware acceleration has over SW playback is lower power consumption.
Software playback wins hands down.
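In practice the fallback is trivial; with a player like mpv you can simply opt out of fixed-function decoding (mpv's --hwdec option is real, but the little wrapper below is only an illustration, not part of any player's API):

import subprocess

def play(path: str, software_decode: bool = True) -> None:
    """Launch mpv, forcing software decoding unless told otherwise."""
    hwdec = "no" if software_decode else "auto"
    subprocess.run(["mpv", f"--hwdec={hwdec}", path], check=True)

# play("movie.mkv")                         # robust CPU/software decode
# play("movie.mkv", software_decode=False)  # DXVA / VAAPI etc., if available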
The optimal choice for UHD Blu-ray playback would be a multi-core CPU (8 cores minimum) with high IPC and an integrated GPU with HBM2 and full next-gen HSA support... so we're still waiting for that 2017 AMD Zen APU and the next-gen Intel Core iX CPU with 8 cores.
Is VP9 acceleration *that* important? Aside from YouTube and a few Linux users, nobody else cares about VP9. Almost everyone and their dog who cares about small file sizes and image quality uses H.265/HEVC for anything above 1080p (4K/5K/8K).