TV standards are still a mess

I picked up a Vizio PQ65-F1 4K TV during the holiday sale season, and it’s a tremendous upgrade from our old LCD, an almost 10-year-old Sharp. The color and dynamic range are amazing, and I don’t regret skipping OLED at all. I haven’t touched the built-in TV app platform, preferring instead to use an Apple TV 4K as the sole source device.

Sadly, TVs continue to be anything but plug-and-play. For some reason the HDMI2 input does not work with Dolby Vision HDR (though it works fine with HDR10). As far as I can tell this is not a limitation of the TV model itself, so it must be a manufacturing defect in my unit. I wasted a lot of time isolating this problem. The TV has five HDMI inputs, and they’re undifferentiated except that HDMI4 and HDMI5 require a minimum 1080p input, and HDMI1 supports the HDMI audio return channel (ARC).

Our Onkyo TX-NR545 receiver supports the latest HDMI/HDCP standards and HDR10 pass-through, but for some reason it cannot pass through Dolby Vision signals. This seems to be just a firmware limitation, but it’s an EOL product and Onkyo’s solution is to buy a new device. A workaround is to use HDMI ARC to pass audio from the TV to the receiver, and connect source devices directly to the TV. That works fine, but prevents using the latest audio formats (e.g., Dolby Atmos).

I’m still using a 3.0 audio setup, so I don’t care about Atmos right now, but it’s obnoxious to have to choose between the latest audio standards and the latest video standards (or buy a new receiver). There are some HDMI splitters (such as HDFury) that could work around this, but they cost a third or more of the price of a new receiver.

The TV works well as a dumb monitor; the Internet connection is completely optional. If connected to a network, it will update its firmware without asking. My current plan is to leave it on the network for home automation and Chromecast support, but to block its Internet access at the router.
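For a router that exposes a Linux firewall, the blocking step can be sketched with an iptables rule. This is a minimal sketch under some assumptions: the router runs iptables, the TV has been given a static DHCP lease (the address and WAN interface name below are hypothetical placeholders for whatever your network actually uses).

```shell
# Hypothetical values: adjust to your own network.
TV_IP=192.168.1.50   # static DHCP lease assigned to the TV
WAN_IF=eth0          # router interface facing the Internet

# Reject any packets from the TV that would be routed out to the
# Internet. Traffic to other LAN hosts (Chromecast casting, home
# automation) never crosses the WAN interface, so it is unaffected.
iptables -I FORWARD -s "$TV_IP" -o "$WAN_IF" -j REJECT
```

Consumer router firmware often offers the same thing as a "parental control" or "access control" entry keyed on the TV's MAC address, which avoids the command line entirely.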