I'm with you on depth of field, but I don't understand why you think HDR reduces the fidelity of a game.
If you have a good display (eg an OLED) then the brights are brighter and simultaneously there is more detail in the blacks. Why do you think that is worse than SDR?
That hasn't been what it means since 2016 or so, when consumer TVs got support for properly displaying brighter whites and colors.
It definitely adds detail now, and for the last 8-9 years.
Though consumer TVs obviously still fall short of being as bright at peak as the real world. (We'll probably never want our TV to burn out our vision like the sun, but hitting peaks at least in the 1,000-2,000 nit range, vs the 500-700 that a lot of displays peak at right now, would be nice for most uses.)
Right. Just like the article says, HDR is too vague to mean anything specific; it's a label that gets slapped onto products. In gaming, it often meant the engine was finally simulating light and exposure separately, clipping highlights that would previously have been shown. In their opinion, that reduces the fidelity. Same with depth of field blurring things that used to not have blur.
It's HDR at the world data level, but SDR at the rendering level. It's simulating the way film cannot handle real-life high dynamic range and clips it instead of compressing it like "HDR" in photography.
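Roughly the difference, as a toy sketch (made-up scene values, with Reinhard standing in for whatever tone curve a camera app actually uses):

    # Scene-referred luminance, linear units; 1.0 = reference "paper white".
    scene = [0.02, 0.5, 1.0, 4.0, 16.0]

    # What the game rendering described above does: clip anything over white,
    # so the two brightest values become indistinguishable flat white.
    clipped = [min(x, 1.0) for x in scene]        # -> [0.02, 0.5, 1.0, 1.0, 1.0]

    # What photographic "HDR" (i.e. tone mapping to SDR) does instead: compress
    # the highlights so they stay distinguishable. Reinhard is just one common curve.
    tone_mapped = [x / (1.0 + x) for x in scene]  # -> [~0.02, ~0.33, 0.5, 0.8, ~0.94]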
> Instead of compressing it like "HDR" in photography
That's not HDR either, that's tone mapping to SDR. The entire point of HDR is that you don't need to compress it because your display can actually make use of the extra bits of information. Most modern phones take true HDR pictures that look great on an HDR display.
The “HDR” here is in the sense of “tone mapping to SDR”. It should also be said that even “H”DR displays only have a stop or two more range, still much less than real-world high-contrast scenes.
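Back-of-envelope, using the peak figures mentioned elsewhere in the thread and assuming a typical ~100 nit SDR reference white:

    import math

    # One "stop" is a doubling of luminance, so extra stops = log2(peak ratio).
    SDR_WHITE = 100.0                    # nits, typical SDR reference white
    for hdr_peak in (400.0, 700.0):      # peak figures mentioned elsewhere in the thread
        print(hdr_peak, round(math.log2(hdr_peak / SDR_WHITE), 1), "extra stops")
    # 400.0 2.0 extra stops
    # 700.0 2.8 extra stops
    # A sunlit scene with deep shadow can span on the order of 20 stops.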
Displays as low as 400 nits have been marketed as "HDR".
But nits are only part of the story. What really matters in the end is the range between the darkest and brightest color the display can show under the lighting conditions you want to use it in. 400 nits in a darkened room where blacks are actually black can have much more actual range than 1000 nits with very bright "blacks" due to shitty display tech or excessive external illumination.
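A rough illustration with made-up but plausible numbers (the function is just for the sketch):

    # Effective contrast = peak / effective black, where reflected ambient light
    # raises the black level the viewer actually sees.
    def effective_contrast(peak_nits, native_black_nits, reflected_ambient_nits):
        return peak_nits / (native_black_nits + reflected_ambient_nits)

    # 400 nit OLED in a dark room: near-zero native black, almost nothing reflected.
    print(effective_contrast(400, 0.0005, 0.01))   # ~38,000:1

    # 1000 nit LCD in a bright room: elevated native black plus reflected light.
    print(effective_contrast(1000, 0.1, 0.5))      # ~1,700:1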