If they were 1080p I’d totally be in for 3
300 nits of brightness does not equal HDR. True HDR requires 1000 nits of brightness.
Can anyone confirm this monitor supports FreeSync with 4k over HDMI? A lot of monitors state they have FreeSync, but a majority don't support it over HDMI.
Some important facts for gaming:
This monitor has a 14 ms average response time and 23 ms of input lag. Compare this gaming monitor to a TCL 6 Series 55" television (one of the top gaming TVs for 4k): the TCL 6 Series boasts a 12 ms average response time and 18 ms of input lag. That makes the TCL 6 Series TV better by 2 ms in response time and 5 ms in input lag. 4k monitors haven't really caught up to the spec sheets of 4k televisions.
So for true 4k gaming, televisions are still better and you also get a lot more inches per dollar.
Would you please cite your source about what true HDR is?
I found this from the International Telecommunications Union:
“Standard dynamic range TVs generally produce 300 to 500 nits.”
The report on standards goes on to cite baseline models and it gets confusing, but basically there is no so-called standard, and certainly nothing called "True HDR."
In April 2016, the UHD Alliance — an industry group made up of companies like Samsung, LG, Sony, Panasonic, Dolby, and many others — announced the Ultra HD Premium certification for UHD Blu-ray players. This benchmark sets some baseline goals for HDR, such as the ability to display up to 1,000 nits of peak brightness and a minimum 10-bit color depth.
Most gaming monitors fall well under that in both color depth and brightness.
Doing some quick research, the average peak brightness for gaming monitors is 229 nits.
Heh. The “zoom” function on this site does weird things when the images have transparent backgrounds.