Samsung has been caught cheating in TV benchmarks, according to recent reports by FlatPanelsHD and HDTVTest’s Vincent Teoh, which discuss and demonstrate how the company has designed its TVs to “recognize and react to test patterns used by reviewers.” Among the cheats Samsung has seemingly engineered are those discovered in the QN95B Neo QLED 4K Smart TV, which reportedly “changes its color and luminance tracking during measurements to appear very accurate” and “boosts peak brightness momentarily by up to 80%, from approx. 1,300 nits to 2,300 nits.” FlatPanelsHD noted in its coverage that it had reviewed the set but found that it never surpassed 1,300 nits with real content. Samsung has promised a software update.
“Samsung remains committed to relentless innovation to provide the best picture quality to our consumers,” the company wrote in a statement to FlatPanelsHD. “To provide a more dynamic viewing experience for the consumers, Samsung will provide a software update that ensures consistent brightness of HDR contents across a wider range of window size beyond the industry standard.”
“The update for S95B has been conducted, and the update for QN95B will be provided soon,” the company added.
Reviewers, calibrators, and certification bodies typically use a 10% window for HDR testing, which simply means that the test patch takes up 10% of the screen area. In this window, multiple steps from black to white, as well as a set of colors, are measured. Samsung has designed its TVs to recognize this and other commonly used window sizes, after which the TV adjusts its picture output to make measurements appear more accurate than the picture really is. Using a non-standard window such as 9% (everything else being equal) bypasses the cheating algorithm, so the TV reveals its true colors.
Source: FlatPanelsHD
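To put those window sizes in concrete terms, here is a quick back-of-the-envelope sketch (Python, assuming a 3840 × 2160 panel and a centered square patch; illustrative only, not FlatPanelsHD's actual measurement methodology):

```python
# Rough sketch (my own numbers, not FlatPanelsHD's methodology): what a
# "10% window" works out to in pixels on a 4K panel, and how little a
# reviewer has to change to use a 9% window instead.
import math

PANEL_W, PANEL_H = 3840, 2160  # assumed 4K UHD panel resolution

def patch_side(area_fraction: float) -> int:
    """Side length of a centered square patch covering the given
    fraction of total screen area."""
    return round(math.sqrt(area_fraction * PANEL_W * PANEL_H))

for frac in (0.10, 0.09):
    side = patch_side(frac)
    covered = side * side / (PANEL_W * PANEL_H)
    print(f"{frac:.0%} window ≈ {side} x {side} px ({covered:.1%} of screen)")
```

The difference between a 10% and a 9% window is only about 50 pixels per side, which is why it is a painless change for a reviewer yet apparently enough to fall outside whatever size-matching the TV's detection routine is doing.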
And here I'd suspected this applied to Samsung's LCD panels... but nope, suspected wrong; this is their brand-new QLED tech.
Back to lusting over a 77" LG panel for the living room :cautious:
I do agree with you about the text on it but I can't do justice to how great games look on it. 4K/120 Hz G-Sync/12-bit HDR. I was playing Metro Exodus on it the other day and I do take moments just to look around.
And that's all LG is doing to get higher brightness: pumping more power through the panel. That's going to result in faster organic decay and burn-in.
All the objective data I’ve seen tends to point to the C2 as the better screen. Here is just one example:
Samsung QD-OLED TV first test results — how it stacks up so far [Update]
That said, numbers aren’t everything and screens are highly subjective: you like what you like.
I have a C9 and love it, but at 65" it's the living room TV. For gaming, its brightness has never really been an issue, but for media consumption that's another story. Meanwhile, I also have the CRG9, which is just a tad too big at 49" but also doesn't really compare to the IQ I see on the C9. I'm also still on the fence trying to decide if a 42" C2 is too big. For me, the sweet spot is around 38"-40".
Ultimately I'll probably wait for something else after the Alienware/Samsung, because the general consensus is that things will only get better as more models get released. I also found the similarities between HU's graphs and the ones I saw from Teoh for the other Samsung panels a bit disconcerting. Not a good time to be seeing such a thing. Also not happy that HU's review mentioned the cooling fan in the Alienware is fairly audible. I'm really not into having a display with a cooling fan on my desktop.
I finally got my main gaming rig set up the way I want, but now it's back to the seemingly never-ending quest for a display that checks all the boxes I'm looking for. Ugh, I know it's a first-world problem, but I'm so tired of looking. I do like that both of these are in the $1,300-$1,400 range which, while still high, is far better than the $2K-$3K other similar displays are going for. It also seems like display manufacturers are all having some kind of QA issues when it comes to things in this price tier. I couldn't tell you how many reviews (MSI/ASUS ROG/Samsung/LG) I've read in the last six months of things I thought looked good, only to learn about some design quirk that people discovered after buying them.
I'm a fan of Alienware for this reason, though the reasoning may not apply to readers outside of NA. Still, there's significant confidence here that I just don't have in most other vendors.
I didn't think the VA panel in the 32" 1440p LG monitor I have could be that bad, with reviews basically saying that the text issues could be overcome and that the monitor had pretty good color.
I've had to run the monitor at 125% scaling to 'overcome' the text issues, and after years of trying, I've yet to be able to consistently calibrate it. Now, this isn't a Samsung "VA" but one sourced from someone else, and perhaps that should have been the warning sign. It does have better contrast on static scenes than any IPS and doesn't look that bad, but aside from contrast, an even older 27" 1440p IPS panel blows it out of the water.
I'd challenge that. Not for the purpose of completely refuting the claim, but rather to point out that the differences don't really seem to measure up one way or the other, and so 'best' will fall into the subjective spectrum.
So, I'm sticking with my AW3821UW with its IPS panel. 24:10, 38", sharp text, great color, great response times, and enough contrast.
As important as gaming is, so is being able to work, and I've found this size and aspect ratio to actually be optimal for both, for me.
Right now, if I want to game on the C9, I have to move a table and then drag a recliner in front of it, so that's been a major reason for this. I'm really looking forward to just being able to turn on the computer and play. Not really worried about burn-in since I know all the tricks for prevention and I don't normally leave anything static on for long periods. The next trick will be to see if I can find a good 8-to-10-foot HDMI 2.1 cable that really lives up to its specs. I've tested some in recent years and they don't always deliver, and the full 48 Gbps of bandwidth is needed.
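For what it's worth, here's the rough arithmetic on why the full 48 Gbps matters for 4K/120 at 12-bit (a back-of-the-envelope Python sketch using assumed CTA-861 timing figures and FRL coding overhead, not anything taken verbatim from the HDMI spec):

```python
# Back-of-the-envelope math (my own rough numbers): why 4K/120 Hz at
# 12-bit RGB leaves essentially no headroom on a 48 Gbps HDMI 2.1 link.
# Assumes the common 4K/120 timing of 4400 x 2250 total pixels per frame
# (active + blanking) and FRL's 16b/18b line coding.
TOTAL_W, TOTAL_H = 4400, 2250   # total pixels per frame, incl. blanking
REFRESH_HZ = 120
BITS_PER_PIXEL = 3 * 12         # RGB 4:4:4 at 12 bits per channel

video_gbps = TOTAL_W * TOTAL_H * REFRESH_HZ * BITS_PER_PIXEL / 1e9
link_gbps = video_gbps * 18 / 16  # add FRL 16b/18b coding overhead

print(f"Video payload:      {video_gbps:.2f} Gbps")
print(f"Required link rate: {link_gbps:.2f} Gbps (HDMI 2.1 max: 48 Gbps)")
```

The result lands essentially right at the 48 Gbps ceiling, so there's no margin at all for a cable that can't actually sustain the full rate.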
If they can make the screens accurate during a test, what's stopping them from maintaining that level of accuracy all the time?