Looking for SAMSUNG 4K HDR 10-Bit 120Hz TV

Found 14th May
Price range £700-£900. Size 43" or larger.
It has to have 4K HDR with a 10-bit panel, and it needs to be a native 120Hz panel.

Any recommendations?
6 Comments
UE55MU7000
Edited by: "rev6" 14th May
As per rev6, or the Currys variant with a different colour bezel but basically the same specs, the UE55MU7070.

55" and above for the 10-bit panels. The 49", while good, is an 8-bit + FRC panel, I believe.

I'm keeping an eye on the new TCL panels releasing at the end of June. 10-bit HDR and good value...
What do you mean by 10-bit? It can refer to several things.

If you're talking about HDR then it's not something to worry about.
Of course you want 10-bit. Why buy a TV with an 8-bit or 8-bit + FRC panel if all modern TVs are going to 10-bit? You get more colours with 10-bit, and modern consoles like the Xbox One X support it.

The MU7000 is a great TV, I got one in the sale before Xmas, but I'm sure you can get the 8000 series for £900 or less now...

In reply to Bargainhead (2 h, 53 m ago):

Gosh, the manufacturers are putting a bigger number on the box. Clearly that's an essential improvement to the viewing experience. Make sure you get a TV that supports the 3D too, three dimensions are 50% better than two!


You're mixing together several aspects there.

Each pixel has a value that records its colour and brightness; the number of different values available is the bit depth (x-bit meaning two to the power of x values).
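To put rough numbers on that (per colour channel, assuming the usual three red/green/blue channels; the snippet is just an illustration of the arithmetic):

    # Shades per channel = 2 to the power of the bit depth
    for bits in (8, 10):
        per_channel = 2 ** bits
        total = per_channel ** 3  # three channels: red, green, blue
        print(f"{bits}-bit: {per_channel} shades per channel, {total:,} colours in total")

    # 8-bit:  256 shades per channel,  16,777,216 colours in total
    # 10-bit: 1024 shades per channel, 1,073,741,824 colours in total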

The actual colour and brightness these values refer to are defined by other aspects of the picture standard; bit depth is just the number of shades available.

The more bit depth you have, the closer together the shades are. But the human eye can only distinguish colour and brightness differences of a certain size, so you only need the shades close enough that a continuous change of colour/brightness across a surface doesn't look stepped.

For content using existing picture standards that's 8-bit.

HDR is recorded in 10-bit because it specifies much larger brightness and colour ranges, and it needs the extra bit depth to stop the shades spreading too far apart.
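Roughly speaking (and ignoring the PQ curve that real HDR uses to space its steps perceptually rather than linearly, so the function name and peak brightness figures below are purely illustrative):

    # Average step size if the brightness range were divided up linearly.
    # Real HDR uses the PQ curve instead, so these are illustrative figures only.
    def avg_step_nits(peak_nits, bits):
        return peak_nits / (2 ** bits - 1)

    print(round(avg_step_nits(100, 8), 2))    # ~0.39 nits/step: SDR-ish range in 8-bit
    print(round(avg_step_nits(1000, 8), 2))   # ~3.92 nits/step: HDR-ish range forced into 8-bit
    print(round(avg_step_nits(1000, 10), 2))  # ~0.98 nits/step: HDR-ish range in 10-bit

Stretch the same number of steps over ten times the range and the steps get ten times bigger; the two extra bits pull them back together.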


When it comes to reproducing the picture, the previous norm has been to use a fixed backlight and let the LCD panel do all the work in creating the image by filtering the colours and brightness to the correct level. As such it's been safe to equate the number of shades in the content with the number of shades the LCD can produce.

For HDR that's not possible. It would require an improvement in LCD contrast to show a larger brightness range on screen, and that hasn't happened. Instead, the brightness of the backlight is varied across the screen, and each shade on the LCD maps to multiple shades in the content depending on how bright the light it's filtering is.
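A minimal sketch of that idea (made-up numbers, hypothetical function, and ignoring the gamma curve a real panel applies to its levels): what reaches your eye is roughly the backlight level behind that zone multiplied by how much the LCD pixel lets through, so the same LCD shade ends up as different on-screen brightnesses.

    # Hypothetical illustration: what you see ~= backlight brightness x LCD transmittance
    # (ignores the gamma curve a real panel applies to its levels)
    def output_nits(backlight_nits, lcd_level, bits=10):
        transmittance = lcd_level / (2 ** bits - 1)  # 0.0 = fully closed, 1.0 = fully open
        return backlight_nits * transmittance

    # The same LCD shade (level 512 of 1023) behind two different backlight zones:
    print(round(output_nits(1000, 512)))  # bright zone: ~500 nits
    print(round(output_nits(100, 512)))   # dimmed zone:  ~50 nits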

The other factor is that the hardware required to reproduce the full brightness and colour range of HDR is a long way off, so you're only looking at 20-30% of the full 10-bit HDR brightness/colour range, and consequently a reduced number of shades that need to be distinguished within that.

Oh, plus dithering on 4K is nothing to be sniffed at when the pixels are smaller than the subpixels on a Full HD TV.
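For context, that dithering (the FRC mentioned above) just means approximating an in-between shade by mixing the two nearest shades the panel can actually show, across neighbouring pixels or across successive frames. A rough sketch of the frame-based version, with illustrative numbers and a made-up function name:

    # Approximate a 10-bit shade on an 8-bit panel by alternating between the two
    # nearest 8-bit levels over a few frames (temporal dithering / FRC), so the
    # average over time lands on the in-between shade.
    def frc_frames(level_10bit, frames=4):
        low = level_10bit // 4            # nearest 8-bit level at or below
        high = min(low + 1, 255)          # next 8-bit level up, clamped at the top
        towards_high = level_10bit % 4    # how far the 10-bit value sits above 'low'
        return [high] * towards_high + [low] * (frames - towards_high)

    print(frc_frames(513))  # [129, 128, 128, 128] - averages to 128.25, i.e. 513 in 10-bit terms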


So back to planet earth for a minute: what's the practical effect of swapping an 8-bit LCD panel for a 10-bit LCD panel? It doesn't change the brightness or colour range, as neither of those is determined by the LCD panel. We're talking subtle effects: some faint banding, maybe a touch less colour accuracy. And given all the factors above, informed opinion seems to be that at the level where there's a choice, even those don't show up.

Basically, the LCD panel bit depth should be ignored in favour of the hardware aspects that make the big differences, like colour volume.


The important 10-bit aspect is to ensure the TV can receive a 10-bit signal, in order to be able to read HDR data and make use of any HDR hardware it has. That was an issue back in 2015/2016, but over the last couple of years 10-bit signal support seems to have become ubiquitous, although there are still models where only some of the HDMI ports will accept those signals.

In reply to EndlessWaves (2 h, 10 m ago):

Pmsl, you were waiting for someone to comment so you could copy and paste that. 10-bit panel = more colours. New tech always takes time, but there's no point buying old.