43 inch 4K TV that is UHD 10-bit

Found 16th Oct 2017
Morning all, I am hoping for a bit of help with this as it's driving me up the wall. I am searching for a good 43 inch 4K TV with 10-bit HDR, and even when I Google it I hit a brick wall because there are so many different variations available. When you search for the above, the first thing to pop up is the LG TVs; I then search on here for them and the first post always says it's an RGBW TV, which in essence isn't true 4K. I have been searching a mix of places as I am getting the One X on launch and want to have the best possible TV at that size. I have the Hisense H3500 and it is a good little telly, but its HDR isn't really 10-bit. I want a class TV that can really show the One X off.
11 Comments
Think you will be hard pressed to find those features on a small TV.
You are looking for one with Ultra HD Premium certification. That's 4K, 10-bit and the BT.2020 colour space, plus some other specs around brightness. You will struggle to get one that small though; I think you'd have to go up to 49".

m.johnlewis.com/sam…348
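For reference, the Ultra HD Premium requirements mentioned above break down roughly as follows. This is a summary sketch from memory (treat the exact figures as approximate and check the UHD Alliance spec), written out as a small Python dict:

# Rough summary of the UHD Alliance "Ultra HD Premium" display requirements.
# Figures are from memory, so treat them as approximate.
ultra_hd_premium = {
    "resolution": (3840, 2160),
    "bit_depth": 10,                 # 10-bit signal
    "colour": "BT.2020 input, at least ~90% of DCI-P3 reproduction",
    "eotf": "SMPTE ST 2084 (PQ)",
    # Either brightness/black-level combination qualifies:
    "lcd_option": {"peak_nits": 1000, "black_nits": 0.05},
    "oled_option": {"peak_nits": 540, "black_nits": 0.0005},
}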
Original Poster
That's the frustrating thing: I want to get the best TV for the console, but I don't want or need a huge 49 or 55 inch TV, as it would look out of place and dwarf the room it will be in.
All HDR TVs on the market support 10-bit input these days, and none of the other things that can be measured in bits are particularly relevant to HDR performance.

The important hardware for great HDR is a good local dimming system (lots of zones, quick to respond) and as wide as possible colour gamut.

Unfortunately it's still a pretty expensive technology on the display side and currently isn't available to any great extent in smaller, cheaper TVs. Samsung's KS7500 from last year is the best 43" at HDR, but that's now largely out of stock and wasn't replaced.

Basically, come back in a few years when the technology's cheap enough to include in £600-800 TVs or move up to a medium sized 49/55" TV.


p.s. Don't worry about detail in RGBW at 43" if you're sat on a sofa. The pixels are still so tiny that they're invisible; a few more or fewer makes no real difference.
hopper_papa_roac 3 h, 44 m ago

That's the frustrating thing: I want to get the best TV for the console, but I don't want or need a huge 49 or 55 inch TV …



Why don’t you see how you go with the Hisense? You'd be buying something not much better at the minute; you'd be best to wait for next year's models, and there will likely be smaller options.
EndlessWaves 6 h, 58 m ago

All HDR TVs on the market support 10-bit input these days, and none of the other things that can be measured in bits are particularly relevant to HDR performance. …


I'm confused by your first few sentences. You say don't worry about bits because all 4K TVs nowadays can accept 10-bit input, but HDR is all about contrast (as you later say) and colour (as you also say), and 10-bit equals a much greater colour range, so I don't understand your first points.
MIDURIX 4 h, 43 m ago

… 10-bit equals a much greater colour range, so I don't understand your first points.


It doesn't. Bit depth has no effect on the size of the colour range, only on how many individual shades it contains.

My computer has 100 volume steps while my TV only has 30. That doesn't mean my computer has a greater range of volume than my TV; in fact the opposite is true, the TV will get much louder than the computer.

The TV has the greater range despite its volume control being roughly 5-bit rather than 7-bit.

I believe the colour range (at a given brightness) is primarily defined by the spectrum of the backlight.

The 10-bit depth in HDR data does help keep the spacing of the shades from drifting further apart due to the wider colour range, but a lot of its purpose is to encode the much higher brightness range in HDR. There are some interesting documents floating around showing why 10-bit was chosen for PQ:
smpte.org/sit…pdf
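For anyone curious what that PQ curve actually does, here's a minimal Python sketch of the ST 2084 transfer function (an editorial aside, not taken from the linked paper; the constants are the standard PQ ones, but double-check before relying on them). It shows how 10-bit code values get spread across a 0-10,000 nit range rather than the roughly 100 nit SDR range:

# SMPTE ST 2084 "PQ" EOTF: non-linear code value -> absolute luminance (nits)
m1, m2 = 0.1593017578125, 78.84375
c1, c2, c3 = 0.8359375, 18.8515625, 18.6875

def pq_code_to_nits(code, bits=10):
    e = code / (2 ** bits - 1)            # normalise code value to 0..1
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for code in (0, 256, 512, 768, 1023):
    print(code, round(pq_code_to_nits(code), 2), "nits")
# Roughly half of the 1024 code values land below ~100 nits; the rest encode
# the extra HDR brightness range.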
EndlessWaves 7 h, 24 m ago

It doesn't. Bit depth has no effect on the size of the colour range, only on how many individual shades it contains. …


You are wrong; the colour range is made up of the different shades. Increasing the colour bit depth gives you more shades per red, green and blue channel, which means many more colour combinations: 8-bit is 256 different shades of each of those three colours, so 16.7 million colours (256 cubed), while 10-bit is 1024 different shades and over a billion colours (1024 cubed).

So saying bit depth has no effect on colour range isn't true; it is directly proportional to colour range.

Edit: the analogy would be, my orchestra has 100 instruments whereas my band has 5, so the orchestra has a greater range of sounds it can produce.
Edited by: "MIDURIX" 17th Oct 2017
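For reference, the arithmetic in the comment above works out like this (a quick Python sketch, purely illustrative):

# Distinct shades per channel and total colour combinations at 8 and 10 bits
for bits in (8, 10):
    shades = 2 ** bits                    # shades per red/green/blue channel
    combos = shades ** 3                  # every R,G,B combination
    print(f"{bits}-bit: {shades} shades per channel, {combos:,} combinations")
# 8-bit:  256 shades, 16,777,216 combinations (~16.7 million)
# 10-bit: 1024 shades, 1,073,741,824 combinations (~1.07 billion)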
MIDURIX 11 h, 14 m ago

You are wrong; the colour range is made up of the different shades. …


Those figures are correct for 10-bit SDR, but HDR uses a completely different function to translate shades into actual colours, so only around 500-600 of those shades fall within the same brightness range that SDR is typically mastered/viewed at (and that the LCD panel is responsible for).

Still, if you're defining range that way you are essentially correct.

But the important point for me is that all of those extra colours are intermediate shades between existing colours. The main benefit to picture quality is preventing banding: noticeable jumps from one shade to the next. However, there are other things that can cause banding, such as image processing, and there's no evidence that 8-bit panels are any more prone to banding than 10-bit ones in practice. Rtings' banding test has them all mixed together, for example.

Hence my comment that it's not particularly relevant to HDR performance.
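To make the banding point concrete, here's a small, purely illustrative Python/NumPy sketch comparing the quantisation step size of an 8-bit and a 10-bit greyscale ramp (coarser steps are what show up as visible banding, although as noted above other processing can introduce banding too):

import numpy as np

# Quantise the same smooth grey ramp at 8 and 10 bits and compare step sizes
ramp = np.linspace(0.0, 1.0, 3840)        # one row of a UHD-width gradient
for bits in (8, 10):
    levels = 2 ** bits
    q = np.round(ramp * (levels - 1)) / (levels - 1)
    distinct = len(np.unique(q))
    print(f"{bits}-bit: {distinct} distinct levels, "
          f"step size {1 / (levels - 1):.5f} of full range")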
EndlessWaves 39 m ago

Those figures are correct for 10-bit SDR, but HDR uses a completely different function to translate shades into actual colours …


Why can't it just be plasma or LED? Enough to make my head explode.
It's definitely one of the worst communicated technologies in a while. Paradigm shifts always bring their fair share of confusion but usually there is some clarity within a year or two.

The best page I've found as an introduction to the HDR formats is this one, although it doesn't cover the hardware side of things:
lightillusion.com/uhd…tml