Does HDR have to have a 10 bit panel?

Found 2nd Aug 2016
Bit confused, as these TVs with 10 bit panels are over a grand, yet the cheaper models claim to be HDR.

I want an HDR TV for around the £600-700 mark.

Thanks

14 Comments

I think it does if it's to meet an industry standard, but otherwise 8 bit is also used.
Edited by: "steevojohno" 2nd Aug 2016

10 bit means more colours and gradation

steevojohno

10 bit means more colours and gradation



Just more gradation. The colour gamut is independent of the bit depth.

I'm not familiar with the TV market, but maybe they're 8-bit panels that use dithering to produce 10 bits' worth of apparent shades rather than true 10-bit panels. From TV viewing distances I wouldn't have thought there's a great deal of difference, especially if these are 3840x2160 screens as well.
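The dithering idea mentioned above can be sketched in a few lines of Python. This is purely illustrative (the function name and the 4-frame cycle are my own choices, not any TV's actual algorithm): an 8-bit panel flickers between two adjacent 8-bit levels so that their average over a few frames approximates the 10-bit level.

```python
def frc_frames(level_10bit, num_frames=4):
    """8-bit levels to display over num_frames frames so their average
    approximates level_10bit / 4 (the 10-bit to 8-bit scale factor)."""
    base, remainder = divmod(level_10bit, 4)   # 4 = 2**(10 - 8)
    hi = min(base + 1, 255)                    # clamp at the 8-bit maximum
    # Show the higher level on 'remainder' out of every num_frames frames.
    return [hi] * remainder + [base] * (num_frames - remainder)

# A 10-bit level of 513 sits between the 8-bit levels 128 and 129:
frames = frc_frames(513)                 # [129, 128, 128, 128]
average = sum(frames) / len(frames)      # 128.25, i.e. exactly 513 / 4
```

Real panels do this spatially as well as temporally (so-called FRC), but the averaging principle is the same.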

So HDR is "peak brightness" in a video. For example, if you're watching a sunrise or someone lighting a fire in a dark cave, those bright elements will look fiercely bright, providing a more true-to-life image.

The TV will be much better at displaying contrast, which is what gives the image punch and depth.

Most TVs will play a film with the HDR data, but only high-end models will play it to its fullest potential. Look for the Ultra HD Premium logo; it's an industry-wide certification.

As an example, most TVs can produce anywhere from 300-600 nits (a measurement of brightness output).

A true HDR TV has to reach a minimum of 1000 nits; some can output more! (expensive, mind)

8 bit and 10 bit are all about colour depth.

The primary colours for a TV are red, green and blue.
In an 8 bit TV there are 256 shades each of red, green and blue, so the TV has 256 shades of each of these colours with which to produce every other colour.

A 10 bit equipped TV has 1024 shades per colour!
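The shade counts above are just powers of two; a quick check in Python:

```python
# An N-bit panel has 2**N levels per primary colour, and
# (2**N)**3 total red/green/blue combinations.
shades_8 = 2 ** 8              # 256 shades each of red, green and blue
shades_10 = 2 ** 10            # 1024 shades per primary
colours_8 = shades_8 ** 3      # 16,777,216  (~16.7 million colours)
colours_10 = shades_10 ** 3    # 1,073,741,824  (~1.07 billion colours)
```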

This improved colour depth is relatively new, so only the new 4K Blu-ray discs and upcoming programmes on Netflix and Amazon will give you the genuine article; otherwise the TV has to use clever software to upgrade what you're viewing. (The effectiveness of this depends on the hardware and the quality of the software engineering.)
Edited by: "MCYounes" 2nd Aug 2016

Finally, for £600-£700 you will definitely find a reasonable to good quality TV, but it will NOT have a 10 bit panel or be capable of displaying what's regarded as a true HDR image.

EndlessWaves

Just more gradation. The colour gamut is independent of the bit depth. I'm not familiar with the TV market, but maybe they're 8-bit panels that use dithering to produce 10 bits' worth of apparent shades rather than true 10-bit panels. From TV viewing distances I wouldn't have thought there's a great deal of difference, especially if these are 3840x2160 screens as well.



I reckon cheap TVs will use a dithering technique to fake colour depth; unfortunately for us, if we want the real deal we gotta pay top dollar.

MCYounes

I reckon cheap TVs will use a dithering technique to fake colour depth; unfortunately for us, if we want the real deal we gotta pay top dollar.



I would physically look at the picture on both TVs and see if you can perceive the difference. Depending on your vision you might find the difference subtle and not worth the extra £300+.

MCYounes

So HDR is "peak brightness" in a video. For example, if you're watching a sunrise or someone lighting a fire in a dark cave, those bright elements will look fiercely bright, providing a more true-to-life image. The TV will be much better at displaying contrast, which is what gives the image punch and depth. Most TVs will play a film with the HDR data, but only high-end models will play it to its fullest potential. Look for the Ultra HD Premium logo; it's an industry-wide certification. As an example, most TVs can produce anywhere from 300-600 nits (a measurement of brightness output). A true HDR TV has to reach a minimum of 1000 nits; some can output more! (expensive, mind)



Contrast has almost nothing to do with peak brightness.

Contrast is the relative amount of light the LCD can block from the backlight when fully open (white) compared with fully closed (black). 4000:1 means white is 4000 times brighter than black.

Fitting a stronger backlight doesn't change contrast. If it doubles the brightness of the white pixels then the black pixels will be twice as bright as well.

I said almost nothing, because a screen being washed out in strong light is having its contrast lowered by the ambient light bouncing off the screen and making the darker areas less dark. Raising the brightness of the light and dark areas increases the contrast because the ambient light has less effect.
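That arithmetic is easy to sketch (the nit figures below are made-up examples, not any particular TV's specs):

```python
def contrast(white_nits, black_nits, ambient_nits=0.0):
    """Contrast ratio, with any ambient light reflected off the screen
    added equally to both the white and black luminance."""
    return (white_nits + ambient_nits) / (black_nits + ambient_nits)

native = contrast(400, 0.1)        # ~4000:1 in a dark room
washed = contrast(400, 0.1, 2.0)   # ~191:1 with 2 nits of reflected light
doubled = contrast(800, 0.2)       # still ~4000:1 - a stronger backlight
                                   # raises black just as much as white
```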

Original Poster

Ah ok.
This is so confusing. So is it best to buy a 10 bit panel for HDR, as a way of ensuring you're getting the best quality?

Contrast is defined as the separation between the darkest and brightest areas of the image. Increase contrast and you increase the separation between dark and bright, making shadows darker and highlights brighter. 

HDR affects contrast....

Just buy what you can afford, dude!!! Ultimately it's just a telly.

Dolby says you need 12 bits per channel, whereas the rest of the industry says 10 bits per channel is enough. The HDR EOTF (SMPTE ST 2084) makes better use of the available bits.
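For the curious, the ST 2084 "PQ" EOTF can be written out directly. The constants below are the ones published in the standard; the function maps a normalised 0-1 code value to absolute luminance in nits:

```python
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code):
    """Decode a normalised PQ code value (0..1) to luminance in nits."""
    e = code ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# The curve spends most of its code values on the dark end, which is
# where the eye is most sensitive:
pq_eotf(0.0)   # 0.0 nits
pq_eotf(0.5)   # ~92 nits - half the code range covers ~1% of the nit range
pq_eotf(1.0)   # 10000.0 nits
```

This is why a well-mastered HDR picture keeps precision in the shadows even with "only" 10 bits.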

Original Poster

Just wanted something future-proof. 10 bit panels seem to be the certified standard, but those TVs are over a grand.

I have the LG 65E6, and the first time I watched HDR I thought, is that it? What's all the fuss about? Dolby Vision is another matter though; that looks stunning.