bitrate on 4k televisions

5 replies
Found 27th Dec 2016
I've been looking at new TVs for my PlayStation Pro, and what I can't get my head around is the bitrate. How do you know what bitrate a television is?

5 Comments

You don't. Bitrate has nothing to do with display, it's to do with video quality of the source media.

Original Poster

razord

You don't. Bitrate has nothing to do with display, it's to do with video quality of the source media.



The source media will be a PlayStation Pro and to get the best out of it apparently it's 10-bit.

mastakilla87

The source media will be a PlayStation Pro and to get the best out of it apparently it's 10-bit.


That's for HDR.

True Colour = 24-bit, i.e. 8 bits per primary colour: 16.7 million colours
Deep Colour = 30-bit and upwards, i.e. 10 bits per primary colour: 1.073 billion colours
The higher the colour bit depth, the more colours it can display.
en.wikipedia.org/wik…pth
avforums.com/art…039
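The arithmetic behind those figures is simple enough to check yourself: each primary channel gets 2^bits shades, and the total palette is that number cubed. A quick sketch:

```python
# Colours available at a given bit depth: (2 ** bits_per_channel) ** 3
for bits_per_channel in (8, 10):
    colours = (2 ** bits_per_channel) ** 3
    print(f"{bits_per_channel * 3}-bit: {colours:,} colours")
# 24-bit: 16,777,216 colours  (True Colour, ~16.7 million)
# 30-bit: 1,073,741,824 colours  (Deep Colour, ~1.073 billion)
```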
Move over Colour space and welcome to Colour volume oO

I'm old skool, so to me TV video is analogue with separate chrominance and luminance signals.





Edited by: "kester76" 27th Dec 2016

mastakilla87

The source media will be a PlayStation Pro and to get the best out of it apparently it's 10-bit.



That's the colour bit depth; bitrate is the quality of the video compression, measured in Mbps.
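To put those Mbps numbers in perspective, bitrate is just compressed data per second, so it translates directly into file size over time. A back-of-envelope sketch (the 25 Mbps figure is only an illustrative value, not one from this thread):

```python
def stream_size_gb(mbps, hours):
    """Rough size of a stream: Mbps -> bits/s, times seconds, /8 -> bytes, /1e9 -> GB."""
    return mbps * 1_000_000 * hours * 3600 / 8 / 1e9

# e.g. a two-hour film at an assumed 25 Mbps:
print(f"{stream_size_gb(25, 2):.1f} GB")  # 22.5 GB
```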

Any TV that advertises any sort of HDR support will accept a 10-bit signal. As this includes every TV with even partial HDR display ability it's not something you have to worry about.

If you want to get the best out of it then it's not the bit depth you need to be looking at but how much HDR support the TV has.

HDR effectively consists of three components:

1. A wider colour space. Normal TVs use the Rec. 709 colour range; HDR TVs are typically measured as a percentage of DCI-P3 or Rec. 2020. The more the merrier, and this tends to be the easiest part of HDR to add. This has nothing to do with bit depth, which is the spacing between shades rather than how far the colour range extends.

2. Higher brightness. A typical TV is around 300cd/m²; HDR allows for 4,000-10,000cd/m², with the best HDR TVs at the moment hitting 1,500-2,000cd/m².

3. Local backlight dimming, so that you can have a very bright section next to a very dark section. It's the most expensive bit of the technology and only found on top-of-the-range TVs at the moment.

The TVs available on a 'normal' budget (£600-800 at 50") have the first, a little bit of the second (400-500cd/m²), and at the higher end the occasional rudimentary stab at the third.

What's available depends on how much you're willing to spend and what size you want, but that's generally the current state of HDR support.